WorldWideScience

Sample records for modeling workflow including

  1. Context-aware Workflow Model for Supporting Composite Workflows

    Institute of Scientific and Technical Information of China (English)

    Jong-sun CHOI; Jae-young CHOI; Yong-yun CHO

    2010-01-01

    In recent years, several researchers have applied workflow technologies for service automation in ubiquitous computing environments. However, most context-aware workflows do not offer a method to compose several workflows into a larger-scale or more complicated workflow. They only provide a simple workflow model, not a composite workflow model. In this paper, the authors propose a context-aware workflow model that supports composite workflows by expanding the patterns of existing context-aware workflows, which support only the basic workflow patterns. The suggested workflow model offers composite workflow patterns for a context-aware workflow, which consists of various flow patterns, such as simple, split, and parallel flows, and subflows. With the suggested model, existing workflows can easily be reused to build a new workflow. As a result, it can save the development effort and time for context-aware workflows and increase workflow reusability. Therefore, the suggested model is expected to make it easy to develop applications related to context-aware workflow services in ubiquitous computing environments.
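
    The pattern vocabulary in this record (simple, split, and parallel flows, plus subflow) is easy to make concrete. The following Python classes are a minimal sketch under our own naming, not the authors' model; they show how an existing workflow can be reused unchanged as a subflow of a composite one:

        # A minimal sketch, not the authors' implementation: composing a
        # context-aware workflow from basic patterns. All names are illustrative.
        class Task:
            def __init__(self, name):
                self.name = name
            def run(self, ctx):
                print("run", self.name, "with context", ctx)

        class Sequence:                      # "simple" flow: steps in order
            def __init__(self, steps):
                self.steps = steps
            def run(self, ctx):
                for step in self.steps:
                    step.run(ctx)

        class Parallel:                      # "parallel" flow: every branch runs
            def __init__(self, branches):
                self.branches = branches
            def run(self, ctx):
                for branch in self.branches:   # executed sequentially here
                    branch.run(ctx)

        class Split:                         # "split": branch chosen by context
            def __init__(self, key, branches):
                self.key, self.branches = key, branches
            def run(self, ctx):
                self.branches[ctx[self.key]].run(ctx)

        # Reuse: an existing workflow becomes a subflow of a new composite one.
        arrival = Sequence([Task("sense-location"), Task("greet-user")])
        composite = Sequence([arrival,
                              Split("mode", {"home": Task("turn-lights-on"),
                                             "away": Task("arm-alarm")})])
        composite.run({"mode": "home"})

    The reuse claimed in the abstract is visible in the last few lines: the existing arrival workflow is dropped into the composite one as a single node.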

  2. Data Workflow - A Workflow Model for Continuous Data Processing

    NARCIS (Netherlands)

    Wombacher, Andreas

    2010-01-01

    Online data or streaming data are getting more and more important for enterprise information systems, e.g. by integrating sensor data and workflows. The continuous flow of data provided e.g. by sensors requires new workflow models addressing the data perspective of these applications, since

  3. Modeling workflow using XML and Petri net

    Institute of Scientific and Technical Information of China (English)

    杨东; 温泉; 张申生

    2004-01-01

    Nowadays an increasing number of workflow products and research prototypes are adopting XML for representing workflow models, owing to its ease of use and comprehensibility for both people and machines. However, most workflow products and research prototypes provide little support for the verification of XML-based workflow models, such as deadlock-freeness, which is essential to the successful application of workflow technology. In this paper, we tackle this problem by mapping the XML-based workflow model into a Petri net, a well-known formalism for modeling, analyzing and verifying systems. As a result, the XML-based workflow model can be automatically verified with the help of general Petri-net tools, such as DANAMICS. The presented approach not only enables end users to represent workflow models with an XML-based modeling language, but also ensures the correctness of the model, thus satisfying the needs of business processes.
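
    As a rough illustration of the record's idea, the toy script below maps a tiny XML workflow description onto a chain-shaped net and plays the token game on it. The XML schema, the mapping rules, and the termination check are our own simplifications; the paper's actual mapping and the DANAMICS tool are not reproduced here.

        # Hedged sketch: each <activity> becomes a transition guarded by a
        # place, and each <flow> arc feeds one activity's output to the next.
        import xml.etree.ElementTree as ET

        xml_model = """<workflow>
                         <activity id="A"/><activity id="B"/>
                         <flow from="A" to="B"/>
                       </workflow>"""
        root = ET.fromstring(xml_model)
        activities = [a.get("id") for a in root.findall("activity")]
        flows = {f.get("from"): f.get("to") for f in root.findall("flow")}

        # token-game simulation, starting with a token on the first activity
        marking = {a: 0 for a in activities}
        marking[activities[0]] = 1
        fired = []
        while any(marking.values()):
            t = next(a for a, tokens in marking.items() if tokens)
            marking[t] -= 1
            fired.append(t)
            if t in flows:
                marking[flows[t]] += 1

        # every activity fired and no tokens remain: the toy model terminates
        print("fired:", fired,
              "proper termination:", sorted(fired) == sorted(activities))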

  4. Modeling Workflow Using UML Activity Diagram

    Institute of Scientific and Technical Information of China (English)

    Wei Yinxing(韦银星); Zhang Shensheng

    2004-01-01

    An enterprise can improve its adaptability in a changing market by means of workflow technologies. At build time, the main function of a Workflow Management System (WFMS) is to model the business process. A workflow model is an abstract representation of the real-world business process. The Unified Modeling Language (UML) activity diagram is an important visual process modeling language proposed by the Object Management Group (OMG). The novelty of this paper is representing workflow models by means of the UML activity diagram. A translation from UML activity diagrams to the π-calculus is established. Using the π-calculus, the deadlock property of the workflow is analyzed.

  5. Implementation of WPDL Conforming Workflow Model

    Institute of Scientific and Technical Information of China (English)

    张志君; 范玉顺

    2003-01-01

    Workflow process definition language (WPDL) facilitates the transfer of workflow process definitions between separate workflow products. However, much work is still needed to transfer the specific workflow model to a WPDL conforming model. CIMFlow is a workflow management system developed by the National CIMS Engineering Research Center. This paper discusses the methods by which the CIMFlow model conforms to the WPDL meta-model and the differences between the WPDL meta-model and the CIMFlow model. Some improvements are proposed for the WPDL specification. Finally, the mapping and translating methods between the entities and attributes are given for the two models. The proposed methods and improvements are valuable as a reference for other mapping applications and the WPDL specification.

  6. A Multi-Dimensional Classification Model for Scientific Workflow Characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishnan, Lavanya; Plale, Beth

    2010-04-05

    Workflows have been used to model repeatable tasks or operations in manufacturing, business processes, and software. In recent years, workflows have increasingly been used for the orchestration of science discovery tasks that use distributed resources and web services environments through resource models such as grid and cloud computing. Workflows have disparate requirements and constraints that affect how they might be managed in distributed environments. In this paper, we present a multi-dimensional classification model illustrated by workflow examples obtained through a survey of scientists from different domains, including bioinformatics and biomedicine, weather and ocean modeling, and astronomy, detailing their data and computational requirements. The survey results and classification model contribute to a high-level understanding of scientific workflows.

  7. Workflow Fault Tree Generation Through Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2014-01-01

    We present a framework for the automated generation of fault trees from models of real-world process workflows, expressed in a formalised subset of the popular Business Process Modelling and Notation (BPMN) language. To capture uncertainty and unreliability in workflows, we extend this formalism...... of the system being modelled. From these calculations, a comprehensive fault tree is generated. Further, we show that annotating the model with rewards (data) allows the expected mean values of reward structures to be calculated at points of failure....

  8. An architecture including network QoS in scientific workflows

    NARCIS (Netherlands)

    Zhao, Z.; Grosso, P.; Koning, R.; van der Ham, J.; de Laat, C.

    2010-01-01

    The quality of the network services has so far rarely been considered in composing and executing scientific workflows. Currently, scientific applications tune the execution quality of workflows by selecting only optimal software services and computing resources, neglecting network resources. One

  9. Workflow-Based Dynamic Enterprise Modeling

    Institute of Scientific and Technical Information of China (English)

    黄双喜; 范玉顺; 罗海滨; 林慧萍

    2002-01-01

    Traditional systems for enterprise modeling and business process control are often static and cannot adapt to the changing environment. This paper presents a workflow-based method to dynamically execute the enterprise model. This method gives an explicit representation of the business process logic and the relationships between the elements involved in the process. An execution-oriented integrated enterprise modeling system is proposed in combination with other enterprise views. The enterprise model can be established and executed dynamically in the actual environment due to the dynamic properties of the workflow model.

  10. An ultrasound image-guided surgical workflow model

    Science.gov (United States)

    Guo, Bing; Lemke, Heinz; Liu, Brent; Huang, H. K.; Grant, Edward G.

    2006-03-01

    A 2003 report in the Journal of Annual Surgery predicted an increase in demand for surgical services of as much as 14 to 47% in the workload of all surgical fields by 2020. Difficulties already apparent in the surgical OR (Operating Room) will be amplified in the near future, and it is necessary to address this problem and develop strategies to handle the workload. Workflow issues are central to the efficiency of the OR, especially in light of today's continuing workforce shortages and escalating costs. These issues include inefficient and redundant processes, system inflexibility, ergonomic deficiencies, scattered data, and a lack of guidelines, standards, and organization. The objective of this research is to validate the hypothesis that a workflow model does improve the efficiency and quality of surgical procedures. We chose the ultrasound (US) image-guided surgical workflow as a first proof of concept for minimizing these OR workflow issues. We developed and implemented deformable workflow models using existing and projected future clinical environment data, as well as a customized ICT system with seamless integration and real-time availability. An ultrasound image-guided surgical workflow (IG SWF) for a specific surgical procedure, the US IG liver biopsy, was studied to identify inefficient and redundant processes and scattered data in clinical systems, and to improve the overall quality of surgical procedures for the patient.

  11. Petri Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how to implement access control in a workflow system. We give a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets. It first gives the definition and description of the workflow, and then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.
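
    A minimal sketch of the kind of check such a model implies, with invented user, role and task names: a user may fire a task only if one of their roles is authorised for it and the task is enabled under the current marking of the net.

        roles = {"alice": {"clerk"}, "bob": {"manager"}}
        task_roles = {"submit_order": {"clerk"}, "approve_order": {"manager"}}
        enabled = {"submit_order"}       # tasks enabled in the current marking

        def may_fire(user, task):
            return task in enabled and bool(roles[user] & task_roles[task])

        print(may_fire("alice", "submit_order"))   # True: role ok, task enabled
        print(may_fire("bob", "approve_order"))    # False: task not yet enabled
        print(may_fire("bob", "submit_order"))     # False: role check fails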

  12. Statistical modeling and recognition of surgical workflow.

    Science.gov (United States)

    Padoy, Nicolas; Blum, Tobias; Ahmadi, Seyed-Ahmad; Feussner, Hubertus; Berger, Marie-Odile; Navab, Nassir

    2012-04-01

    In this paper, we contribute to the development of context-aware operating rooms by introducing a novel approach to modeling and monitoring the workflow of surgical interventions. We first propose a new representation of interventions in terms of multidimensional time-series formed by synchronized signals acquired over time. We then introduce methods based on Dynamic Time Warping and Hidden Markov Models to analyze and process this data. This results in workflow models combining low-level signals with high-level information such as predefined phases, which can be used to detect actions and trigger an event. Two methods are presented to train these models, using either fully or partially labeled training surgeries. Results are given based on tool usage recordings from sixteen laparoscopic cholecystectomies performed by several surgeons.
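
    The Dynamic Time Warping ingredient mentioned above can be shown compactly. This is textbook DTW over multi-channel signals, not the authors' full system (which adds Hidden Markov Models and phase labels); the channel count and signal lengths are invented.

        import numpy as np

        def dtw_distance(x, y):
            """Align two recordings x (n, d) and y (m, d) of d synchronized signals."""
            n, m = len(x), len(y)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = np.linalg.norm(x[i - 1] - y[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        a = np.random.rand(50, 8)   # e.g. 8 tool-usage channels over time
        b = np.random.rand(60, 8)   # a second surgery of different duration
        print(dtw_distance(a, b))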

  13. Research on an Integrated Enterprise Workflow Model

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    An integrated enterprise workflow model called PPROCE is first presented. Then, an enterprise ontology established by TOVE and the Process Specification Language (PSL) is studied. Combined with TOVE's partition idea, PSL is extended and new PSL extensions are created to define the ontology of process, organization, resource and product in the PPROCE model. As a result, the PPROCE model can be defined by a set of corresponding formal languages. This facilitates future work not only in model verification, model optimization and model simulation, but also in model translation.

  14. Rapid Energy Modeling Workflow Demonstration

    Science.gov (United States)

    2013-10-31

    ...sustainable building. Models produced through the REM process can be updated and accessed continually, thus allowing energy managers to continuously explore... time and cost of audits... Review the energy analysis findings under the High Performance and Sustainable Building Guiding Principles Compliance...

  15. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction to the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which help to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users specify their requirements for a workflow management system.

  16. Applying direct observation to model workflow and assess adoption.

    Science.gov (United States)

    Unertl, Kim M; Weinger, Matthew B; Johnson, Kevin B

    2006-01-01

    Lack of understanding about workflow can impair health IT system adoption. Observational techniques can provide valuable information about clinical workflow. A pilot study using direct observation was conducted in an outpatient chronic disease clinic. The goals of the study were to assess workflow and information flow and to develop a general model of workflow and information behavior. Over 55 hours of direct observation showed that the pilot site utilized many of the features of the informatics systems available to them, but also employed multiple non-electronic artifacts and workarounds. Gaps existed between clinic workflow and informatics tool workflow, as well as between institutional expectations of informatics tool use and actual use. Concurrent use of both paper-based and electronic systems resulted in duplication of effort and inefficiencies. A relatively short period of direct observation revealed important information about workflow and informatics tool adoption.

  17. A Formal Model For Declarative Workflows

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao

    Current business process technology is pretty good at supporting well-structured business processes which aim at achieving a fixed goal by carrying out an exact set of operations. In contrast, the exact operations needed to fulfill a business process/workflow may not always be possible to foresee...... it as a general formal model for specification and execution of declarative, event-based business processes, as a generalization of a concurrency model, the classic event structures. The model allows for an intuitive operational semantics and mapping of execution state by a notion of markings of the graphs and we...... the declarative nature of the projected graphs (which are also DCR graphs). We have also provided semantics for distributed executions based on synchronous communication among networks of projected graphs and proved that global and distributed executions are equivalent. Further, to support modeling of processes......

  18. Design, Modelling and Analysis of a Workflow Reconfiguration

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Abouzaid, Faisal; Dragoni, Nicola

    2011-01-01

    This paper describes a case study involving the reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system’s design in BPMN. We then use an asynchronous pi-calculus and Web.1 to model the design and to ve......

  19. Model Checking Workflow Net Based on Petri Net

    Institute of Scientific and Technical Information of China (English)

    ZHOU Conghua; CHEN Zhenyu

    2006-01-01

    Soundness is a very important criterion for the correctness of a workflow. Specifying soundness in Computation Tree Logic (CTL) allows us to verify soundness with symbolic model checkers, so the state explosion problem in verifying soundness can be overcome efficiently. When the property is not satisfied by the system, model checking can give a counter-example, which can guide us in correcting the workflow. In addition, relaxed soundness is another important criterion for workflows. We also prove that Computation Tree Logic* (CTL*) can be used to characterize the relaxed soundness of a workflow.
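
    The property being checked can be illustrated on a toy net. The snippet enumerates the reachable markings of a two-transition, 1-safe workflow net and tests the "option to complete" part of soundness, which CTL expresses as AG EF final; a symbolic model checker establishes the same thing without explicit enumeration. The net and place names are ours.

        net = {"t1": ({"i"}, {"p"}),   # transition: (input places, output places)
               "t2": ({"p"}, {"o"})}
        initial, final = frozenset({"i"}), frozenset({"o"})

        def successors(m):
            return [frozenset((m - pre) | post)
                    for pre, post in net.values() if pre <= m]

        def reachable(start):
            seen, todo = {start}, [start]
            while todo:
                for s in successors(todo.pop()):
                    if s not in seen:
                        seen.add(s)
                        todo.append(s)
            return seen

        # from every reachable marking, the final marking must stay reachable
        sound = all(final in reachable(m) for m in reachable(initial))
        print("option to complete:", sound)   # True for this net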

  20. A workflow learning model to improve geovisual analytics utility.

    Science.gov (United States)

    Roth, Robert E; Maceachren, Alan M; McCabe, Craig A

    2009-01-01

    INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on

  1. Modelling and analysis of workflow for lean supply chains

    Science.gov (United States)

    Ma, Jinping; Wang, Kanliang; Xu, Lida

    2011-11-01

    Cross-organisational workflow systems are a component of enterprise information systems which support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling, without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of the lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of a cross-organisational workflow net according to the idea of the LSC and then discusses the standardisation of collaborative business processes between organisations in the context of the LSC. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is also defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and an approach to verifying the soundness of LTWNs and CLTWNs. Finally, this article illustrates the use of the proposed method with a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for the LSC. This study initiates a new perspective of research on cross-organisational workflow management and promotes the operation management of LSC in real-world settings.

  2. Modeling Workflow Management in a Distributed Computing System ...

    African Journals Online (AJOL)

    Modeling Workflow Management in a Distributed Computing System Using Petri Nets. ... who use it to share information more rapidly and increases their productivity. ... Petri nets are an established tool for modelling and analyzing processes.

  3. Approach for workflow modeling using π-calculus

    Institute of Scientific and Technical Information of China (English)

    杨东; 张申生

    2003-01-01

    As a variant of process algebra, π-calculus can describe the interactions between evolving processes. By modeling activity as a process interacting with other processes through ports, this paper presents a new approach: representing workflow models using π-calculus. As a result, the model can characterize the dynamic behaviors of the workflow process in terms of the LTS (Labeled Transition Semantics) semantics of π-calculus. The main advantage of the workflow model's formal semantics is that it allows for verification of the model's properties, such as deadlock-freeness and normal termination. Moreover, the equivalence of workflow models can be checked through the weak bisimulation theorem in the π-calculus, thus facilitating the optimization of business processes.

  4. Task-driven equipment inspection system based on safe workflow model

    Science.gov (United States)

    Guo, Xinyou; Liu, Yangguang

    2010-12-01

    An equipment inspection system is one that contains a number of equipment queues served in cyclic order. In order to satisfy the multi-task scheduling and multi-task combination requirements of equipment inspection systems, we propose a model based on inspection workflow in this paper. On the one hand, the model organizes all kinds of equipment according to the inspection workflow, elemental work units according to inspection tasks, and combination elements according to tasks defined by users. We propose a 3-dimensional workflow model for equipment inspection systems, including an organization sub-model, a process sub-model and a data sub-model. On the other hand, the model is based on security authorization, which is defined by the relations between roles, tasks, pre-defined business workflows and inspection data. The system based on the proposed framework is safe and efficient. Our implementation shows that, in terms of basic performance, the system is easy to operate and manage.

  5. A process model for work-flow management in construction

    OpenAIRE

    Jongeling, Rogier

    2006-01-01

    This thesis describes a novel approach for management of work-flow in construction, based on the combination of location-based scheduling and modelling with 4D CAD. Construction planners need to carefully design a process that ensures a continuous and reliable flow of resources through different locations in a project. The flow of resources through locations, termed work-flow, and the resultant ability to control the hand-over between both locations and crews, greatly empowers the management ...

  6. A Family of RBAC-Based Workflow Authorization Models

    Institute of Scientific and Technical Information of China (English)

    HONG Fan; XING Guang-lin

    2005-01-01

    A family of RBAC-based workflow authorization models, called RWAM, is proposed. RWAM consists of a basic model and other models constructed from the basic model. The basic model provides the notion of temporal permission, which means that a user can perform a certain operation on a task only for a time interval; this not only ensures that only authorized users can execute a task but also ensures that the authorization flow is synchronized with the workflow. The two advanced models of RWAM deal with role hierarchy and constraints respectively. RWAM ranges from simple to complex and provides a general reference model for other research and development in this area.
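
    The notion of temporal permission can be sketched directly; role and task names are invented and the real RWAM definitions are richer.

        from datetime import datetime, timedelta

        class TemporalPermission:
            """Permission valid only inside the task's activation interval."""
            def __init__(self, role, task, start, duration):
                self.role, self.task = role, task
                self.start, self.end = start, start + duration
            def allows(self, role, task, at):
                return (role, task) == (self.role, self.task) \
                       and self.start <= at <= self.end

        perm = TemporalPermission("reviewer", "approve",
                                  datetime.now(), timedelta(hours=2))
        print(perm.allows("reviewer", "approve", datetime.now()))  # True
        print(perm.allows("reviewer", "approve",
                          datetime.now() + timedelta(days=1)))     # False: expired

    Tying the interval to the task's activation is what keeps the authorization flow synchronized with the workflow, as the abstract puts it.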

  7. Task Delegation Based Access Control Models for Workflow Systems

    Science.gov (United States)

    Gaaloul, Khaled; Charoy, François

    Processes in e-Government organisations are facilitated and conducted using workflow management systems. Role-based access control (RBAC) is recognised as an efficient access control model for large organisations. The application of RBAC in workflow systems cannot, however, grant permissions to users dynamically while business processes are being executed. We currently observe a move away from predefined strict workflow modelling towards approaches supporting flexibility at the organisational level. One specific approach is that of task delegation. Task delegation is a mechanism that supports organisational flexibility and ensures delegation of authority in access control systems. In this paper, we propose a Task-oriented Access Control (TAC) model based on RBAC to address these requirements. We aim to reason about tasks from both organisational and resource perspectives to analyse and specify authorisation constraints. Moreover, we present a fine-grained access control protocol to support delegation based on the TAC model.

  8. A Hybrid Authorization Model For Project-Oriented Workflow

    Institute of Scientific and Technical Information of China (English)

    Zhang Xiaoguang(张晓光); Cao Jian; Zhang Shensheng

    2003-01-01

    In the context of workflow systems, security-relevant aspect is related to the assignment of activities to (human or automated) agents. This paper intends to cast light on the management of project-oriented workflow. A comprehensive authorization model is proposed from the perspective of project management. In this model, the concept of activity decomposition and team is introduced, which improves the security of conventional role-based access control. Furthermore, policy is provided to define the static and dynamic constraints such as Separation of Duty (SoD). Validity of constraints is proposed to provide a fine-grained assignment, which improves the performance of policy management. The model is applicable not only to project-oriented workflow applications but also to other teamwork environments such as virtual enterprise.

  9. A HYBRID PETRI-NET MODEL OF GRID WORKFLOW

    Institute of Scientific and Technical Information of China (English)

    Ji Yimu; Wang Ruchuan; Ren Xunyi

    2008-01-01

    In order to effectively control the random tasks submitted and executed in a grid workflow, a grid workflow model based on hybrid Petri nets is presented. This model is composed of a random Petri net, a colored Petri net and a general Petri net. The random Petri net declares the relationship between the number of grid users' random tasks and the size of the service window, and computes the server intensity of the grid system. The colored Petri net sets a different color for each place associated with a grid service and provides valid interfaces for grid resource allocation and task scheduling. The experiment indicated that the model presented in this letter can compute the relation between the number of users' random tasks and the size of the grid service window in a grid workflow management system.
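
    The relation between random task arrivals and the size of the service window is, in standard queueing terms, a traffic-intensity calculation. The formula below is the usual rho = lambda / (c * mu) and is offered as an illustrative stand-in for, not a reproduction of, the paper's expression.

        def server_intensity(arrival_rate, service_rate, window_size):
            """rho < 1 is needed for the queue of random tasks to stay stable."""
            return arrival_rate / (window_size * service_rate)

        # e.g. 12 tasks/min arriving, each server finishing 5 tasks/min
        for c in (2, 3, 4):
            print(c, server_intensity(12, 5, c))
        # 1.2, 0.8, 0.6 -> a window of at least 3 servers keeps rho below 1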

  10. Workflow modeling in the graphic arts and printing industry

    Science.gov (United States)

    Tuijn, Chris

    2003-12-01

    In the last few years, a lot of effort has been spent on the standardization of the workflow in the graphic arts and printing industry. The main reasons for this standardization are two-fold: first, the need to represent all aspects of products, processes and resources in a uniform, digital framework and, second, the need to have different systems communicate with each other without having to implement dedicated drivers or protocols. For many years, a number of organizations in the IT sector have been busy developing models and languages on the topic of workflow modeling. In addition to the more formal methods (such as, e.g., extended finite state machines, Petri nets, Markov chains, etc.) introduced a number of decades ago, more pragmatic methods have been proposed quite recently. We think here in particular of the activities of the Workflow Management Coalition that resulted in an XML-based Process Definition Language. Although one might be tempted to use the already established standards in the graphic environment, one should be well aware of the complexity and uniqueness of the graphic arts workflow. In this paper, we will show that it is quite hard, though not impossible, to model the graphic arts workflow using the already established workflow systems. After a brief summary of the graphic arts workflow requirements, we will show why the traditional models are less suitable to use. It will turn out that one of the main reasons for the incompatibility is that the graphic arts workflow is primarily resource driven; this means that the activation of processes depends on the status of different incoming resources. The fact that processes can start running with a partial availability of the input resources is a further complication that asks for additional knowledge at the process level. In the second part of this paper, we will discuss in more detail the different software components that are available in any graphic enterprise. In the last part, we will

  11. A Workflow-Oriented Approach To Propagation Models In Heliophysics

    Directory of Open Access Journals (Sweden)

    Gabriele Pierantoni

    2014-01-01

    The Sun is responsible for the eruption of billions of tons of plasma and the generation of near light-speed particles that propagate throughout the solar system and beyond. If directed towards Earth, these events can be damaging to our technological infrastructure. Hence there is an effort to understand the cause of the eruptive events and how they propagate from Sun to Earth. However, the physics governing their propagation is not well understood, so there is a need to develop a theoretical description of their propagation, known as a Propagation Model, in order to predict when they may impact Earth. It is often difficult to define a single propagation model that correctly describes the physics of solar eruptive events, and even more difficult to implement models capable of catering for all these complexities and to validate them using real observational data. In this paper, we envisage that workflows offer both a theoretical and practical framework for a novel approach to propagation models. We define a mathematical framework that aims at encompassing the different modalities with which workflows can be used, and provide a set of generic building blocks written in the TAVERNA workflow language that users can use to build their own propagation models. Finally we test both the theoretical model and the composite building blocks of the workflow with a real Science Use Case that was discussed during the 4th CDAW (Coordinated Data Analysis Workshop) event held by the HELIO project. We show that generic workflow building blocks can be used to construct a propagation model that successfully describes the transit of solar eruptive events toward Earth and predicts a correct Earth-impact time.

  12. Integrated Enterprise Modeling Method Based on Workflow Model and Multiviews

    Institute of Scientific and Technical Information of China (English)

    林慧苹; 范玉顺; 吴澄

    2001-01-01

    Many enterprise modeling methods are proposed to model the business process of enterprises and to implement CIM systems. But difficulties are still encountered when these methods are applied to CIM system design and implementation. This paper proposes a new integrated enterprise modeling methodology based on the workflow model. The system architecture and the integrated modeling environment are described with a new simulation strategy. The modeling process and the relationship between the workflow model and the views are discussed.

  13. Research on Architecture of Enterprise Modeling in Workflow System

    Institute of Scientific and Technical Information of China (English)

    李伟平; 齐慧彬; 薛劲松; 朱云龙

    2002-01-01

    The market an enterprise faces is changing and cannot be forecast accurately in this information age. In order to find chances in the market, practitioners have focused on business processes through re-engineering programmes to improve enterprise efficiency. It is necessary to manage an enterprise using process-based methods to meet the requirements of enhancing work efficiency and the ability to compete in the market. Information system developers have emphasized the use of standard models to accelerate the speed of configuration and implementation of integrated systems for enterprises. So we have to model an enterprise with a process-based modeling method. An architecture of enterprise modeling is presented in this paper. This architecture is composed of four views and supports the whole lifecycle of the enterprise model. Because workflow management systems are based on process definitions, this architecture can be directly used in a workflow management system. The implementation method of this model is thoroughly described, and workflow management software supporting the building and running of the model is also given.

  14. Towards an integrated workflow for structural reservoir model updating and history matching

    NARCIS (Netherlands)

    Leeuwenburgh, O.; Peters, E.; Wilschut, F.

    2011-01-01

    A history matching workflow, as typically used for updating of petrophysical reservoir model properties, is modified to include structural parameters including the top reservoir and several fault properties: position, slope, throw and transmissibility. A simple 2D synthetic oil reservoir produced by

  15. Workflow Modelling and Analysis Based on the Construction of Task Models

    Directory of Open Access Journals (Sweden)

    Glória Cravo

    2015-01-01

    In this paper we describe the structure of a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. To each task an input/output logic operator is associated. Furthermore, we associate a Boolean term with each transition present in the workflow. We also identify the structure of workflows and describe their dynamism through the construction of new task models. This construction is very simple and intuitive, since it is based on the analysis of all tasks present in the workflow, which allows us to describe the dynamism of the workflow very easily. Our approach thus has the advantage of being very intuitive, which is an important highlight of our work. We also introduce the concept of logical termination of workflows and provide conditions under which this property is valid. Finally, we provide a counter-example which shows that a conjecture presented in a previous article is false.
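
    A weak stand-in for the construction described above, with invented task names: the workflow as a graph of tasks carrying an output logic operator, and a recursive check that the output task remains reachable, loosely approximating the paper's notion of logical termination.

        tasks = {
            "start": ("AND", ["a", "b"]),   # (output operator, successor tasks)
            "a":     ("AND", ["end"]),
            "b":     ("AND", ["end"]),
            "end":   ("AND", []),
        }

        def reaches_end(task, seen=frozenset()):
            if task == "end":
                return True
            op, succs = tasks[task]
            if task in seen or not succs:
                return False
            combine = all if op == "AND" else any   # XOR: one branch suffices
            return combine(reaches_end(s, seen | {task}) for s in succs)

        print("can terminate logically:", reaches_end("start"))   # True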

  16. The medical simulation markup language - simplifying the biomechanical modeling workflow.

    Science.gov (United States)

    Suwelack, Stefan; Stoll, Markus; Schalck, Sebastian; Schoch, Nicolai; Dillmann, Rüdiger; Bendl, Rolf; Heuveline, Vincent; Speidel, Stefanie

    2014-01-01

    Modeling and simulation of the human body by means of continuum mechanics has become an important tool in diagnostics, computer-assisted interventions and training. This modeling approach seeks to construct patient-specific biomechanical models from tomographic data. Usually many different tools such as segmentation and meshing algorithms are involved in this workflow. In this paper we present a generalized and flexible description for biomechanical models. The unique feature of the new modeling language is that it not only describes the final biomechanical simulation, but also the workflow how the biomechanical model is constructed from tomographic data. In this way, the MSML can act as a middleware between all tools used in the modeling pipeline. The MSML thus greatly facilitates the prototyping of medical simulation workflows for clinical and research purposes. In this paper, we not only detail the XML-based modeling scheme, but also present a concrete implementation. Different examples highlight the flexibility, robustness and ease-of-use of the approach.

  17. A Workflow for Global Sensitivity Analysis of PBPK Models

    Directory of Open Access Journals (Sweden)

    Kevin eMcNally

    2011-06-01

    Physiologically based pharmacokinetic (PBPK) models have a potentially significant role in the development of a reliable predictive toxicity testing strategy. The structures of PBPK models are ideal frameworks into which disparate in vitro and in vivo data can be integrated and utilised to translate information generated using alternatives to animal measures of toxicity, together with human biological monitoring data, into plausible corresponding exposures. However, these models invariably include descriptions of well-known non-linear biological processes, such as enzyme saturation, and interactions between parameters, such as organ mass and body mass. Therefore, an appropriate sensitivity analysis technique is required which can quantify the influences associated with individual parameters, interactions between parameters and any non-linear processes. In this report we have defined a workflow for sensitivity analysis of PBPK models that is computationally feasible, accounts for interactions between parameters, and can be displayed in the form of a bar chart and cumulative sum line (Lowry plot), which we believe is intuitive and appropriate for toxicologists, risk assessors and regulators.
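
    One computation such a workflow rests on is the estimation of first-order sensitivity indices. The snippet estimates Var(E[Y|Xi])/Var(Y) by simple binning on a toy saturable model; the paper's workflow uses a proper global method and also produces the Lowry plot, neither of which is reproduced here, and the parameter names and ranges are invented.

        import numpy as np

        def model(x):                           # toy stand-in for a PBPK output
            km, vmax, mass = x.T
            return vmax * mass / (km + mass)    # saturable, interacting terms

        rng = np.random.default_rng(0)
        X = rng.uniform([0.5, 1.0, 50.0], [2.0, 5.0, 100.0], size=(20000, 3))
        Y = model(X)

        for i, name in enumerate(["Km", "Vmax", "mass"]):
            edges = np.quantile(X[:, i], np.linspace(0, 1, 21))
            idx = np.digitize(X[:, i], edges[1:-1])            # 20 equal bins
            cond_means = [Y[idx == b].mean() for b in range(20)]
            S1 = np.var(cond_means) / Y.var()   # Var(E[Y|Xi]) / Var(Y)
            print(f"S1({name}) ~ {S1:.2f}")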

  18. Layered Workflow Process Model Based on Extended Synchronizer

    Directory of Open Access Journals (Sweden)

    Gang Ni

    2014-07-01

    The layered workflow process model provides a modeling approach and analysis for key processes with Petri nets. It not only describes the relation between the business flow process and transition nodes clearly, but also limits the rapid increase in the scale of places, transitions and directed arcs. This paper studies processes such as reservation and complaint handling in information management systems, especially the multi-merge and discriminator patterns, which cannot be directly modeled with existing synchronizers. Petri nets are adopted to provide a formal description of the workflow patterns, and the relation between arcs and weight classes is also analyzed. We use the number of incoming and outgoing arcs to classify workflows into three synchronous modes: fully synchronous mode, competition synchronous mode and asynchronous mode. Types and parameters for synchronization are added to extend the modeling ability of the synchronizers, and the synchronous distance is also expanded. The extended synchronizers have the ability to terminate branches automatically or to activate the next link many times, in addition to the abilities of the original synchronizers. Analysis of key business cases verifies that the original synchronizers cannot model these patterns directly, while the extended synchronizers based on Petri nets can model the multi-merge and discriminator patterns.

  19. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee

    2016-01-01

    This article presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults...

  1. Automated evolutionary restructuring of workflows to minimise errors via stochastic model checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents a framework for the automated restructuring of workflows that allows one to minimise the impact of errors on a production workflow. The framework allows for the modelling of workflows by means of a formalised subset of the Business Process Modelling and Notation (BPMN) language...

  2. Coupling of a continuum ice sheet model and a discrete element calving model using a scientific workflow system

    Science.gov (United States)

    Memon, Shahbaz; Vallot, Dorothée; Zwinger, Thomas; Neukirchen, Helmut

    2017-04-01

    Scientific communities generate complex simulations through the orchestration of semi-structured analysis pipelines, which involves the execution of large workflows on multiple, distributed and heterogeneous computing and data resources. Modeling the ice dynamics of glaciers requires workflows consisting of many non-trivial, computationally expensive processing tasks which are coupled to each other. From this domain, we present an e-Science use case, a workflow, which requires the execution of a continuum ice flow model and a discrete element based calving model in an iterative manner. Apart from the execution, this workflow also contains data format conversion tasks that support the execution of ice flow and calving by means of transitions through sequential, nested and iterative steps. Thus, the management and monitoring of all the processing tasks, including data management and transfer, of the workflow model becomes more complex. From the implementation perspective, this workflow model was initially developed as a set of scripts using static data input and output references. In the course of application usage, when more scripts or modifications were introduced as per user requirements, the debugging and validation of results became more cumbersome to achieve. To address these problems, we identified a need for a high-level scientific workflow tool through which all the above mentioned processes can be achieved in an efficient and usable manner. We decided to make use of the e-Science middleware UNICORE (Uniform Interface to Computing Resources), which allows seamless and automated access to different heterogeneous and distributed resources and is supported by a scientific workflow engine. Based on this, we developed a high-level scientific workflow model for coupling of massively parallel High-Performance Computing (HPC) jobs: a continuum ice sheet model (Elmer/Ice) and a discrete element calving and crevassing model (HiDEM). In our talk we present how the use of a high
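
    The iterative coupling loop at the heart of this workflow can be sketched as a dry run. Command and file names are placeholders; in the actual setup each step is submitted as an HPC job through UNICORE rather than executed locally.

        def run(cmd):
            print("submitting:", cmd)   # dry run; a real driver would execute it

        steps = ["ElmerSolver ice_flow.sif",   # continuum ice flow (Elmer/Ice)
                 "python elmer_to_hidem.py",   # data format conversion task
                 "HiDEM calving.inp",          # discrete-element calving (HiDEM)
                 "python hidem_to_elmer.py"]   # feed updated geometry back

        for cycle in range(3):                 # iterative coupling of the models
            for cmd in steps:
                run(cmd)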

  3. CSPBuilder - CSP based Scientific Workflow Modelling

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard; Vinter, Brian

    2008-01-01

    This paper introduces a framework for building CSP based applications, targeted for clusters and next generation CPU designs. CPUs are produced with several cores today and every future CPU generation will feature increasingly more cores, resulting in a requirement for concurrency that has...... not previously been called for. The framework is CSP presented as a scientific workflow model, specialized for scientific computing applications. The purpose of the framework is to enable scientists to exploit large parallel computation resources, which has previously been hard due to the difficulty of concurrent...... programming using threads and locks....

  4. Design Tools and Workflows for Braided Structures

    DEFF Research Database (Denmark)

    Vestartas, Petras; Heinrich, Mary Katherine; Zwierzycki, Mateusz

    2017-01-01

    and merits of our method, demonstrated through four example design and analysis workflows. The workflows frame specific aspects of enquiry for the ongoing research project flora robotica. These include modelling target geometries, automatically producing instructions for fabrication, conducting structural...

  5. Approach for workflow modeling using π-calculus

    Institute of Scientific and Technical Information of China (English)

    杨东; 张申生

    2003-01-01

    As a variant of process algebra, π-calculus can describe the interactions between evolving processes. By modeling activity as a process interacting with other processes through ports, this paper presents a new approach: representing workflow models using π-calculus. As a result, the model can characterize the dynamic behaviors of the workflow process in terms of the LTS (Labeled Transition Semantics) semantics of π-calculus. The main advantage of the workflow model's formal semantics is that it allows for verification of the model's properties, such as deadlock-freeness and normal termination. Moreover, the equivalence of workflow models can be checked through the weak bisimulation theorem in the π-calculus, thus facilitating the optimization of business processes.

  6. Making Sense of Complexity with FRE, a Scientific Workflow System for Climate Modeling (Invited)

    Science.gov (United States)

    Langenhorst, A. R.; Balaji, V.; Yakovlev, A.

    2010-12-01

    A workflow is a description of a sequence of activities that is both precise and comprehensive. Capturing the workflow of climate experiments provides a record which can be queried or compared, and allows reproducibility of the experiments - sometimes even to the bit level of the model output. This reproducibility helps to verify the integrity of the output data, and enables easy perturbation experiments. GFDL's Flexible Modeling System Runtime Environment (FRE) is a production-level software project which defines and implements building blocks of the workflow as command line tools. The scientific, numerical and technical input needed to complete the workflow of an experiment is recorded in an experiment description file in XML format. Several key features add convenience and automation to the FRE workflow: ● Experiment inheritance makes it possible to define a new experiment with only a reference to the parent experiment and the parameters to override. ● Testing is a basic element of the FRE workflow: experiments define short test runs which are verified before the main experiment is run, and a set of standard experiments is verified with new code releases. ● FRE is flexible enough to support everything from short runs with mere megabytes of data to high-resolution experiments that run on thousands of processors for months, producing terabytes of output data. Experiments run in segments of model time; after each segment, the state is saved and the model can be checkpointed at that level. Segment length is defined by the user, but the number of segments per system job is calculated to fit optimally within the batch scheduler requirements. FRE provides job control across multiple segments, and tools to monitor and alter the state of long-running experiments. ● Experiments are entered into a Curator Database, which stores queryable metadata about the experiment and the experiment's output. ● FRE includes a set of standardized post-processing functions as well as the ability
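
    The experiment-inheritance feature can be illustrated with a toy resolver. The real FRE experiment description is XML and far richer; the dictionary form and field names here are ours.

        experiments = {
            "control":    {"parent": None,      "resolution": "1deg", "years": 100},
            "hires_test": {"parent": "control", "resolution": "0.25deg"},
        }

        def resolve(name):
            """Merge an experiment with its parent chain, child values winning."""
            exp = experiments[name]
            base = resolve(exp["parent"]) if exp["parent"] else {}
            return {**base, **{k: v for k, v in exp.items() if k != "parent"}}

        print(resolve("hires_test"))   # {'resolution': '0.25deg', 'years': 100}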

  7. geoKepler Workflow Module for Computationally Scalable and Reproducible Geoprocessing and Modeling

    Science.gov (United States)

    Cowart, C.; Block, J.; Crawl, D.; Graham, J.; Gupta, A.; Nguyen, M.; de Callafon, R.; Smarr, L.; Altintas, I.

    2015-12-01

    The NSF-funded WIFIRE project has developed an open-source, online geospatial workflow platform for unifying geoprocessing tools and models for fire and other geospatially dependent modeling applications. It is a product of WIFIRE's objective to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. geoKepler includes a set of reusable GIS components, or actors, for the Kepler Scientific Workflow System (https://kepler-project.org). Actors exist for reading and writing GIS data in formats such as Shapefile, GeoJSON, KML, and using OGC web services such as WFS. The actors also allow for calling geoprocessing tools in other packages such as GDAL and GRASS. Kepler integrates functions from multiple platforms and file formats into one framework, thus enabling optimal GIS interoperability, model coupling, and scalability. Products of the GIS actors can be fed directly to models such as FARSITE and WRF. Kepler's ability to schedule and scale processes using Hadoop and Spark also makes geoprocessing ultimately extensible and computationally scalable. The reusable workflows in geoKepler can be made to run automatically when alerted by real-time environmental conditions. Here, we show breakthroughs in the speed of creating complex data for hazard assessments with this platform. We also demonstrate geoKepler workflows that use Data Assimilation to ingest real-time weather data into wildfire simulations, and data mining techniques to gain insight into environmental conditions affecting fire behavior. Existing machine learning tools and libraries such as R and MLlib are being leveraged for this purpose in Kepler, as well as Kepler's Distributed Data Parallel (DDP) capability to provide a framework for scalable processing. geoKepler workflows can be executed via an iPython notebook as a part of a Jupyter hub at UC San Diego for sharing and reporting of the scientific analysis and results from

  8. A Model of Workflow-oriented Attributed Based Access Control

    Directory of Open Access Journals (Sweden)

    Guoping Zhang

    2011-02-01

    The emergence of the "Internet of Things" (IoT) breaks previous traditional thinking by integrating physical infrastructure and network infrastructure into a unified infrastructure. There will be a lot of resources and information in the IoT, so the computing and processing of information is the core support of the IoT. In this paper, we introduce "Service-Oriented Computing" (SOC) to solve the problem, so that each device can offer its functionality as standard services. Here we mainly discuss the access control issue of service-oriented computing in the Internet of Things. This paper puts forward a model of Workflow-oriented Attributed Based Access Control (WABAC) and designs an access control framework based on the WABAC model. The model grants permissions to subjects according to subject attributes, resource attributes, environment attributes and the current task, meeting the access control requirements of SOC. Using the presented approach can effectively enhance access control security for SOC applications and prevent the abuse of subject permissions.
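
    A WABAC-style decision can be sketched as a predicate over the four inputs the abstract names: subject, resource and environment attributes plus the current task. The rule content and attribute names are invented.

        def permit(subject, resource, environment, current_task):
            rules = [
                lambda s, r, e, t: s["role"] == "operator"
                                   and r["type"] == "sensor"
                                   and e["network"] == "internal"
                                   and t == "read_measurements",
            ]
            return any(rule(subject, resource, environment, current_task)
                       for rule in rules)

        print(permit({"role": "operator"}, {"type": "sensor"},
                     {"network": "internal"}, "read_measurements"))   # True
        print(permit({"role": "operator"}, {"type": "sensor"},
                     {"network": "public"}, "read_measurements"))     # False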

  9. Leveraging an existing data warehouse to annotate workflow models for operations research and optimization.

    Science.gov (United States)

    Borlawsky, Tara; LaFountain, Jeanne; Petty, Lynda; Saltz, Joel H; Payne, Philip R O

    2008-11-06

    Workflow analysis is frequently performed in the context of operations research and process optimization. In order to develop a data-driven workflow model that can be employed to assess opportunities to improve the efficiency of perioperative care teams at The Ohio State University Medical Center (OSUMC), we have developed a method for integrating standard workflow modeling formalisms, such as UML activity diagrams, with data-centric annotations derived from our existing data warehouse.

  10. Enhancing population pharmacokinetic modeling efficiency and quality using an integrated workflow.

    Science.gov (United States)

    Schmidt, Henning; Radivojevic, Andrijana

    2014-08-01

    Population pharmacokinetic (popPK) analyses are at the core of Pharmacometrics and need to be performed regularly. Although these analyses are relatively standard, a large variability can be observed in both the time (efficiency) and the way they are performed (quality). Main reasons for this variability include the level of experience of a modeler, personal preferences and tools. This paper aims to examine how the process of popPK model building can be supported in order to increase its efficiency and quality. The presented approach to the conduct of popPK analyses is centered around three key components: (1) identification of the most common and important popPK model features, (2) required information content and formatting of the data for modeling, and (3) methodology, workflow and workflow-supporting tools. This approach has been used in several popPK modeling projects and a documented example is provided in the supplementary material. Efficiency of model building is improved by avoiding repetitive coding and other labor-intensive tasks and by putting the emphasis on a fit-for-purpose model. Quality is improved by ensuring that the workflow and tools are in alignment with a popPK modeling guidance established within an organization. The main conclusion of this paper is that workflow-based approaches to popPK modeling are feasible and have significant potential to ameliorate its various aspects. However, the implementation of such an approach in a pharmacometric organization requires openness towards innovation and change: the key ingredient for the evolution of integrative and quantitative drug development in the pharmaceutical industry.

  11. Robust Workflow Systems + Flexible Geoprocessing Services = Geo-enabled Model Web?

    OpenAIRE

    GRANELL CANUT CARLOS

    2013-01-01

    The chapter begins by briefly exploring the concept of modeling in the geosciences, which notably benefits from advances in the integration of geoprocessing services and workflow systems. In section 3, we provide a comprehensive background on the technology trends we treat in the chapter. On the one hand, we deal with workflow systems, normally categorized in the literature as scientific and business workflow systems (Barga and Gannon 2007). In particular, we introduce some prominent examples of scient...

  12. Design and Implementation of Visualized Workflow Modeling System Based on B/S Structure

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; LI wei-li

    2007-01-01

    According to the need for flexible workflow management systems, a solution for setting up a visualized workflow modelling system based on the B/S structure is put forward, which conforms to the relevant specifications of the WfMC and the workflow process definition meta-model. The design of the system structure is presented in detail, and the key technologies for system implementation are also introduced. Additionally, an example is given to demonstrate the validity of the system.

  13. VisTrails SAHM: visualization and workflow management for species habitat modeling

    Science.gov (United States)

    Morisette, Jeffrey T.; Jarnevich, Catherine S.; Holcombe, Tracy R.; Talbert, Colin B.; Ignizio, Drew; Talbert, Marian K.; Silva, Claudio; Koop, David; Swanson, Alan; Young, Nicholas E.

    2013-01-01

    The Software for Assisted Habitat Modeling (SAHM) has been created to both expedite habitat modeling and help maintain a record of the various input data, pre- and post-processing steps and modeling options incorporated in the construction of a species distribution model through the established workflow management and visualization VisTrails software. This paper provides an overview of the VisTrails:SAHM software including a link to the open source code, a table detailing the current SAHM modules, and a simple example modeling an invasive weed species in Rocky Mountain National Park, USA.

  14. Next-Generation Climate Modeling Science Challenges for Simulation, Workflow and Analysis Systems

    Science.gov (United States)

    Koch, D. M.; Anantharaj, V. G.; Bader, D. C.; Krishnan, H.; Leung, L. R.; Ringler, T.; Taylor, M.; Wehner, M. F.; Williams, D. N.

    2016-12-01

    We will present two examples of current and future high-resolution climate-modeling research that challenge existing simulation run-time I/O, model-data movement, storage and publishing, and analysis. In each case, we will consider lessons learned as current workflow systems are broken by these large-data science challenges, as well as strategies to repair or rebuild the systems. First we consider the science and workflow challenges to be posed by the CMIP6 multi-model HighResMIP, involving around a dozen modeling groups performing quarter-degree simulations, in 3-member ensembles for 100 years, with high-frequency (1-6 hourly) diagnostics, which is expected to generate over 4PB of data. An example of science derived from these experiments will be to study how resolution affects the ability of models to capture extreme events such as hurricanes or atmospheric rivers. The methods the DOE CASCADE project expects to use to transfer (via parallel Globus) and analyze (via parallel "TECA" software tools) HighResMIP data for such feature tracking will be presented. A second example will be from the Accelerated Climate Modeling for Energy (ACME) project, which is currently addressing challenges involving multiple century-scale coupled high resolution (quarter-degree) climate simulations on DOE Leadership Class computers. ACME is anticipating production of over 5PB of data during the next 2 years of simulations, in order to investigate the drivers of water cycle changes, sea-level-rise, and carbon cycle evolution. The ACME workflow, from simulation to data transfer, storage, analysis and publication will be presented. Current and planned methods to accelerate the workflow, including implementing run-time diagnostics, and implementing server-side analysis to avoid moving large datasets will be presented.

  15. A workflow model to analyse pediatric emergency overcrowding.

    Science.gov (United States)

    Zgaya, Hayfa; Ajmi, Ines; Gammoudi, Lotfi; Hammadi, Slim; Martinot, Alain; Beuscart, Régis; Renard, Jean-Marie

    2014-01-01

    The greatest source of delay in patient flow is the waiting time from the health care request, and especially from the bed request, to exit from the Pediatric Emergency Department (PED) for hospital admission. It represents 70% of the time that these patients spend in the PED waiting rooms. Our objective in this study is to identify tension indicators and bottlenecks that contribute to overcrowding. Patient flow through the PED was mapped over a continuous 2-year period from January 2011 to December 2012. Our method uses real data collected from actual visits to the PED of the Regional University Hospital Center (CHRU) of Lille (France) in order to construct an accurate and complete representation of the PED processes. The result of this representation is a workflow model of the patient journey in the PED that represents the reality of the PED of the CHRU of Lille as faithfully as possible. This model allowed us to identify sources of delay in patient flow and aspects of PED activity that could be improved. It must be detailed enough to produce an analysis that identifies the dysfunctions of the PED and also proposes and estimates indicators of strain. Our study is part of the French National Research Agency project titled "Hospital: optimization, simulation and avoidance of strain" (ANR HOST).

  16. An Inter-enterprise Workflow Model for Supply Chain and B2B E-commerce

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The goals of B2B electronic commerce and supply chain management systems are to implement interoperability between independent enterprises, to smooth the information flow between them, and to deploy business processes over multiple enterprises. The inherent characteristics of workflow systems make them suitable for implementing cross-organization management. This paper first proposes an inter-enterprise workflow model, based on agreements, to support the construction of supply chain management systems and B2B electronic commerce; this model extends the standard workflow model. An architecture that supports the model is then introduced, with particular attention to the structure and implementation of the interfaces between enterprises.

  17. Towards a framework for standardized semantic workflow modeling and management in the surgical domain

    Directory of Open Access Journals (Sweden)

    Neumann Juliane

    2015-09-01

    An essential aspect of workflow management support in operating room environments is the description and visualization of the underlying processes and activities in a machine-readable format as Surgical Process Models (SPM). However, the process models often vary in terms of granularity, naming and representation of process elements, and their modeling structure. The aim of this paper is to present a new methodology for standardized semantic workflow modeling and a framework for semantic workflow execution and management in the surgical domain.

  18. Structuring research methods and data with the research object model: genomics workflows as a case study.

    Science.gov (United States)

    Hettne, Kristina M; Dharuri, Harish; Zhao, Jun; Wolstencroft, Katherine; Belhajjame, Khalid; Soiland-Reyes, Stian; Mina, Eleni; Thompson, Mark; Cruickshank, Don; Verdes-Montenegro, Lourdes; Garrido, Julian; de Roure, David; Corcho, Oscar; Klyne, Graham; van Schouwen, Reinout; 't Hoen, Peter A C; Bechhofer, Sean; Goble, Carole; Roos, Marco

    2014-01-01

    One of the main challenges for biomedical research lies in the computer-assisted integrative study of large and increasingly complex combinations of data in order to understand molecular mechanisms. The preservation of the materials and methods of such computational experiments with clear annotations is essential for understanding an experiment, and this is increasingly recognized in the bioinformatics community. Our assumption is that offering means of digital, structured aggregation and annotation of the objects of an experiment will provide the necessary metadata for a scientist to understand and recreate the results of an experiment. To support this we explored a model for the semantic description of a workflow-centric Research Object (RO), where an RO is defined as a resource that aggregates other resources, e.g., datasets, software, spreadsheets, text, etc. We applied this model to a case study where we analysed human metabolite variation by workflows. We present the application of the workflow-centric RO model for our bioinformatics case study. Three workflows were produced following recently defined best practices for workflow design. By modelling the experiment as an RO, we were able to automatically query the experiment and answer questions such as "which particular data was input to a particular workflow to test a particular hypothesis?" and "which particular conclusions were drawn from a particular workflow?". Applying a workflow-centric RO model to aggregate and annotate the resources used in a bioinformatics experiment allowed us to retrieve the conclusions of the experiment in the context of the driving hypothesis, the executed workflows and their input data. The RO model is an extendable reference model that can be used by other systems as well. The Research Object is available at http://www.myexperiment.org/packs/428. The Wf4Ever Research Object Model is available at http://wf4ever.github.io/ro.
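
    To make the kind of query described above concrete, the sketch below runs two SPARQL queries over a locally downloaded RO manifest with Python's rdflib. The file name is hypothetical, and the second query assumes the Wf4Ever wfprov vocabulary referenced above; exact property names may differ in a given manifest.

```python
from rdflib import Graph

# Hypothetical local copy of a Research Object manifest (RDF).
g = Graph()
g.parse("manifest.rdf")

# Everything the RO aggregates (datasets, workflows, annotations, ...):
# the RO model builds on the OAI-ORE aggregation vocabulary.
AGGREGATES = """
PREFIX ore: <http://www.openarchives.org/ore/terms/>
SELECT ?resource WHERE { ?ro ore:aggregates ?resource . }
"""

# Which data was input to which workflow run? (assumes the wfprov
# provenance vocabulary of the Wf4Ever RO model)
INPUTS = """
PREFIX wfprov: <http://purl.org/wf4ever/wfprov#>
SELECT ?run ?input WHERE { ?run wfprov:usedInput ?input . }
"""

for row in g.query(AGGREGATES):
    print("aggregates:", row.resource)
for row in g.query(INPUTS):
    print(f"{row.run} used input {row.input}")
```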

  19. Declarative Modelling and Safe Distribution of Healthcare Workflows

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao; Slaats, Tijs

    2012-01-01

    We present a formal technique for safe distribution of workflow processes described declaratively as Nested Condition Response (NCR) Graphs and apply the technique to a distributed healthcare workflow. Concretely, we provide a method to synthesize from a NCR Graph and any distribution of its events a set of local process graphs communicating by shared events, such that the distributed execution of the local processes is equivalent to executing the original process. The technique is based on our recent similar work on safe distribution of Dynamic Condition Response (DCR) Graphs applied to cross...

  20. Advanced computational workflow for the multi-scale modeling of the bone metabolic processes.

    Science.gov (United States)

    Dao, Tien Tuan

    2017-06-01

    Multi-scale modeling of the musculoskeletal system plays an essential role in the deep understanding of complex mechanisms underlying biological phenomena and processes such as bone metabolic processes. Current multi-scale models suffer from the isolation of sub-models at each anatomical scale. The objective of the present work was to develop a new fully integrated computational workflow for simulating bone metabolic processes at multiple scales. The organ-level model employs multi-body dynamics to estimate body boundary and loading conditions from body kinematics. The tissue-level model uses the finite element method to estimate tissue deformation and mechanical loading under body loading conditions. Finally, the cell-level model includes a bone remodeling mechanism through an agent-based simulation under tissue loading. A case study on the bone remodeling process in the human jaw was performed and presented. The developed multi-scale model of the human jaw was validated using literature-based data at each anatomical level. Simulation outcomes fall within the literature-based ranges of values for estimated muscle force, tissue loading and cell dynamics during the bone remodeling process. This study opens perspectives for accurately simulating bone metabolic processes using a fully integrated computational workflow, leading to a better understanding of musculoskeletal system function across multiple length scales as well as providing new informative data for clinical decision support and industrial applications.

  1. Inheritance Optimization of the Extended Case Transfer Model of Interorganization Workflow Management

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The extended case transfer architecture for inter-organization workflow management fits the needs of collaborative commerce. However, during the third step of the extended case transfer architecture, modifications of private workflows might cause fatal problems, such as deadlocks, livelocks and dead tasks. These problems can compromise the soundness and efficiency of the overall workflow. This paper presents a Petri-net-based approach to protect the inheritance of public workflows in private domains, and discusses an implementation of our collaborative commerce workflow model.
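
    As a rough illustration of the kind of check such an approach performs, the sketch below enumerates the reachable markings of a toy Petri net and flags markings where no transition is enabled before the final place is marked. The net, its place and transition names, and the completion criterion are invented for illustration, not taken from the paper.

```python
from collections import deque

# Toy net: (pre-set, post-set) per transition, each a place -> token count.
# A token is left behind in "p2" at completion; that is not a deadlock, but
# the same reachability analysis would expose such soundness violations too.
TRANSITIONS = {
    "t1": ({"start": 1}, {"p1": 1, "p2": 1}),
    "t2": ({"p1": 1}, {"end": 1}),
}

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return {p: n for p, n in m.items() if n > 0}  # drop empty places

def deadlocks(initial, final_place="end"):
    seen, queue, found = set(), deque([initial]), []
    while queue:
        m = queue.popleft()
        key = frozenset(m.items())
        if key in seen:
            continue
        seen.add(key)
        fireable = [t for t, (pre, _) in TRANSITIONS.items() if enabled(m, pre)]
        if not fireable and final_place not in m:
            found.append(m)  # stuck before reaching the final place
        for t in fireable:
            queue.append(fire(m, *TRANSITIONS[t]))
    return found

print(deadlocks({"start": 1}))  # [] for this net: no deadlock reachable
```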

  2. Populating a Library of Reusable H-BOMs: Assessment of a Feasible Image Based Modeling Workflow

    Science.gov (United States)

    Santagati, C.; Lo Turco, M.; D'Agostino, G.

    2017-08-01

    The paper shows the intermediate results of a research activity aimed at populating a library of reusable Historical Building Object Models (H-BOMs) by testing a fully digital workflow that takes advantage of Structure from Motion (SfM) models and is centered on the geometrical/stylistic/material analysis of architectural elements (portal, window, altar). The aim is to find common (invariant) and uncommon (variant) features in terms of the identification of architectural parts and their relationships, geometrical rules, dimensions and proportions, construction materials and units of measure, in order to model archetypal shapes from which all the style variations can be derived. In this regard, a set of 14th-16th century gothic portals of the Catalan-Aragonese architecture in the Etnean area of Eastern Sicily has been studied and used to assess the feasibility of the identified workflow. This approach tries to answer the increasing demand for guidelines and standards in the field of Cultural Heritage conservation to create and manage semantic-aware 3D models able to include all the information (both geometrical and alphanumerical) concerning historical buildings and able to be reused in several projects.

  3. A Grid Middleware Framework Support for a Workflow Model Based on Virtualized Resources

    Science.gov (United States)

    Lee, Jinbock; Lee, Sangkeon; Choi, Jaeyoung

    Nowadays, virtualization technologies are widely used to overcome the difficulty of managing Grid computing infrastructures. The virtual account and the virtual workspace are well suited to allocating Grid resources to specific users, but they lack the capability of interaction between portal services and virtualized resources that Grid portals require. The virtual application is suited to wrapping a simple application as a Grid portal service, but integrating several applications to compose a larger application service is difficult. In this paper, we present a Grid middleware framework which supports a workflow model based on virtualized resources. Meta Services in the framework expose workflows as portal services; a service call is converted to a different workflow according to its parameters, and the workflows generated by the Meta Services are scheduled on a virtual cluster configured by this framework. Because a virtual application service can be composed of workflows, and the service interface wraps the workflow, complex portal services composed of small applications can be effectively integrated into a Grid portal and scheduled on virtualized computing resources.

  4. Integration of 3D photogrammetric outcrop models in the reservoir modelling workflow

    Science.gov (United States)

    Deschamps, Remy; Joseph, Philippe; Lerat, Olivier; Schmitz, Julien; Doligez, Brigitte; Jardin, Anne

    2014-05-01

    3D technologies are now widely used in the geosciences to reconstruct outcrops in 3D. The technology used for 3D reconstruction is usually based on Lidar, which provides very precise models. Such datasets offer the possibility of building well-constrained outcrop analogue models for reservoir study purposes. Photogrammetry is an alternative methodology whose principles are based on determining the geometric properties of an object from photographic pictures taken from different angles. Outcrop data acquisition is easy, and this methodology allows the construction of 3D outcrop models with many advantages, such as light and fast acquisition, moderate processing time (depending on the size of the area of interest), and integration of field data and 3D outcrops into the reservoir modelling tools. Whatever the method, the advantages of digital outcrop models are numerous, as already highlighted by Hodgetts (2013), McCaffrey et al. (2005) and Pringle et al. (2006): collection of data from otherwise inaccessible areas, access to different angles of view, increase of the possible measurements, attribute analysis, fast rate of data collection, and of course training and communication. This paper proposes a workflow where 3D geocellular models are built by integrating all sources of information from outcrops (surface picking, sedimentological sections, structural and sedimentary dips…). The 3D geomodels that are reconstructed can be used at the reservoir scale in order to compare the outcrop information with subsurface models: the detailed facies models of the outcrops are transferred into petrophysical and acoustic models, which are used to test different scenarios of seismic and fluid flow modelling. The detailed 3D models are also used to test new techniques of static reservoir modelling, based either on geostatistical approaches or on deterministic (process-based) simulation techniques. A modelling workflow has been designed to model reservoir geometries and properties from

  5. Designing Collaborative Developmental Standards by Refactoring of the Earth Science Models, Libraries, Workflows and Frameworks.

    Science.gov (United States)

    Mirvis, E.; Iredell, M.

    2015-12-01

    The operational (OPS) NOAA National Centers for Environmental Prediction (NCEP) suite traditionally consists of a large set of multi-scale HPC models, workflows, scripts, tools and utilities, which depend heavily on a variety of additional components. Namely, this suite utilizes a unique collection of over 20 in-house developed shared libraries (NCEPLIBS), particular versions of third-party libraries (such as netcdf, HDF, ESMF, jasper, xml, etc.), and an HPC workflow tool within a dedicated (sometimes even vendor-customized) homogeneous HPC system environment. This domain- and site-specific setup, together with NCEP's product-driven large-scale real-time data operations, complicates NCEP collaborative development tremendously by reducing the chances of replicating this OPS environment anywhere else. The mission of NOAA/NCEP's Environmental Modeling Center (EMC) is to develop and improve numerical weather, climate, hydrological and ocean prediction through partnership with the research community. Recognizing these difficulties, EMC has lately taken an innovative approach to improve the flexibility of the HPC environment by building the elements of, and a foundation for, an NCEP OPS functionally equivalent environment (FEE), which can also be used to ease external interface constructs. Aiming to reduce the turnaround time of community code enhancements via the Research-to-Operations (R2O) cycle, EMC developed and deployed several project sub-set standards that have already paved the road to NCEP OPS implementation standards. In this topic we will discuss the EMC FEE for O2R requirements and approaches to collaborative standardization, including the NCEPLIBS FEE and model code version control paired with the models' derived customized HPC modules and FEE footprints. We will share NCEP/EMC experience and potential in the refactoring of EMC development processes and legacy codes, and in securing model source code quality standards by using a combination of the Eclipse IDE, integrated with the

  6. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)

    1994-11-14

    This document presents an architectural standard for the application of workflow automation technology. The standard includes a functional architecture, a process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and the results of a proof-of-concept prototype.

  7. Load Composition Model Workflow (BPA TIP-371 Deliverable 1A)

    Energy Technology Data Exchange (ETDEWEB)

    Chassin, David P.; Cezar, Gustavo V.; /SLAC

    2017-07-17

    This project is funded under Bonneville Power Administration (BPA) Strategic Partnership Project (SPP) 17-005 between BPA and SLAC National Accelerator Laboratory. The project is a BPA Technology Improvement Project (TIP) that builds on and validates the Composite Load Model developed by the Western Electricity Coordinating Council's (WECC) Load Modeling Task Force (LMTF). The composite load model is used by the WECC Modeling and Validation Work Group to study the stability and security of the western electricity interconnection. The work includes development of load composition data sets, collection of load disturbance data, and model development and validation. This work supports reliable and economic operation of the power system. This report was produced for Deliverable 1A of the BPA TIP-371 Project entitled "TIP 371: Advancing the Load Composition Model". The deliverable documents the proposed workflow for the Composite Load Model, which provides the basis for the instrumentation, data acquisition, analysis and data dissemination activities addressed by later phases of the project.

  8. Quantitative Regression Models for the Prediction of Chemical Properties by an Efficient Workflow.

    Science.gov (United States)

    Yin, Yongmin; Xu, Congying; Gu, Shikai; Li, Weihua; Liu, Guixia; Tang, Yun

    2015-10-01

    Rapid safety assessment is increasingly needed for the growing number of chemicals, both in chemical industries and by regulators around the world. Traditional experimental methods can no longer meet the current demand. With the development of information technology and the growth of experimental data, in silico modeling has become a practical and rapid alternative for the assessment of chemical properties, especially for the toxicity prediction of organic chemicals. In this study, a quantitative regression workflow was built with KNIME to predict chemical properties. With this regression workflow, quantitative values of chemical properties can be obtained, unlike binary- or multi-classification models, which can only give qualitative results. To illustrate the usage of the workflow, two predictive models were constructed based on datasets of Tetrahymena pyriformis toxicity and aqueous solubility. The qcv² and qtest² values of 5-fold cross-validation and external validation for both types of models were greater than 0.7, which implies that our models are robust and reliable, and that the workflow is very convenient and efficient in the prediction of various chemical properties. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
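
    The reported quality criterion can be reproduced outside KNIME. The sketch below computes a 5-fold cross-validated q² with scikit-learn, taking q² as 1 - PRESS/TSS over the cross-validated predictions; the synthetic data and the random-forest learner are stand-ins, since the paper's descriptors and model settings are not reproduced here.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

# Synthetic stand-in for a chemical-descriptor matrix.
X, y = make_regression(n_samples=200, n_features=20, noise=10.0, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
y_cv = cross_val_predict(model, X, y, cv=5)  # out-of-fold predictions

press = np.sum((y - y_cv) ** 2)      # predictive residual sum of squares
tss = np.sum((y - y.mean()) ** 2)    # total sum of squares
q2_cv = 1 - press / tss
print(f"5-fold q2 = {q2_cv:.3f}")
```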

  9. KNOWLEDGE MANAGEMENT DRIVEN BUSINESS PROCESS AND WORKFLOW MODELING WITHIN AN ORGANIZATION FOR CUSTOMER SATISFACTION

    Directory of Open Access Journals (Sweden)

    Atsa Etoundi, Roger

    2010-12-01

    To deal with the competitive pressure of the network economy, enterprises have to design their business processes and workflow systems around the satisfaction of customers. Mass production therefore has to be abandoned in favor of individual and customized products. Enterprises that fail to meet this challenge will be obliged to step down. Those which tackle this problem need to manage the knowledge of their various customers in order to derive a set of criteria for the delivery of services or the production of goods. These criteria should then be used to reengineer the business processes and workflows accordingly, for the satisfaction of each customer. In this paper, based on the knowledge management approach, we define an enterprise business process and workflow model for the delivery of services and the production of goods based on customer satisfaction.

  10. Optimize Internal Workflow Management

    Directory of Open Access Journals (Sweden)

    Lucia RUSU

    2010-01-01

    Workflow management has the role of creating and maintaining an efficient flow of information and tasks inside an organization. The major benefit of workflows is that they answer the growing needs of organizations. The external and internal processes associated with a business need to be carefully organized in order to provide a strong foundation for the daily work. This paper focuses on internal workflow within a company, attempts to provide some basic principles related to workflows, and presents a workflow solution for modeling and deployment using Visual Studio and SharePoint Server.

  11. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model's behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models into temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach goes some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
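
    The idea of treating patterns as logical primitives can be illustrated in a few lines: each workflow pattern becomes a function that emits a temporal-logic formula, and a model is translated by composing these functions. The pattern set, activity names and LTL rendering below are invented for illustration and do not reproduce the paper's generation algorithm.

```python
# Hypothetical pattern-to-formula mapping: a BPMN sequence flow A -> B is
# rendered as "whenever A occurs, B eventually occurs".
def sequence(a, b):
    return f"G({a} -> F({b}))"

# Exclusive choice: after the gateway, exactly one branch eventually occurs.
def exclusive_choice(gateway, branches):
    one_of = " | ".join(f"F({b})" for b in branches)
    at_most_one = " & ".join(
        f"!(F({x}) & F({y}))"
        for i, x in enumerate(branches) for y in branches[i + 1:]
    )
    return f"G({gateway} -> (({one_of}) & {at_most_one}))"

# A logical specification is then just the conjunction of pattern formulas.
spec = [
    sequence("ReceiveOrder", "CheckStock"),
    exclusive_choice("CheckStock", ["Ship", "Reject"]),
]
print("\n".join(spec))
```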

  12. Petri Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how to enforce access control in workflow systems. We give a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets. It first gives the definition and description of the workflow, and then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.

  13. Enabling Structured Exploration of Workflow Performance Variability in Extreme-Scale Environments

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin; Stephan, Eric G.; Raju, Bibi; Altintas, Ilkay; Elsethagen, Todd O.; Krishnamoorthy, Sriram

    2015-11-15

    Workflows are taking an increasingly important role in orchestrating complex scientific processes in extreme-scale and highly heterogeneous environments. However, to date we cannot reliably predict, understand, and optimize workflow performance. Sources of performance variability, and in particular the interdependencies of workflow design, execution environment and system architecture, are not well understood. While there is a rich portfolio of tools for performance analysis, modeling and prediction for single applications in homogeneous computing environments, these are not applicable to workflows, due to the number and heterogeneity of the involved workflow and system components and their strong interdependencies. In this paper, we investigate workflow performance goals and identify factors that could have a relevant impact. Based on our analysis, we propose a new workflow performance provenance ontology, the Open Provenance Model-based WorkFlow Performance Provenance, or OPM-WFPP, that will enable the empirical study of workflow performance characteristics and variability, including complex source attribution.

  14. A priori modeling of chemical reactions on computational grid platforms: Workflows and data models

    Energy Technology Data Exchange (ETDEWEB)

    Rampino, S., E-mail: ser_ram@dyn.unipg.it [Dipartimento di Chimica, Universita degli Studi di Perugia, Via Elce di Sotto 8, 06123 Perugia (Italy); Monari, A. [SRSMC-Equipe de Chimie et Biochimie Theoriques, Nancy-Universite et CNRS, Bp70239 Boulevard des Aiguilettes, 54506 Vandoeuvre-les-Nancy Cedex (France); Rossi, E. [CINECA, Via Manganelli 6/3, 40033 Casalecchio di Reno, Bologna (Italy); Evangelisti, S. [Laboratoire de Chimie et de Physique Quantiques, Universite Paul Sabatier Toulouse III et CNRS, 118 Route de Narbonne, 31062 Toulouse Cedex 4 (France); Lagana, A. [Dipartimento di Chimica, Universita degli Studi di Perugia, Via Elce di Sotto 8, 06123 Perugia (Italy)

    2012-04-04

    Graphical abstract: The quantum framework of the Grid Empowered Molecular Simulator GEMS assembled on the European Grid allows the ab initio evaluation of the dynamics of small systems starting from the calculation of the electronic properties. Highlights: (1) The grid-based GEMS simulator accurately models small chemical systems. (2) The Q5Cost and D5Cost file formats provide interoperability in the workflow. (3) Benchmark runs on H + H₂ highlight the Grid empowering. (4) Calculated k(T)'s for O + O₂ and N + N₂ fall within the error bars of the experiment. Abstract: The quantum framework of the Grid Empowered Molecular Simulator GEMS has been assembled on the segment of the European Grid devoted to the Computational Chemistry Virtual Organization. The related grid-based workflow allows the ab initio evaluation of the dynamics of small systems starting from the calculation of the electronic properties. Interoperability between computational codes across the different stages of the workflow was made possible by the use of the common data formats Q5Cost and D5Cost. Illustrative benchmark runs have been performed on the prototype H + H₂, N + N₂ and O + O₂ gas phase exchange reactions, and thermal rate coefficients have been calculated for the last two. Results are discussed in terms of the modeling of the interaction, and the advantages of using the Grid are highlighted.

  15. Precise Quantitative Analysis of Probabilistic Business Process Model and Notation Workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    We present a framework for modeling and analysis of real-world business workflows. We present a formalized core subset of the Business Process Model and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations. We present an algorithm for the translation of such models into Markov decision processes (MDP) expressed in the syntax of the PRISM model checker. This enables precise quantitative analysis of business processes for the following properties: transient and steady-state probabilities; the timing, occurrence and ordering of events; reward-based properties; and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in the accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover...
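
    As an illustration of the translation target, the sketch below emits a PRISM-style mdp module for a single treatment step with a probabilistic failure branch and a reward annotation. The workflow, state encoding and numbers are invented for illustration and are not the paper's medical example or its translation algorithm.

```python
# Serialize a tiny probabilistic workflow into PRISM's mdp syntax.
def prism_mdp(p_fail):
    return f"""mdp

module workflow
  s : [0..3] init 0; // 0=start 1=treat 2=done 3=failed
  [start] s=0 -> (s'=1);
  [treat] s=1 -> {1 - p_fail}:(s'=2) + {p_fail}:(s'=3);
  [retry] s=3 -> (s'=1);
endmodule

rewards "cost"
  [treat] true : 5; // each treatment attempt consumes drug stock
endrewards
"""

print(prism_mdp(0.1))
```

    A model checker could then query, e.g., the expected accumulated "cost" reward until the done state, which is the style of analysis used above for drug-stock provisioning.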

  16. An Approach to Design Reusable Workflow Engine

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Developers still need to design workflow systems according to users' specific needs, even though the Workflow Management Coalition standardized five kinds of abstract interfaces in the workflow reference model. Specific business process characteristics are still supported by specific workflow systems. A set of common workflow engine functionalities is abstracted from business components, so that the reusability of business components is extended into the workflow engine, and a composition method is proposed. The needs of different business requirements and characteristics are met by reusing the workflow engine.

  17. A Survey of Workflow Modeling Approaches and Model Verification

    Institute of Scientific and Technical Information of China (English)

    杨东; 王英林; 张申生; 傅谦

    2003-01-01

    Workflow technology is widely used in business process modeling, software process modeling, as well as enterprise information integration. At present, there exists a variety of workflow modeling approaches, which differ in ease of modeling, expressiveness and formalism. In this paper, the modeling approaches most used in research projects and workflow products are compared, and the verification of workflow models is also dealt with. We argue that an ideal workflow modeling approach is a hybrid one, i.e. the integration of the above approaches.

  18. Efficient Workflows for Curation of Heterogeneous Data Supporting Modeling of U-Nb Alloy Aging

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Logan Timothy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hackenberg, Robert Errol [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-31

    These are slides from a presentation summarizing a graduate research associate's summer project. The following topics are covered in these slides: data challenges in materials, aging in U-Nb Alloys, Building an Aging Model, Different Phase Trans. in U-Nb, the Challenge, Storing Materials Data, Example Data Source, Organizing Data: What is a Schema?, What does a "XML Schema" look like?, Our Data Schema: Nice and Simple, Storing Data: Materials Data Curation System (MDCS), Problem with MDCS: Slow Data Entry, Getting Literature into MDCS, Staging Data in Excel Document, Final Result: MDCS Records, Analyzing Image Data, Process for Making TTT Diagram, Bottleneck Number 1: Image Analysis, Fitting a TTP Boundary, Fitting a TTP Curve: Comparable Results, How Does it Compare to Our Data?, Image Analysis Workflow, Curating Hardness Records, Hardness Data: Two Key Decisions, Before Peak Age? - Automation, Interactive Viz, Which Transformation?, Microstructure-Informed Model, Tracking the Entire Process, General Problem with Property Models, Pinyon: Toolkit for Managing Model Creation, Tracking Individual Decisions, Jupyter: Docs and Code in One File, Hardness Analysis Workflow, Workflow for Aging Models, and conclusions.

  19. Mining workflow processes from distributed workflow enactment event logs

    Directory of Open Access Journals (Sweden)

    Kwanghoon Pio Kim

    2012-12-01

    Workflow management systems help to execute, monitor and manage work process flow and execution. These systems, as they are executing, keep a record of who does what and when (e.g. a log of events). The activity of using computer software to examine these records, and deriving various structural data results, is called workflow mining. The workflow mining activity, in general, needs to encompass behavioral (process/control-flow), social, informational (data-flow), and organizational perspectives, as well as other perspectives, because workflow systems are "people systems" that must be designed, deployed, and understood within their social and organizational contexts. This paper particularly focuses on mining the behavioral aspect of workflows from XML-based workflow enactment event logs, which are vertically (semantic-driven distribution) or horizontally (syntactic-driven distribution) distributed over the networked workflow enactment components. That is, this paper proposes distributed workflow mining approaches that are able to rediscover ICN-based structured workflow process models through incrementally amalgamating a series of vertically or horizontally fragmented temporal workcases. Each of the approaches consists of a temporal fragment discovery algorithm, which is able to discover a set of temporal fragment models from the fragmented workflow enactment event logs, and a workflow process mining algorithm, which rediscovers a structured workflow process model from the discovered temporal fragment models. Here, the temporal fragment model represents the concrete model of the XML-based distributed workflow fragment events log.
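
    The control-flow perspective of such mining can be shown in miniature: the sketch below derives a directly-follows relation from a toy event log, the kind of building block from which structured process models are rediscovered. The log and activity names are invented, and the paper's fragment-discovery and amalgamation algorithms are not reproduced.

```python
from collections import defaultdict

# Toy event log: one ordered list of activities per workcase. The paper
# mines XML logs fragmented across enactment components, not shown here.
log = [
    ["register", "review", "approve", "archive"],
    ["register", "review", "reject", "archive"],
]

# Directly-follows relation: b can immediately follow a in some workcase.
follows = defaultdict(set)
for trace in log:
    for a, b in zip(trace, trace[1:]):
        follows[a].add(b)

for a in sorted(follows):
    print(a, "->", ", ".join(sorted(follows[a])))
```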

  20. An enhancement of the role-based access control model to facilitate information access management in context of team collaboration and workflow.

    Science.gov (United States)

    Le, Xuan Hung; Doll, Terry; Barbosu, Monica; Luque, Amneris; Wang, Dongwen

    2012-12-01

    Although information access control models have been developed and applied to various applications, few previous works have addressed the issue of managing information access in the combined context of team collaboration and workflow. To facilitate this requirement, we have enhanced the Role-Based Access Control (RBAC) model by formulating universal constraints, defining bridging entities and contributing attributes, extending access permissions to include workflow contexts, synthesizing a role-based access delegation model targeting specific objects, and developing domain ontologies as instantiations of the general model for particular applications. We have successfully applied this model to the New York State HIV Clinical Education Initiative (CEI) project to address the specific needs of information management in collaborative processes. An initial evaluation showed that this model achieved a high level of agreement with an existing system when applied to 4576 cases (kappa=0.801). Compared to a reference standard, the sensitivity and specificity of the enhanced RBAC model were at the level of 97-100%. These results indicate that the enhanced RBAC model can be effectively used for information access management in the context of team collaboration and workflow to coordinate clinical education programs. Future research is required to incrementally develop additional types of universal constraints, to further investigate how the workflow context and access delegation can be enriched to support the various needs of information access management in collaborative processes, and to examine the generalizability of the enhanced RBAC model for other applications in clinical education, biomedical research, and patient care.
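
    The central idea, extending a role-permission check with a workflow-context condition, can be sketched in a few lines. The roles, actions and step names below are invented, and the paper's universal constraints, bridging entities and delegation model are not reproduced.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    role: str
    action: str
    workflow_step: str  # the context that plain RBAC ignores

# (role, action) -> workflow steps in which the permission is valid.
# Entries are hypothetical examples, not the CEI project's policy.
ROLE_PERMISSIONS = {
    ("educator", "read_case"): {"assigned_site_visit"},
    ("coordinator", "edit_schedule"): {"planning", "assigned_site_visit"},
}

def is_permitted(req: AccessRequest) -> bool:
    """Grant access only if the role holds the permission AND the
    current workflow step is one in which that permission applies."""
    valid_steps = ROLE_PERMISSIONS.get((req.role, req.action), set())
    return req.workflow_step in valid_steps

req = AccessRequest("alice", "educator", "read_case", "assigned_site_visit")
print(is_permitted(req))  # True: right role, right workflow context
```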

  1. Including Magnetostriction in Micromagnetic Models

    Science.gov (United States)

    Conbhuí, Pádraig Ó.; Williams, Wyn; Fabian, Karl; Nagy, Lesleis

    2016-04-01

    The magnetic anomalies that identify crustal spreading are predominantly recorded by basalts formed at the mid-ocean ridges, whose magnetic signals are dominated by iron-titanium oxides (Fe3-xTixO4), so-called "titanomagnetites", of which the Fe2.4Ti0.6O4 (TM60) phase is the most common. With sufficient quantities of titanium present, these minerals exhibit strong magnetostriction. To date, models of these grains in the pseudo-single-domain (PSD) range have failed to accurately account for this effect. In particular, a popular analytic treatment provided by Kittel (1949), which describes the magnetostrictive energy as an effective increase of the anisotropy constant, can produce unphysical strains for non-uniform magnetizations. I will present a rigorous approach, based on work by Brown (1966) and by Kroner (1958), for including magnetostriction in micromagnetic codes, which is suitable for modelling hysteresis loops and finding remanent states in the PSD regime. Preliminary results suggest that the more rigorously defined micromagnetic models exhibit higher coercivities and extended single-domain ranges when compared to more simplistic approaches.
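
    For reference, the textbook form of the Kittel-style approximation criticized here treats an isotropic saturation magnetostriction under a uniform uniaxial stress as an additional uniaxial anisotropy term; this is a standard result, though sign and angle conventions vary between texts:

```latex
% Magnetoelastic energy density for isotropic magnetostriction \lambda_s
% under uniform uniaxial stress \sigma, at angle \theta to the stress axis,
% and the resulting effective anisotropy constant:
E_{\sigma} = \tfrac{3}{2}\,\lambda_{s}\,\sigma\,\sin^{2}\theta,
\qquad
K_{\mathrm{eff}} = K_{1} + \tfrac{3}{2}\,\lambda_{s}\,\sigma
```

    A single effective constant of this form cannot represent a position-dependent strain field, which is why it breaks down for the non-uniform magnetizations mentioned above.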

  2. DOMstudio: an integrated workflow for Digital Outcrop Model reconstruction and interpretation

    Science.gov (United States)

    Bistacchi, Andrea

    2015-04-01

    Different remote sensing technologies, including photogrammetry and LIDAR, allow the collection of 3D datasets that can be used to create 3D digital representations of outcrop surfaces, called Digital Outcrop Models (DOM) or sometimes Virtual Outcrop Models (VOM). Irrespective of the remote sensing technique used, DOMs can be represented either by photorealistic point clouds (PC-DOM) or textured surfaces (TS-DOM). The former are datasets composed of millions of points with XYZ coordinates and RGB colour, whilst the latter are triangulated surfaces onto which images of the outcrop have been mapped or "textured" (applying a technology originally developed for movies and videogames). Here we present a workflow that allows both kinds of dataset, PC-DOMs and TS-DOMs, to be exploited in an integrated, efficient, yet flexible way. The workflow is composed of three main steps: (1) data collection and processing, (2) interpretation, and (3) modelling. Data collection can be performed with photogrammetry, LIDAR, or other techniques. The quality of photogrammetric datasets obtained with Structure from Motion (SfM) techniques has shown tremendous improvement over the past few years, and this is becoming the most effective way to collect DOM datasets. The main advantages of photogrammetry over LIDAR are the very simple and lightweight field equipment (a digital camera) and the arbitrary spatial resolution, which can be increased simply by getting closer to the outcrop or by using a different lens. It must be noted that concerns about the precision of close-range photogrammetric surveys, which were justified in the past, are no longer a problem if modern software and acquisition schemes are applied. In any case, LIDAR is a well-tested technology and it is still very common. Irrespective of the data collection technology, the output will be a photorealistic point cloud and a collection of oriented photos, plus additional imagery in special projects (e.g. infrared images

  3. Parallel workflows for data-driven structural equation modeling in functional neuroimaging

    Directory of Open Access Journals (Sweden)

    Sarah Kenny

    2009-10-01

    We present a computational framework suitable for a data-driven approach to structural equation modeling (SEM) and describe several workflows for modeling functional magnetic resonance imaging (fMRI) data within this framework. The Computational Neuroscience Applications Research Infrastructure (CNARI) employs a high-level scripting language called Swift, which is capable of spawning hundreds of thousands of simultaneous R processes (R Core Development Team, 2008), consisting of self-contained structural equation models, on a high-performance computing (HPC) system. These self-contained R processing jobs are data objects generated by OpenMx, a plug-in for R, which can generate a single model object containing the matrices and algebraic information necessary to estimate the parameters of the model. With such an infrastructure in place, a structural modeler may begin to investigate exhaustive searches of the model space. Specific applications of the infrastructure, statistics related to model fit, and limitations are discussed in relation to exhaustive SEM. In particular, we discuss how workflow management techniques can help to solve large computational problems in neuroimaging.
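
    The exhaustive-search idea can be miniaturized without Swift or OpenMx. Below, candidate model specifications (here, predictor subsets fitted by ordinary least squares as a crude stand-in for SEM fitting) are distributed over local worker processes with Python's multiprocessing; all data and names are synthetic.

```python
import itertools
import numpy as np
from multiprocessing import Pool

# Synthetic data: y depends on predictors 0 and 2 only.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
y = X @ np.array([1.0, 0.0, 2.0, 0.0]) + rng.normal(scale=0.5, size=100)

def fit(cols):
    """Fit one candidate 'model' (a predictor subset) and score it."""
    Xs = X[:, cols]
    beta, res, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = float(res[0]) if res.size else float(np.sum((y - Xs @ beta) ** 2))
    return cols, rss

if __name__ == "__main__":
    candidates = [c for r in (1, 2, 3)
                  for c in itertools.combinations(range(4), r)]
    with Pool() as pool:                 # one worker process per core
        results = pool.map(fit, candidates)
    print(min(results, key=lambda r: r[1]))  # best-scoring subset
```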

  4. The Open Physiology workflow: modeling processes over physiology circuitboards of interoperable tissue units

    Science.gov (United States)

    de Bono, Bernard; Safaei, Soroush; Grenon, Pierre; Nickerson, David P.; Alexander, Samuel; Helvensteijn, Michiel; Kok, Joost N.; Kokash, Natallia; Wu, Alan; Yu, Tommy; Hunter, Peter; Baldock, Richard A.

    2015-01-01

    A key challenge for the physiology modeling community is to enable the searching, objective comparison and, ultimately, re-use of models and associated data that are interoperable in terms of their physiological meaning. In this work, we outline the development of a workflow to modularize the simulation of tissue-level processes in physiology. In particular, we show how, via this approach, we can systematically extract, parcellate and annotate tissue histology data to represent component units of tissue function. These functional units are semantically interoperable, in terms of their physiological meaning. In particular, they are interoperable with respect to [i] each other and with respect to [ii] a circuitboard representation of long-range advective routes of fluid flow over which to model long-range molecular exchange between these units. We exemplify this approach through the combination of models for physiology-based pharmacokinetics and pharmacodynamics to quantitatively depict biological mechanisms across multiple scales. Links to the data, models and software components that constitute this workflow are found at http://open-physiology.org/. PMID:25759670

  5. Soil-atmosphere and vadose zone water fluxes at the Wagna - lysimeter: Workflow, models, and results

    Science.gov (United States)

    Fank, Johann

    2014-05-01

    A precise knowledge of the water fluxes between the soil-plant system and the atmosphere is of great importance for understanding and modeling water, solute and energy transfer in the soil-plant-atmosphere system. Weighing lysimeters are precise tools that allow the determination of the components of the hydrological cycle at very short time intervals. Lysimeters with controlled suction at the lower boundary allow estimation of capillary rise and deep water percolation on short time scales. Evapotranspiration, rainfall, and irrigation can be computed from weight changes. In the last decades, the resolution and precision of the weighing systems have been substantially improved, so that modern lysimeters, resting on weighing cells, can reach resolutions of up to 0.01 mm. Nevertheless, many external effects (e.g. from maintenance or surface treatment) and small mechanical disturbances (e.g. caused by wind) become visible in the data. Seepage mass data are affected by water sampling and by the emptying process of the seepage water container. Increasing parts of the corrected seepage mass data indicate deep water percolation, while decreasing parts in dry weather periods can be interpreted as capillary rise. In the evaluation of corrected lysimeter mass data, every increase in system weight (lysimeter mass + cumulative seepage mass) might be interpreted as rainfall or irrigation, whereas every decrease in system weight is interpreted as evapotranspiration. To apply this concept correctly, the noise in both data sets has to be separated from the signals using a filtering routine (e.g. Peters et al., 2013) which is appropriate for any event, including events with low disturbances as well as strong wind and heavy precipitation in small time intervals. Based on the data set from the "Wagna" lysimeter in Austria, with a high scale resolution (~0.015 mm) and very low noise due to low wind velocities, for the year 2010 a lysimeter data preparation workflow will be executed: (a) correction of the
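
    The evaluation concept described here, smoothing the system weight and reading increases as precipitation/irrigation and decreases as evapotranspiration, can be sketched as follows. This moving-average-plus-threshold filter is a deliberately crude stand-in for the filtering routine of Peters et al. (2013); the window and threshold values are invented.

```python
import numpy as np

def partition_fluxes(system_weight_mm, window=11, threshold=0.02):
    """Split changes of smoothed system weight (lysimeter + cumulative
    seepage, in mm water equivalent) into precipitation and ET sums."""
    kernel = np.ones(window) / window
    smooth = np.convolve(system_weight_mm, kernel, mode="same")
    delta = np.diff(smooth)
    delta[np.abs(delta) < threshold] = 0.0   # discard sub-threshold noise
    precipitation = np.where(delta > 0, delta, 0.0).sum()
    evapotranspiration = -np.where(delta < 0, delta, 0.0).sum()
    return precipitation, evapotranspiration

# Synthetic noisy weight series standing in for real lysimeter data.
weights = np.cumsum(np.random.default_rng(1).normal(0.0, 0.01, 500))
print(partition_fluxes(weights))
```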

  6. Improved workflow modelling using role activity diagram-based modelling with application to a radiology service case study.

    Science.gov (United States)

    Shukla, Nagesh; Keast, John E; Ceglarek, Darek

    2014-10-01

    The modelling of complex workflows is an important problem-solving technique within healthcare settings. However, currently most workflow models use a simplified flow chart of patient flow obtained from on-site observations, group-based debates and brainstorming sessions, together with historic patient data. This paper presents a systematic and semi-automatic methodology for knowledge acquisition with detailed process representation, using sequential interviews of people in the key roles involved in the service delivery process. The proposed methodology allows the modelling of roles, interactions, actions, and decisions involved in the service delivery process. This approach is based on protocol generation and analysis techniques such as: (i) initial protocol generation based on qualitative interviews of radiology staff, (ii) extraction of key features of the service delivery process, (iii) discovering the relationships among the key features extracted, and (iv) a graphical representation of the final structured model of the service delivery process. The methodology is demonstrated through a case study of a magnetic resonance (MR) scanning service-delivery process in the radiology department of a large hospital. A set of guidelines is also presented in this paper to visually analyze the resulting process model for identifying process vulnerabilities. A comparative analysis of different workflow models is also conducted. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  7. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  8. Standards for business analytics and departmental workflow.

    Science.gov (United States)

    Erickson, Bradley J; Meenan, Christopher; Langer, Steve

    2013-02-01

    Efficient workflow is essential for a successful business. However, there is relatively little literature on analytical tools and standards for defining workflow and measuring workflow efficiency. Here, we describe an effort to define a workflow lexicon for medical imaging departments, including the rationale, the process, and the resulting lexicon.

  9. A comprehensive workflow for general-purpose neural modeling with highly configurable neuromorphic hardware systems.

    Science.gov (United States)

    Brüderle, Daniel; Petrovici, Mihai A; Vogginger, Bernhard; Ehrlich, Matthias; Pfeil, Thomas; Millner, Sebastian; Grübl, Andreas; Wendt, Karsten; Müller, Eric; Schwartz, Marc-Olivier; de Oliveira, Dan Husmann; Jeltsch, Sebastian; Fieres, Johannes; Schilling, Moritz; Müller, Paul; Breitwieser, Oliver; Petkov, Venelin; Muller, Lyle; Davison, Andrew P; Krishnamurthy, Pradeep; Kremkow, Jens; Lundqvist, Mikael; Muller, Eilif; Partzsch, Johannes; Scholze, Stefan; Zühl, Lukas; Mayr, Christian; Destexhe, Alain; Diesmann, Markus; Potjans, Tobias C; Lansner, Anders; Schüffny, René; Schemmel, Johannes; Meier, Karlheinz

    2011-05-01

    In this article, we present a methodological framework that meets novel requirements emerging from upcoming types of accelerated and highly configurable neuromorphic hardware systems. We describe in detail a device with 45 million programmable and dynamic synapses that is currently under development, and we sketch the conceptual challenges that arise from taking this platform into operation. More specifically, we aim at the establishment of this neuromorphic system as a flexible and neuroscientifically valuable modeling tool that can be used by non-hardware experts. We consider various functional aspects to be crucial for this purpose, and we introduce a consistent workflow with detailed descriptions of all involved modules that implement the suggested steps: The integration of the hardware interface into the simulator-independent model description language PyNN; a fully automated translation between the PyNN domain and appropriate hardware configurations; an executable specification of the future neuromorphic system that can be seamlessly integrated into this biology-to-hardware mapping process as a test bench for all software layers and possible hardware design modifications; an evaluation scheme that deploys models from a dedicated benchmark library, compares the results generated by virtual or prototype hardware devices with reference software simulations and analyzes the differences. The integration of these components into one hardware-software workflow provides an ecosystem for ongoing preparative studies that support the hardware design process and represents the basis for the maturity of the model-to-hardware mapping software. The functionality and flexibility of the latter is proven with a variety of experimental results.
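
    The simulator-independent description layer mentioned above is PyNN; a minimal network in that style looks like the sketch below. It assumes the NEST software backend is installed, and the parameter values are invented; the point of the workflow is that, in principle, the same script can target a hardware backend by changing only the import.

```python
import pyNN.nest as sim  # swap for a hardware backend module to retarget

sim.setup(timestep=0.1)  # ms

# Poisson background drive onto a population of conductance-based neurons.
noise = sim.Population(20, sim.SpikeSourcePoisson(rate=15.0))
neurons = sim.Population(100, sim.IF_cond_exp())
sim.Projection(noise, neurons, sim.AllToAllConnector(),
               sim.StaticSynapse(weight=0.002, delay=0.5))

neurons.record("spikes")
sim.run(500.0)  # ms of biological time

spiketrains = neurons.get_data().segments[0].spiketrains
print(sum(len(st) for st in spiketrains), "spikes recorded")
sim.end()
```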

  10. A workflow for 3D model building in fold-thrust belts

    Science.gov (United States)

    Watkins, Hannah; Bond, Clare; Butler, Rob

    2016-04-01

    3D geological models can be used in fold-thrust belts for many purposes such as analysing geometric variation in folds, kinematic modelling to restore fold surfaces, generating strain distribution maps and predicting fracture network distribution. We present a workflow for 3D model building using outcrop bedding data, geological maps, Digital Terrain Models (DTM's), air photos and field photographs. We discuss the challenges of software limitations for 3D kinematic restoration and forward modelling in fold-thrust belt settings. We then discuss the sensitivity of model building approaches to the application of 3D geological models in fold-thrust belts for further analysis e.g. changes in along strike fold geometry, restoration using kinematic and geomechanical modelling, strain prediction and Discrete Fracture Network (DFN) modelling. To create 3D models geological maps and bedding data are digitised using Move software; digitised maps and data are then draped onto DTM's. A series of closely spaced cross section lines are selected; the orientation of these is calculated by determining the average orientation of bedding dip direction. Fault and horizon line intersections, along with bedding data from within a narrow margin of the section lines are projected onto each cross section. Field photographs and sketches are integrated into the cross sections to determine thrust angles at the surface. Horizon lines are then constructed using bedding data. Displacement profiles for thrusts are plotted to ensure thrust displacements are valid with respect to neighbouring cross section interpretations; any discrepancies are alleviated by making minor adjustments to horizon and thrust lines, while ensuring that resultant cross section geometries still adhere to bedding data and other field observations. Once the cross sections have been finalised, 3D surfaces are created using the horizon and thrust line interpretations on each cross section. The simple curvature of 3D surfaces

  11. Estimating a patient surface model for optimizing the medical scanning workflow.

    Science.gov (United States)

    Singh, Vivek; Chang, Yao-Jen; Ma, Kai; Wels, Michael; Soza, Grzegorz; Chen, Terrence

    2014-01-01

    In this paper, we present the idea of equipping a tomographic medical scanner with a range imaging device (e.g. a 3D camera) to improve the current scanning workflow. A novel technical approach is proposed to robustly estimate patient surface geometry by a single snapshot from the camera. Leveraging the information of the patient surface geometry can provide significant clinical benefits, including automation of the scan, motion compensation for better image quality, sanity check of patient movement, augmented reality for guidance, patient specific dose optimization, and more. Our approach overcomes the technical difficulties resulting from suboptimal camera placement due to practical considerations. Experimental results on more than 30 patients from a real CT scanner demonstrate the robustness of our approach.

  12. a Workflow for the Application of Sensitivity Analysis to Earth System Models

    Science.gov (United States)

    Pianosi, F.; Wagener, T.; Rougier, J.; Freer, J. E.; Hall, J.

    2013-12-01

    Predictions of any earth system model are affected by unavoidable and potentially large uncertainty. When models are used to support risk management of natural hazards, such uncertainties can undermine the transparency and defensibility of the risk assessment. When models are applied to understand dominant controls or other aspects of the system under study, uncertainties will reduce our ability to choose between competing hypotheses. Sensitivity Analysis (SA) provides quantitative information about the contribution of the different input factors (e.g. parameters, boundary conditions or forcing data) to such uncertainty. SA application thus provides insights into the model behavior and potential for model simplification, indicates where further data collection and research is needed or would be beneficial, and enhances the credibility of our modelling results. The value of such analysis has motivated an increasing research effort in the development, application and comparison of SA techniques. Still, comprehensive understanding to guide choices between available SA methods, and practical guidelines for their application in the context of earth system models, are still insufficient. In this contribution, we aim at filling this gap by (i) providing a map of the existing SA techniques and their appropriateness in different contexts of earth system modeling; (ii) developing a workflow for the choice and application of SA techniques to environmental models; (iii) presenting a suite of visualization tools that can support the assessment and communication of SA results; (iv) defining challenges and opportunities for future research.
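
    A minimal variance-based SA of the kind such a workflow would select can be run with the SALib package, shown below on a cheap stand-in function; for a real earth system model, the sample matrix would drive expensive simulation runs and Y would be collected from their outputs. The parameter names and bounds are invented.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Hypothetical three-parameter "model" standing in for an expensive one.
problem = {
    "num_vars": 3,
    "names": ["x1", "x2", "x3"],
    "bounds": [[0, 1], [0, 1], [0, 1]],
}

X = saltelli.sample(problem, 1024)          # Saltelli sampling scheme
Y = X[:, 0] + 2.0 * X[:, 1] ** 2 + 0.1 * X[:, 2]  # cheap stand-in model
Si = sobol.analyze(problem, Y)              # Sobol sensitivity indices

# First-order indices: share of output variance attributable to each input.
print(dict(zip(problem["names"], np.round(Si["S1"], 2))))
```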

  13. Operational Semantic of Workflow Engine and the Realizing Technique

    Institute of Scientific and Technical Information of China (English)

    FU Yan-ning; LIU Lei; ZHAO Dong-fan; JIN Long-fei

    2005-01-01

    At present, there is no formalized description of the executing procedure of workflow models. This paper describes the procedure by which workflow models execute in a workflow engine using operational semantics. The formalized description of process instances and activity instances leads to a very clear structure for the workflow engine, facilitates cooperation between heterogeneous workflow engines, and guides the implementation of workflow engine functions. The workflow engine software has been completed by means of this formalized description.

  14. Towards a High Reliable Enforcement of Safety Regulations - A Workflow Meta Data Model and Probabilistic Failure Management Approach

    Directory of Open Access Journals (Sweden)

    Heiko Henning Thimm

    2016-10-01

    Today's companies are able to automate the enforcement of Environmental, Health and Safety (EH&S) duties through the use of workflow management technology. This approach requires specifying activities that are combined into workflow models for EH&S enforcement duties. In order to meet given safety regulations these activities are to be completed correctly and within given deadlines. Otherwise, activity failures emerge which may lead to breaches of safety regulations. A novel domain-specific workflow metadata model is proposed. The model enables a system to detect and predict activity failures through the use of data about the company, failure statistics, and activity proxies. Since the detection and prediction methods are based on the evaluation of constraints specified on EH&S regulations, a system approach is proposed that builds on the integration of a Workflow Management System (WMS) with an EH&S Compliance Information System. The main principles of failure detection and prediction are described. For EH&S managers the system shall provide insights into the current failure situation. This can help to prevent and mitigate critical situations such as safety enforcement measures that are behind their deadlines. As a result, a more reliable enforcement of safety regulations can be achieved.
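
    To make the deadline-constraint idea concrete, here is a minimal sketch of the kind of check such a system might run; the activity fields, thresholds and example duties are hypothetical, not taken from the paper's metadata model.

```python
from datetime import datetime, timedelta

# Hypothetical EH&S activity records as a WMS might expose them.
activities = [
    {"name": "annual noise inspection", "deadline": datetime(2016, 3, 1),
     "finished": None, "expected_duration": timedelta(days=10)},
    {"name": "update hazardous-substance register", "deadline": datetime(2016, 6, 1),
     "finished": datetime(2016, 5, 20), "expected_duration": timedelta(days=3)},
]

def classify(activity, now):
    """Detect failures (missed deadlines) and predict imminent ones
    (remaining time shorter than the expected duration)."""
    if activity["finished"] is not None:
        return "completed" if activity["finished"] <= activity["deadline"] else "failed"
    remaining = activity["deadline"] - now
    if remaining < timedelta(0):
        return "failed"
    if remaining < activity["expected_duration"]:
        return "at risk"
    return "on track"

now = datetime(2016, 2, 25)
for a in activities:
    print(a["name"], "->", classify(a, now))   # 'at risk', 'completed'
```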

  15. A Basic Protein Comparative Three-Dimensional Modeling Methodological Workflow Theory and Practice.

    Science.gov (United States)

    Bitar, Mainá; Franco, Glória Regina

    2014-01-01

    When working with proteins and studying their properties, it is crucial to have access to the three-dimensional structure of the molecule. If experimentally solved structures are not available, comparative modeling techniques can be used to generate useful protein models to support structure-based research projects. In recent years, with Bioinformatics becoming the basis for the study of protein structures, there is a growing need to expose the details of the algorithms behind the software and servers, as well as a need for protocols to guide in silico predictive experiments. In this article, we explore the different steps of the comparative modeling technique, such as template identification, sequence alignment, generation of candidate structures and quality assessment, along with their peculiarities and theoretical background. We then present a practical step-by-step workflow to support the Biologist in the in silico generation of protein structures. Finally, we explore further steps in comparative modeling, presenting perspectives on the study of protein structures through Bioinformatics. We trust that this is a thorough guide for beginners who wish to work on the comparative modeling of proteins.
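
    Template identification, the first step above, is commonly driven by sequence identity between the target and candidate templates. The sketch below ranks templates by percent identity under the simplifying assumption that the sequences are already aligned; real pipelines obtain the alignment first (e.g. with BLAST), and the sequences and PDB identifiers here are made up.

```python
def percent_identity(target, template):
    """Percent identity between two pre-aligned sequences of equal
    length ('-' marks gaps); only columns aligned in both count."""
    pairs = [(a, b) for a, b in zip(target, template) if a != "-" and b != "-"]
    if not pairs:
        return 0.0
    matches = sum(a == b for a, b in pairs)
    return 100.0 * matches / len(pairs)

target    = "MKTAYIAKQR-QISFVKSHFSRQ"
templates = {                      # hypothetical template sequences
    "1abc_A": "MKTAYIAKQRTQISFVKSHFSRQ",
    "2xyz_B": "MKSAYLAKQK-QLSFIKAHFTRQ",
}
# Rank candidate templates; >30% identity is a common (rough)
# threshold for usable comparative-modelling templates.
for pdb_id, seq in sorted(templates.items(),
                          key=lambda kv: -percent_identity(target, kv[1])):
    print(pdb_id, round(percent_identity(target, seq), 1))
```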

  16. Data Exchange in Grid Workflow

    Institute of Scientific and Technical Information of China (English)

    ZENG Hongwei; MIAO Huaikou

    2006-01-01

    In existing web services-based workflows, data exchange across the web services is centralized: the workflow engine mediates at each step of the application sequence. However, many grid applications, especially data-intensive scientific applications, require exchanging large amounts of data across the grid services. Having a central workflow engine relay the data between the services would result in a bottleneck in these cases. This paper proposes a data exchange model for individual grid workflows and for multi-workflow composition, respectively. The model enables direct communication of large amounts of data between two grid services. To enable data exchange among multiple workflows, a bridge data service is used.

  17. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    Science.gov (United States)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capture and storage of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish a workflow for, and demonstrate, a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  18. MODELING WORKFLOW AND EVALUATING ITS RELIABILITY

    Institute of Scientific and Technical Information of China (English)

    林晖; 赵泽超; 张优云

    2001-01-01

    Products with complex forms involve highly complicated workflows in design, manufacture and market analysis, and these workflows carry a high degree of uncertainty, so a workflow management system (WMS) with greater flexibility is required. Dynamic, distributed and intelligent workflows are the best choice for supporting such complex process management, but they introduce uncertainty into the system, making reliability evaluation an important aspect of workflow design. An approach for analyzing workflows is presented: a mathematical model of the relationships among tasks in a workflow is established; a method for calculating the resource consumption and the uncertainty of flows is provided; a method for detecting cycles ("loops") contained in the system is given, so that resource consumption can still be computed in their presence; and an approach for evaluating system reliability is then derived. A case study based on an experimental platform exemplifies the introduced approach.
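
    Cycle ("loop") detection on the task graph can be done with a depth-first search; the sketch below is a generic detector on an adjacency-list task graph, with task names invented for illustration (it is not the paper's extraction procedure).

```python
def find_cycle(graph):
    """Return one cycle in a directed task graph (adjacency dict),
    or None. DFS with a recursion ('grey') stack."""
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {v: WHITE for v in graph}
    stack = []

    def dfs(v):
        colour[v] = GREY
        stack.append(v)
        for w in graph.get(v, ()):
            if colour[w] == GREY:                 # back edge: cycle found
                return stack[stack.index(w):] + [w]
            if colour[w] == WHITE:
                cycle = dfs(w)
                if cycle:
                    return cycle
        stack.pop()
        colour[v] = BLACK
        return None

    for v in graph:
        if colour[v] == WHITE:
            cycle = dfs(v)
            if cycle:
                return cycle
    return None

# Toy workflow: design -> analyse -> review -> design (a rework loop).
tasks = {"design": ["analyse"], "analyse": ["review"],
         "review": ["design", "release"], "release": []}
print(find_cycle(tasks))  # ['design', 'analyse', 'review', 'design']
```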

  19. A Novel Verification Approach of Workflow Schema

    Institute of Scientific and Technical Information of China (English)

    WANG Guangqi; WANG Juying; WANG Yan; SONG Baoyan; YU Ge

    2006-01-01

    A workflow schema is an abstract description of the business process handled by a workflow model, and plays a critical role in analyzing, executing and reorganizing business processes. Verifying the correctness of complicated workflow schemas is a difficult problem in the workflow field, and we study it intensively in this paper. We describe local errors and schema logic errors (global errors) in workflow schemas in detail, and offer constraint rules that help avoid schema errors during modeling. In addition, we propose a verification approach based on graph reduction and graph spread, and give the algorithm. The algorithm is implemented in a workflow prototype system, e-ScopeWork.

  20. A workflow example of PBPK modeling to support pediatric research and development: case study with lorazepam.

    Science.gov (United States)

    Maharaj, A R; Barrett, J S; Edginton, A N

    2013-04-01

    The use of physiologically based pharmacokinetic (PBPK) models in the field of pediatric drug development has garnered much interest of late due to a recent Food and Drug Administration recommendation. The purpose of this study is to illustrate the developmental processes involved in the creation of a pediatric PBPK model incorporating existing adult drug data. Lorazepam, a benzodiazepine utilized in both adults and children, was used as an example. A population PBPK model was developed in PK-Sim v4.2® and scaled to account for age-related changes in size and composition of tissue compartments, protein binding, and growth/maturation of elimination processes. Dose (milligrams per kilogram) requirements for children aged 0-18 years were calculated based on simulations that achieved targeted exposures based on adult references. Predictive accuracy of the PBPK model for producing comparable plasma concentrations among 63 pediatric subjects was assessed using average-fold error (AFE). Estimates of clearance (CL) and volume of distribution (V(ss)) were compared with observed values for a subset of 15 children using fold error (FE). Pediatric dose requirements in young children (1-3 years) exceeded adult levels on a linear weight-adjusted (milligrams per kilogram) basis. AFE values for model-derived concentration estimates were within 1.5- and 2-fold deviation from observed values for 73% and 92% of patients, respectively. For CL, 60% and 80% of predictions were within 1.5 and 2 FE, respectively. Comparatively, predictions of V(ss) were more accurate, with 80% and 100% of estimates within 1.5 and 2 FE, respectively. Using the presented workflow, the developed pediatric model estimated lorazepam pharmacokinetics in children as a function of age.
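
    The fold-error metrics used above are easy to reproduce. The sketch below applies one common pair of definitions (symmetric fold error, and AFE as 10 raised to the mean absolute log10 prediction/observation ratio) to made-up concentration pairs, not the study data.

```python
import numpy as np

def fold_error(pred, obs):
    """Symmetric fold error: max(pred/obs, obs/pred), always >= 1."""
    r = pred / obs
    return np.maximum(r, 1.0 / r)

def average_fold_error(pred, obs):
    """One common AFE definition: 10 ** mean(|log10(pred/obs)|)."""
    return 10 ** np.mean(np.abs(np.log10(pred / obs)))

# Illustrative plasma-concentration pairs (ng/mL), not study data.
obs  = np.array([42.0, 18.5, 7.9, 3.1])
pred = np.array([55.0, 16.0, 9.5, 5.9])

fe = fold_error(pred, obs)
print("AFE:", round(average_fold_error(pred, obs), 2))   # ~1.36
print("within 1.5-fold:", np.mean(fe <= 1.5))            # fraction of points
print("within 2-fold:  ", np.mean(fe <= 2.0))
```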

  1. Sharing on Web 3d Models of Ancient Theatres. a Methodological Workflow

    Science.gov (United States)

    Scianna, A.; La Guardia, M.; Scaduto, M. L.

    2016-06-01

    In the last few years, the need to share knowledge of Cultural Heritage (CH) on the Web through navigable 3D models has increased. This need requires the availability of Web-based virtual reality systems and 3D WebGIS. In order to make the information available to all stakeholders, these instruments should be powerful and at the same time very user-friendly. However, research and experiments carried out so far show that a standardized methodology does not yet exist. This is due both to the complexity and size of the geometric models to be published, on the one hand, and to the excessive costs of hardware and software tools, on the other. Against this background, the paper describes a methodological approach for creating 3D models of CH, freely exportable on the Web, based on HTML5 and free and open source software. HTML5, supporting the WebGL standard, allows the exploration of 3D spatial models using the most widely used Web browsers, such as Chrome, Firefox, Safari and Internet Explorer. The methodological workflow described here has been tested in the construction of a multimedia geo-spatial platform developed for the three-dimensional exploration and documentation of the ancient theatres of Segesta and Carthage and their surrounding landscapes. The experimental application has allowed us to explore the potential and limitations of sharing 3D CH models on the Web based on the WebGL standard. Sharing capabilities could be extended by defining suitable geospatial Web services based on the capabilities of HTML5 and WebGL technology.

  2. A Three-Layer Model for Business Processes——Process Logic, Case Semantics and Workflow Management

    Institute of Scientific and Technical Information of China (English)

    Chong-Yi Yuan; Wen Zhao; Shi-Kun Zhang; Yu Huang

    2007-01-01

    Workflow management aims at the controlling, monitoring, optimizing and supporting of business processes. Well designed formal models facilitate such management since they provide explicit representations of business processes as the basis for computerized analysis, verification and execution. Petri Nets have been recognized as the most suitable candidate for workflow modeling, and as such, formal models based on Petri Nets have been proposed; among them, WF-net by Aalst is the most popular one. But WF-net has turned out to be conceptually chaotic, as will be illustrated in this paper with an example from Aalst's book. This paper proposes a series of models for the description and analysis of business processes at conceptually different hierarchical layers. Analytic goals and methods at these layers are also discussed. The underlying structure, shared by all these models, is SYNCHRONIZER, which is designed with the guidance of the synchrony theory of GNT (General Net Theory) and serves as the conceptual foundation of workflow formal models. Structurally, synchronizers connect tasks to form a whole, while dynamically synchronizers control tasks to achieve synchronization.
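
    For readers unfamiliar with the formalism, the sketch below implements the elementary Petri-net firing rule that WF-nets build on: a transition is enabled when all of its input places hold tokens, and firing moves tokens from input places to output places. The toy net is illustrative, not taken from the paper.

```python
# Minimal Petri-net firing rule (the formalism WF-nets build on).
net = {  # transition -> (input places, output places)
    "register": (["start"], ["registered"]),
    "approve":  (["registered"], ["done"]),
}
marking = {"start": 1, "registered": 0, "done": 0}  # tokens per place

def enabled(t):
    """A transition is enabled iff every input place holds a token."""
    return all(marking[p] >= 1 for p in net[t][0])

def fire(t):
    """Fire an enabled transition: consume input tokens, produce output."""
    assert enabled(t), f"{t} is not enabled"
    for p in net[t][0]:
        marking[p] -= 1
    for p in net[t][1]:
        marking[p] += 1

fire("register")
fire("approve")
print(marking)  # {'start': 0, 'registered': 0, 'done': 1}
```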

  3. 3d-modelling workflows for trans-nationally shared geological models - first approaches from the project GeoMol

    Science.gov (United States)

    Rupf, Isabel

    2013-04-01

    To meet the EU's ambitious targets for carbon emission reduction, renewable energy production has to be strongly upgraded and made more efficient for grid energy storage. Alpine Foreland Basins feature a unique geological inventory which can contribute substantially to tackling these challenges. They offer a geothermal potential and storage capacity for compressed air, as well as space for underground storage of CO2. Exploiting these natural subsurface resources will strongly compete with existing oil and gas claims and groundwater issues. The project GeoMol will provide consistent 3-dimensional subsurface information about the Alpine Foreland Basins based on a holistic and transnational approach. The core of the GeoMol project is a geological framework model for the entire Northern Molasse Basin, complemented by five detailed models in pilot areas, also in the Po Basin, which are dedicated to specific questions of subsurface use. The models will consist of up to 13 litho-stratigraphic horizons ranging from the Cenozoic basin fill down to Mesozoic and late Paleozoic sedimentary rocks and the crystalline basement. More than 5000 wells and 28,000 km of seismic lines serve as input data sets for the geological subsurface model. The data have multiple sources and various acquisition dates, and their interpretations have gone through several paradigm changes. Therefore, it is necessary to standardize the data with regard to technical parameters and content prior to further analysis (cf. Capar et al. 2013, EGU2013-5349). Each partner will build its own geological subsurface model with different software solutions for seismic interpretation and 3d-modelling. Therefore, 3d-modelling follows different software- and partner-specific workflows. One of the main challenges of the project is to ensure a seamlessly fitting framework model. It is necessary to define several milestones for cross-border checks during the whole modelling process. Hence, the main input data set of the

  4. Surgical workflow analysis with Gaussian mixture multivariate autoregressive (GMMAR) models: a simulation study.

    Science.gov (United States)

    Loukas, Constantinos; Georgiou, Evangelos

    2013-01-01

    There is currently great interest in analyzing the workflow of minimally invasive operations performed in a physical or simulation setting, with the aim of extracting important information that can be used for skills improvement, optimization of intraoperative processes, and comparison of different interventional strategies. The first step in achieving this goal is to segment the operation into its key interventional phases, which is currently approached by modeling a multivariate signal that describes the temporal usage of a predefined set of tools. Although this technique has shown promising results, it is challenged by the manual extraction of the tool usage sequence and the inability to simultaneously evaluate the surgeon's skills. In this paper we describe an alternative methodology for surgical phase segmentation and performance analysis based on Gaussian mixture multivariate autoregressive (GMMAR) models of the hand kinematics. Unlike previous work in this area, our technique employs signals from orientation sensors, attached to the endoscopic instruments of a virtual reality simulator, without considering which tools are employed at each time-step of the operation. First, based on pre-segmented hand motion signals, a training set of regression coefficients is created for each surgical phase using multivariate autoregressive (MAR) models. Then, a signal from a new operation is processed with GMMAR, wherein each phase is modeled by a Gaussian component of regression coefficients. These coefficients are compared to those of the training set. The operation is segmented according to the prior probabilities of the surgical phases estimated via GMMAR. The method also allows for the study of motor behavior and hand motion synchronization demonstrated in each phase, a quality that can be incorporated into modern laparoscopic simulators for skills assessment.
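
    A minimal sketch of the general idea, not the authors' implementation: fit autoregressive coefficients on windows of a motion signal, then cluster the coefficient vectors with a Gaussian mixture so that each component stands for one phase. The synthetic signals and all parameters are invented.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def ar_coefficients(x, order=3):
    """Least-squares AR(order) fit for one signal window: the
    coefficients summarise the window's motion dynamics."""
    X = np.column_stack([x[order - k - 1: len(x) - k - 1] for k in range(order)])
    coef, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
    return coef

def simulate(a, n, rng):
    """Toy AR(1) 'hand motion' signal with dynamics parameter a."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = a * x[t - 1] + rng.normal(scale=0.1)
    return x

rng = np.random.default_rng(1)
phase1, phase2 = simulate(0.9, 400, rng), simulate(-0.5, 400, rng)
windows = [phase1[i:i + 80] for i in range(0, 320, 80)] + \
          [phase2[i:i + 80] for i in range(0, 320, 80)]
features = np.array([ar_coefficients(w) for w in windows])

# Each mixture component plays the role of one surgical phase.
gmm = GaussianMixture(n_components=2, random_state=0).fit(features)
print(gmm.predict(features))   # windows separate into the two phases
```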

  5. Use of a Workflow Engine to Create a Hydrologic Community Modeling System

    Science.gov (United States)

    Piasecki, M.; Lu, B.

    2011-12-01

    With the increasing use of internet infrastructure, the hydrologic modeling community is seeking to adopt new paradigms for its modeling needs. One such paradigm is the use of a Community Modeling System where the community can funnel its efforts to create widely accepted modeling approaches, share the data needed, and also exchange results for defined benchmark problems. Several possibilities have been explored and quite a number of strategies have been deployed to make a community system a reality. Also, the possibility of using the "cloud" for high-performance computing, and SOAP-based machine-to-machine communication, has made it possible to spread computational cycles and simulation data needs across the globe, allowing an unprecedented degree of connectivity both to data sources and between computational modules. Data is now often just a web service call away, providing much faster access to it than just a few years ago. However, in order to provide a "plug" for all these high-power venues of data transmission, a user typically needs an environment where these lines end and can be harvested into desired forms and formats. In this paper we use a workflow engine (TRIDENT) to provide the framework for the data terminus, in which we provide a library of data connections that are conditioned to allow hydrologic modeling tasks to be carried out. We also use this environment to host the hydrologic modules (or, when using Web Processing Service calls, the call client), for which we have developed a library of hydrologic process activities such as evapotranspiration, infiltration and flood routing, among others. The system also contains more complex models that we have adapted to function within this environment, such as SWAT, with the intent to demonstrate how legacy codes can be ported and embedded within this environment. We will report on some of our experiences in developing data access modules, the use of web-service standards such

  6. DEVELOPMENT OF WATER CIRCULATION MODEL INCLUDING IRRIGATION

    Science.gov (United States)

    Kotsuki, Shunji; Tanaka, Kenji; Kojiri, Toshiharu; Hamaguchi, Toshio

    It is well known that agricultural water withdrawal strongly affects the water circulation system, so accurate analysis of river discharge or water balance is difficult if it is disregarded. In this study, a water circulation model composed of a land surface model and a distributed runoff model is proposed at 10 km × 10 km resolution. In this model, irrigation water, which is estimated with the land surface model, is introduced into the river discharge analysis. The model is applied to the Chao Phraya River in Thailand, and reproduces the seasonal water balance. Additionally, the simulated discharge in the dry season is improved as a result of including irrigation. Since the model, which is built essentially from global data sets, simulates the seasonal change of river discharge, it is suggested that the model is transferable to other river basins.

  7. An integrated workflow for stress and flow modelling using outcrop-derived discrete fracture networks

    DEFF Research Database (Denmark)

    Bisdom, Kevin; Nick, Hamid; Bertotti, Giovanni

    2017-01-01

    stress-sensitive fracture permeability and matrix flow to determine the full permeability tensor. The applicability of this workflow is illustrated using an outcropping carbonate pavement in the Potiguar basin in Brazil, from which 1082 fractures are digitised. The permeability tensor for a range of matrix

  8. Constructing reservoir-scale 3D geomechanical FE-models. A refined workflow for model generation and calculation

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, K.; Henk, A. [Technische Univ. Darmstadt (Germany). Inst. fuer Angewandte Geowissenschaften

    2013-08-01

    The tectonic stress field strongly affects the optimal exploitation of conventional and unconventional hydrocarbon reservoirs. Amongst others, wellbore stability, the orientation of hydraulically induced fractures and - particularly in fractured reservoirs - permeability anisotropies depend on the magnitudes and orientations of the recent stresses. Geomechanical reservoir models can provide unique insights into the tectonic stress field, revealing the local perturbations resulting from faults and lithological changes. In order to provide robust predictions, such numerical models are based on the finite element (FE) method and account for the complexities of real reservoirs with respect to subsurface geometry, inhomogeneous material distribution and nonlinear rock mechanical behavior. We present a refined workflow for geomechanical reservoir modeling which allows for an easier set-up of the model geometry, high-resolution submodels and faster calculation times due to element savings in the load frame. Transferring the reservoir geometry from the geological subsurface model, e.g., a Petrel® project, to the FE model represents a special challenge as the faults are discontinuities in the numerical model and no direct interface exists between the two software packages used. Point clouds displaying faults and lithostratigraphic horizons can be used for geometry transfer, but this labor-intensive approach is not feasible for complex field-scale models with numerous faults. Instead, so-called Coons patches based on horizon lines, i.e. the intersection lines between horizons and faults, are well suited to re-generate the various surfaces in the FE software while maintaining their topology. High-resolution submodels of individual fault blocks can be incorporated into the field-scale model. This makes it possible to consider both a locally refined mechanical stratigraphy and the impact of the large-scale fault pattern. A pressure load on top of the model represents the
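
    A bilinearly blended Coons patch interpolates a surface from its four boundary curves, which is why it suits re-generating horizon surfaces from their bounding horizon lines. The sketch below implements the classic blending formula on a toy quadrilateral; it is illustrative, not the workflow's code.

```python
import numpy as np

def coons_patch(bottom, top, left, right, u, v):
    """Bilinearly blended Coons patch.

    bottom(u), top(u), left(v), right(v) return 3-D points on the four
    boundary curves; corners must match (bottom(0) == left(0), etc.).
    """
    ruled_u = (1 - v) * bottom(u) + v * top(u)       # blend in v
    ruled_v = (1 - u) * left(v) + u * right(v)       # blend in u
    bilinear = ((1 - u) * (1 - v) * bottom(0) + u * (1 - v) * bottom(1)
                + (1 - u) * v * top(0) + u * v * top(1))
    return ruled_u + ruled_v - bilinear              # remove double-counted corners

# Toy horizon patch bounded by four straight fault/horizon traces.
p00, p10, p01, p11 = map(np.array, ([0, 0, 0.0], [1, 0, 0.2],
                                    [0, 1, 0.1], [1, 1, 0.4]))
bottom = lambda u: (1 - u) * p00 + u * p10
top    = lambda u: (1 - u) * p01 + u * p11
left   = lambda v: (1 - v) * p00 + v * p01
right  = lambda v: (1 - v) * p10 + v * p11

print(coons_patch(bottom, top, left, right, 0.5, 0.5))  # interior point
```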

  9. Standardizing clinical trials workflow representation in UML for international site comparison.

    Science.gov (United States)

    de Carvalho, Elias Cesar Araujo; Jayanti, Madhav Kishore; Batilana, Adelia Portero; Kozan, Andreia M O; Rodrigues, Maria J; Shah, Jatin; Loures, Marco R; Patil, Sunita; Payne, Philip; Pietrobon, Ricardo

    2010-11-09

    With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring the full commitment of CRCs, transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter-duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trials.

  10. Observing health professionals' workflow patterns for diabetes care - First steps towards an ontology for EHR services.

    Science.gov (United States)

    Schweitzer, M; Lasierra, N; Hoerbst, A

    2015-01-01

    Increasing flexibility from a user perspective and enabling workflow-based interaction facilitates easy, user-friendly utilization of EHRs in healthcare professionals' daily work. To offer such versatile EHR functionality, our approach is based on the execution of clinical workflows by means of a composition of semantic web services. The backbone of such an architecture is an ontology which enables the representation of clinical workflows and facilitates the selection of suitable services. In this paper we present the methods and results of observations of routine diabetes consultations which were conducted in order to identify those workflows and the relations among the included tasks. The workflows were first modeled in BPMN and then generalized. As a next step in our study, interviews will be conducted with clinical personnel to validate the modeled workflows.

  11. 3-D geomechanical modelling of a gas reservoir in the North German Basin: workflow for model building and calibration

    Directory of Open Access Journals (Sweden)

    K. Fischer

    2013-06-01

    The optimal use of conventional and unconventional hydrocarbon reservoirs depends, amongst others, on the local tectonic stress field. For example, wellbore stability, the orientation of hydraulically induced fractures and - especially in fractured reservoirs - permeability anisotropies are controlled by the recent in situ stresses. Faults and lithological changes can lead to stress perturbations and produce local stresses that can significantly deviate from the regional stress field. Geomechanical reservoir models aim for a robust, ideally "pre-drilling" prediction of the local variations in stress magnitude and orientation. This requires a numerical modelling approach that is capable of incorporating the specific geometry and mechanical properties of the subsurface reservoir. The workflow presented in this paper can be used to build 3-D geomechanical models based on the Finite Element Method (FEM), ranging from field-scale models to smaller, detailed submodels of individual fault blocks. The approach is successfully applied to an intensively faulted gas reservoir in the North German Basin. The in situ stresses predicted by the geomechanical FE model were calibrated against actually observed stress data, e.g. borehole breakouts and extended leak-off tests. Such a validated model can provide insights into the stress perturbations in the inter-well space and undrilled parts of the reservoir. In addition, the tendency of the existing fault network to slip or dilate in the present-day stress regime can be addressed.

  12. A Photogrammetric Workflow for the Creation of a Forest Canopy Height Model from Small Unmanned Aerial System Imagery

    Directory of Open Access Journals (Sweden)

    Philippe Lejeune

    2013-11-01

    The recent development of operational small unmanned aerial systems (UASs) opens the door for their extensive use in forest mapping, as both the spatial and temporal resolution of UAS imagery better suit local-scale investigation than traditional remote sensing tools. This article focuses on the use of combined photogrammetry and "Structure from Motion" approaches in order to model the forest canopy surface from low-altitude aerial images. An original workflow, using the open source and free photogrammetric toolbox MICMAC (an acronym for Multi Image Matches for Auto Correlation Methods), was set up to create a digital canopy surface model of deciduous stands. In combination with a co-registered light detection and ranging (LiDAR) digital terrain model, the elevation of vegetation was determined, and the resulting hybrid photo/LiDAR canopy height model was compared to data from a LiDAR canopy height model and from forest inventory data. Linear regressions predicting dominant height and individual height from plot metrics and crown metrics showed that the photogrammetric canopy height model was of good quality for deciduous stands. Although photogrammetric reconstruction significantly smooths the canopy surface, the use of this workflow has the potential to take full advantage of the flexible revisit period of drones in order to refresh the LiDAR canopy height model and to collect dense multitemporal canopy height series.
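
    The hybrid canopy height model reduces to a per-cell difference once the photogrammetric surface model and the LiDAR terrain model are co-registered on the same grid. A minimal numpy sketch with made-up raster values:

```python
import numpy as np

# Co-registered rasters (same grid): photogrammetric digital surface
# model (canopy top) and LiDAR digital terrain model (bare earth).
dsm = np.array([[312.4, 315.1], [318.0, 310.2]])   # metres a.s.l.
dtm = np.array([[295.0, 296.2], [297.5, 296.0]])

chm = dsm - dtm                      # canopy height model
chm = np.clip(chm, 0, None)          # negative heights are matching noise
print(chm)        # per-cell vegetation height in metres
print(chm.max())  # e.g. a dominant-height proxy for the plot
```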

  13. E-BioFlow: Different perspectives on scientific workflows

    NARCIS (Netherlands)

    Wassink, I.; Rauwerda, H.; van der Vet, P.; Breit, T.; Nijholt, A.

    2008-01-01

    We introduce a new type of workflow design system called e-BioFlow and illustrate it by means of a simple sequence alignment workflow. E-BioFlow, intended to model advanced scientific workflows, enables the user to model a workflow from three different but strongly coupled perspectives: the control

  15. A standard-enabled workflow for synthetic biology

    KAUST Repository

    Myers, Chris J.

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce many types of design information, with multiple repositories and software tools exchanging it using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in their publications.

  16. CSP for Executable Scientific Workflows

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard

    This thesis presents CSP as a means of orchestrating the execution of tasks in a scientific workflow. Scientific workflow systems are popular in a wide range of scientific areas, where tasks are organised in directed graphs. Execution of such graphs is handled by the scientific workflow systems and is demonstrated through examples. By providing a robust library for organising scientific workflows in a Python application, I hope to inspire scientific users to adopt PyCSP. As a proof-of-concept, this thesis demonstrates three scientific applications - kNN, stochastic minimum search and McStas - shown to scale well on multi-processing and cluster computing using PyCSP. Additionally, McStas is demonstrated to utilise grid computing resources using PyCSP. Finally, this thesis presents a new dynamic channel model, which has not yet been implemented for PyCSP. The dynamic channel is able to change the internal
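
    PyCSP has its own process and channel API; purely to illustrate the CSP style of orchestration (processes communicating over channels), here is a generic Python sketch using threads and a queue as a stand-in channel.

```python
import threading, queue

def producer(chan, n):
    """CSP-style process: writes task results to its output channel."""
    for i in range(n):
        chan.put(i * i)
    chan.put(None)                       # "poison" to shut the network down

def consumer(chan):
    """CSP-style process: reads from its input channel until poisoned."""
    while (item := chan.get()) is not None:
        print("got", item)

channel = queue.Queue()                  # stands in for a CSP channel
procs = [threading.Thread(target=producer, args=(channel, 5)),
         threading.Thread(target=consumer, args=(channel,))]
for p in procs:
    p.start()
for p in procs:
    p.join()
```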

  17. Models of bovine babesiosis including juvenile cattle.

    Science.gov (United States)

    Saad-Roy, C M; Shuai, Zhisheng; van den Driessche, P

    2015-03-01

    Bovine Babesiosis in cattle is caused by the transmission of protozoa of Babesia spp. by ticks as vectors. Juvenile cattle are resistant to Bovine Babesiosis, rarely show symptoms, and acquire immunity upon recovery. Susceptibility to the disease varies between breeds of cattle. Models of the dynamics of Bovine Babesiosis transmitted by the cattle tick that include these factors are formulated as systems of ordinary differential equations. Basic reproduction numbers are calculated, and it is proved that if these numbers are below the threshold value of one, then Bovine Babesiosis dies out. However, above the threshold value of one, the disease may approach an endemic state. In this case, control measures are suggested by determining target reproduction numbers. The percentage of a particular population (for example, the adult bovine population) that needs to be controlled to eradicate the disease is evaluated numerically using Colombian data from the literature.
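
    To illustrate the threshold behaviour described above, here is a sketch of a much-simplified host-vector model (it omits the paper's juvenile/adult stratification) with an R0 computed by the standard next-generation argument; all parameter values are invented.

```python
import numpy as np
from scipy.integrate import odeint

# Toy host-vector model in population proportions. Parameters illustrative.
b_hv, b_vh = 0.30, 0.25   # transmission rates: vector->host, host->vector
gamma, mu = 0.10, 0.20    # host recovery rate, tick mortality/turnover

def rhs(y, t):
    sh, ih, sv, iv = y
    return [-b_hv * sh * iv,
            b_hv * sh * iv - gamma * ih,
            mu - b_vh * sv * ih - mu * sv,
            b_vh * sv * ih - mu * iv]

R0 = np.sqrt(b_hv * b_vh / (gamma * mu))   # next-generation threshold
print("R0 =", round(R0, 2))                 # > 1: disease can persist

t = np.linspace(0, 400, 1000)
traj = odeint(rhs, [0.99, 0.01, 1.0, 0.0], t)
print("endemic host prevalence ~", round(traj[-1, 1], 3))
```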

  18. Study on bionic flexible workflow modeling and adaptation algorithm

    Institute of Scientific and Technical Information of China (English)

    王颖慧; 王东勃; 王增磊; 刘志忠

    2011-01-01

    To improve the speed at which a flexible workflow responds to external dynamic changes, the biological reflex mechanism is introduced into flexible workflows and a flexible-workflow neural network system is constructed. Imitating the biological reflex response to external stimuli, a bionic flexible workflow model is built using artificial neural network technology, and the concept model of the artificial neural network within it is defined. Based on this model, a framework for a flexible workflow adaptation algorithm is proposed. Finally, taking the setting of working-hour quotas in an enterprise production-planning node as an example, a bionic flexible workflow model whose processing neural group is a BP network is constructed, and the flexible workflow adaptation algorithm is simulated. Simulation results show that the established model responds correctly to dynamic parameter changes, demonstrating that the bionic flexible workflow adaptation algorithm can respond intelligently to external dynamic changes.

  19. Vel-IO 3D: A tool for 3D velocity model construction, optimization and time-depth conversion in 3D geological modeling workflow

    Science.gov (United States)

    Maesano, Francesco E.; D'Ambrogi, Chiara

    2017-02-01

    We present Vel-IO 3D, a tool for 3D velocity model creation and time-depth conversion, as part of a workflow for 3D model building. The workflow addresses the management of large subsurface datasets, mainly seismic lines and well logs, and the construction of a 3D velocity model able to describe the variation of the velocity parameters related to strong facies and thickness variability and to high structural complexity. Although it is applicable in many geological contexts (e.g. foreland basins, large intermountain basins), it is particularly suitable in wide flat regions, where subsurface structures have no surface expression. The Vel-IO 3D tool is composed of three scripts, written in Python 2.7.11, that automate i) the 3D instantaneous velocity model building, ii) the velocity model optimization, and iii) the time-depth conversion. They determine a 3D geological model that is consistent with the primary geological constraints (e.g. depth of the markers on wells). The proposed workflow and the Vel-IO 3D tool were tested, during the EU-funded project GeoMol, by the construction of the 3D geological model of a flat region, 5700 km2 in area, located in the central part of the Po Plain. The final 3D model showed the efficiency of the workflow and the Vel-IO 3D tool in the management of large amounts of data in both the time and depth domains. A four-layer-cake velocity model was applied to a succession several thousand metres thick (5000-13,000 m), with 15 horizons from the Triassic up to the Pleistocene, complicated by Mesozoic extensional tectonics and by buried thrusts related to the Southern Alps and Northern Apennines.
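
    Instantaneous-velocity time-depth conversion of the kind such tools perform has a closed form for a linear velocity law V(z) = V0 + kz: one-way time t gives z = (V0/k)(e^{kt} - 1). A sketch with illustrative parameters, not GeoMol's calibrated values:

```python
import numpy as np

def twt_to_depth(twt_s, v0, k):
    """Depth of a horizon from two-way time using the linear
    instantaneous-velocity law V(z) = v0 + k*z.

    One-way time t satisfies dz/dt = v0 + k*z, which integrates to
    z(t) = (v0 / k) * (exp(k * t) - 1).
    """
    t_oneway = np.asarray(twt_s) / 2.0
    return (v0 / k) * np.expm1(k * t_oneway)

# Illustrative layer parameters (m/s and 1/s), e.g. a Cenozoic fill.
twt = [0.5, 1.0, 2.0]                        # two-way times in seconds
print(twt_to_depth(twt, v0=1800.0, k=0.4))   # depths in metres
```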

  20. Planning bioinformatics workflows using an expert system.

    Science.gov (United States)

    Chen, Xiaoling; Chang, Jeffrey T

    2017-04-15

    Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system composed of a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. https://github.com/jefftc/changlab. jeffrey.t.chang@uth.tmc.edu.
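
    The core of backwards chaining is easy to sketch: to obtain a goal data type, find a rule producing it and recursively plan for the rule's inputs. The rules below are hypothetical stand-ins, not BETSY's knowledge base.

```python
# Minimal backward-chaining planner: to obtain a goal data type, find
# a rule producing it and recursively plan for the rule's inputs.
RULES = [  # (step name, input data types, output data type)
    ("align_reads",   ["fastq", "reference_genome"], "bam"),
    ("call_variants", ["bam", "reference_genome"],   "vcf"),
    ("annotate",      ["vcf"],                       "annotated_vcf"),
]
AVAILABLE = {"fastq", "reference_genome"}   # data already on hand

def plan(goal, seen=frozenset()):
    """Return an ordered list of steps producing `goal`, or None."""
    if goal in AVAILABLE:
        return []
    for name, inputs, output in RULES:
        if output == goal and goal not in seen:
            steps = []
            for inp in inputs:
                sub = plan(inp, seen | {goal})
                if sub is None:
                    break
                steps += [s for s in sub if s not in steps]
            else:
                return steps + [name]
    return None

print(plan("annotated_vcf"))
# ['align_reads', 'call_variants', 'annotate']
```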

  1. Workflow User Interfaces Patterns

    Directory of Open Access Journals (Sweden)

    Jean Vanderdonckt

    2012-03-01

    A collection of user interface design patterns for workflow information systems is presented that contains forty-three resource patterns classified in seven categories. These categories and their corresponding patterns have been logically identified from the task life cycle, based on offering and allocation operations. Each Workflow User Interface Pattern (WUIP) is characterized by properties expressed in the PLML markup language for expressing patterns and augmented by additional attributes and models attached to the pattern: the abstract user interface and the corresponding task model. These models are specified in a User Interface Description Language. All WUIPs are stored in a library and can be retrieved within a workflow editor that links each workflow pattern to its corresponding WUIP, thus giving rise to a user interface for each workflow pattern.

  2. Search and Result Presentation in Scientific Workflow Repositories

    OpenAIRE

    Davidson, Susan B.; Huang, Xiaocheng; Stoyanovich, Julia; Yuan, Xiaojie

    2013-01-01

    We study the problem of searching a repository of complex hierarchical workflows whose component modules, both composite and atomic, have been annotated with keywords. Since keyword search does not use the graph structure of a workflow, we develop a model of workflows using context-free bag grammars. We then give efficient polynomial-time algorithms that, given a workflow and a keyword query, determine whether some execution of the workflow matches the query. Based on these algorithms we deve...

  3. A Comparison of Using Taverna and BPEL in Building Scientific Workflows: the case of caGrid.

    Science.gov (United States)

    Tan, Wei; Missier, Paolo; Foster, Ian; Madduri, Ravi; Goble, Carole

    2010-06-25

    With the emergence of "service oriented science," the need arises to orchestrate multiple services to facilitate scientific investigation, that is, to create "science workflows." We present here our findings in providing a workflow solution for the caGrid service-based grid infrastructure. We chose BPEL and Taverna as candidates and compared their usability across the lifecycle of a scientific workflow, including workflow composition, execution, and result analysis. Our experience shows that BPEL, as an imperative language, offers a comprehensive set of modeling primitives for workflows of all flavors, while Taverna offers a dataflow model and a more compact set of primitives that facilitates dataflow modeling and pipelined execution. We hope that this comparison study not only helps researchers select a language or tool that meets their specific needs, but also offers some insight into how a workflow language and tool can fulfill the requirements of the scientific community.

  4. An Integrated Biochemistry Laboratory, Including Molecular Modeling

    Science.gov (United States)

    Hall, Adele J. Wolfson Mona L.; Branham, Thomas R.

    1996-11-01

    (ii) experience with methods of protein purification; (iii) incorporation of appropriate controls into experiments; (iv) use of basic statistics in data analysis; (v) writing papers and grant proposals in accepted scientific style; (vi) peer review; (vii) oral presentation of results and proposals; and (viii) introduction to molecular modeling. Figure 1 illustrates the modular nature of the lab curriculum. Elements from each of the exercises can be separated and treated as stand-alone exercises, or combined into short or long projects. We have been able to offer the opportunity to use sophisticated molecular modeling in the final module through funding from an NSF-ILI grant. However, many of the benefits of the research proposal can be achieved with other computer programs, or even by literature survey alone. [Figure 1. Design of project-based biochemistry laboratory. Modules (projects, or portions of projects) are indicated as boxes; each can be treated independently, or used as part of a larger project, with solid lines indicating suggested paths from one module to the next.] The skills and knowledge required for protein purification and design are developed in three units: (i) an introduction to critical assays needed to monitor the degree of purification, including an evaluation of assay parameters; (ii) partial purification by ion-exchange techniques; and (iii) preparation of a grant proposal on protein design by mutagenesis. Brief descriptions of each of these units follow, with experimental details of each project at the end of this paper. Assays for Lysozyme Activity and Protein Concentration (4 weeks): The assays mastered during the first unit are a necessary tool for determining the purity of the enzyme during the second unit on purification by ion exchange. These assays allow an introduction to the concept of specific activity (units of enzyme activity per milligram of total protein) as a measure of purity. In this first sequence, students learn a turbidimetric assay

  5. BEAM: A computational workflow system for managing and modeling material characterization data in HPC environments

    Energy Technology Data Exchange (ETDEWEB)

    Lingerfelt, Eric J [ORNL; Endeve, Eirik [ORNL; Ovchinnikov, Oleg S [ORNL; Borreguero Calvo, Jose M [ORNL; Park, Byung H [ORNL; Archibald, Richard K [ORNL; Symons, Christopher T [ORNL; Kalinin, Sergei V [ORNL; Messer, Bronson [ORNL; Shankar, Mallikarjun [ORNL; Jesse, Stephen [ORNL

    2016-01-01

    Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales, many spectroscopic modes, and now, with the rise of multimodal acquisition systems and the associated processing capability, the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges best summarized as a necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides material scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation via an intuitive, cross-platform client user interface. This framework delivers authenticated, push-button execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing the converged compute-and-data infrastructure at Oak Ridge National Laboratory's (ORNL) Compute and Data Environment for Science (CADES) and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF). In this work we address the underlying HPC needs for characterization in the material science community, elaborate how BEAM's design and infrastructure tackle those needs, and present a small subset of user cases where scientists utilized BEAM across a broad range of analytical techniques and analysis modes.

  6. Review time in peer review: quantitative analysis and modelling of editorial workflows.

    Science.gov (United States)

    Mrowinski, Maciej J; Fronczak, Agata; Fronczak, Piotr; Nedic, Olgica; Ausloos, Marcel

    In this paper, we undertake a data-driven theoretical investigation of editorial workflows. We analyse a dataset containing information about 58 papers submitted to the Biochemistry and Biotechnology section of the Journal of the Serbian Chemical Society. We separate the peer review process into stages that each paper has to go through and introduce the notion of completion rate - the probability that an invitation sent to a potential reviewer will result in a finished review. Using empirical transition probabilities and probability distributions of the duration of each stage we create a directed weighted network, the analysis of which allows us to obtain the theoretical probability distributions of review time for different classes of reviewers. These theoretical distributions underlie our numerical simulations of different editorial strategies. Through these simulations, we test the impact of some modifications of the editorial policy on the efficiency of the whole review process. We discover that the distribution of review time is similar for all classes of reviewers, and that the completion rate of reviewers known personally by the editor is very high, which means that they are much more likely to answer the invitation and finish the review than other reviewers. Thus, the completion rate is the key factor that determines the efficiency of each editorial policy. Our results may be of great importance for editors and act as a guide in determining the optimal number of reviewers.
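
    The completion-rate finding lends itself to a quick Monte Carlo check: the sketch below simulates repeatedly inviting reviewers until one completes, with invented duration distributions rather than the journal's empirical ones.

```python
import random

random.seed(7)

def simulate_review(completion_rate, response_days=(3, 14), review_days=(10, 30)):
    """Keep inviting reviewers until one completes; return total days
    and number of invitations. Durations drawn uniformly (illustrative)."""
    days, invitations = 0.0, 0
    while True:
        invitations += 1
        days += random.uniform(*response_days)      # waiting for an answer
        if random.random() < completion_rate:
            days += random.uniform(*review_days)    # review gets written
            return days, invitations

for rate in (0.8, 0.3):   # e.g. the editor's acquaintances vs cold invites
    runs = [simulate_review(rate) for _ in range(10000)]
    mean_days = sum(d for d, _ in runs) / len(runs)
    mean_inv = sum(i for _, i in runs) / len(runs)
    print(f"completion rate {rate}: ~{mean_days:.0f} days, ~{mean_inv:.1f} invitations")
```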

  7. Seepage Model for PA Including Drift Collapse

    Energy Technology Data Exchange (ETDEWEB)

    G. Li; C. Tsang

    2000-12-20

    The purpose of this Analysis/Model Report (AMR) is to document the predictions and analysis performed using the Seepage Model for Performance Assessment (PA) and the Disturbed Drift Seepage Submodel for both the Topopah Spring middle nonlithophysal and lower lithophysal lithostratigraphic units at Yucca Mountain. These results will be used by PA to develop the probability distribution of water seepage into waste-emplacement drifts at Yucca Mountain, Nevada, as part of the evaluation of the long-term performance of the potential repository. This AMR is in accordance with the "Technical Work Plan for Unsaturated Zone (UZ) Flow and Transport Process Model Report" (CRWMS M&O 2000 [153447]). This purpose is accomplished by performing numerical simulations with stochastic representations of hydrological properties, using the Seepage Model for PA, and evaluating the effects of an alternative drift geometry representing a partially collapsed drift using the Disturbed Drift Seepage Submodel. Seepage of water into waste-emplacement drifts is considered one of the principal factors having the greatest impact on the long-term safety of the repository system (CRWMS M&O 2000 [153225], Table 4-1). This AMR supports the analysis and simulation that are used by PA to develop the probability distribution of water seepage into drifts, and is therefore a model of primary (Level 1) importance (AP-3.15Q, "Managing Technical Product Inputs"). The intended purpose of the Seepage Model for PA is to support: (1) PA; (2) Abstraction of Drift-Scale Seepage; and (3) the Unsaturated Zone (UZ) Flow and Transport Process Model Report (PMR). Seepage into drifts is evaluated by applying numerical models with stochastic representations of hydrological properties and performing flow simulations with multiple realizations of the permeability field around the drift. The Seepage Model for PA uses the distribution of permeabilities derived from air injection testing in

  8. Research on co-approval oriented workflow model construction

    Institute of Scientific and Technical Information of China (English)

    施雅贤; 王继伟; 郑荣纬; 胡素芳

    2011-01-01

    Given the particularities of the co-approval mechanism, a three-dimensional co-approval workflow model and its building process are proposed. The formal description of the three sub-model types - organization, data, and process - within the co-approval mechanism is analyzed in detail. Compared with other workflow model examples, the three-dimensional co-approval workflow model not only describes the business processes clearly through the IPO model, but also reflects the critical data flows accessed by the business process and the organizational structure, which benefits the more systematic and scientific development of co-approval workflows.

  9. Designing a road map for geoscience workflows

    Science.gov (United States)

    Duffy, Christopher; Gil, Yolanda; Deelman, Ewa; Marru, Suresh; Pierce, Marlon; Demir, Ibrahim; Wiener, Gerry

    2012-06-01

    Advances in geoscience research and discovery are fundamentally tied to data and computation, but formal strategies for managing the diversity of models and data resources in the Earth sciences have not yet been resolved or fully appreciated. The U.S. National Science Foundation (NSF) EarthCube initiative (http://earthcube.ning.com), which aims to support community-guided cyberinfrastructure to integrate data and information across the geosciences, recently funded four community development activities: Geoscience Workflows; Semantics and Ontologies; Data Discovery, Mining, and Integration; and Governance. The Geoscience Workflows working group, with broad participation from the geosciences, cyberinfrastructure, and other relevant communities, is formulating a workflows road map (http://sites.google.com/site/earthcubeworkflow/). The Geoscience Workflows team coordinates with each of the other community development groups given their direct relevance to workflows. Semantics and ontologies are mechanisms for describing workflows and the data they process.

  10. Enhanced battery model including temperature effects

    NARCIS (Netherlands)

    Rosca, B.; Wilkins, S.

    2013-01-01

    Within electric and hybrid vehicles, batteries are used to provide/buffer the energy required for driving. However, battery performance varies throughout the temperature range specific to automotive applications, and as such, models that describe this behaviour are required. This paper presents a dy

  13. From Peer-Reviewed to Peer-Reproduced in Scholarly Publishing: The Complementary Roles of Data Models and Workflows in Bioinformatics.

    Science.gov (United States)

    González-Beltrán, Alejandra; Li, Peter; Zhao, Jun; Avila-Garcia, Maria Susana; Roos, Marco; Thompson, Mark; van der Horst, Eelke; Kaliyaperumal, Rajaram; Luo, Ruibang; Lee, Tin-Lap; Lam, Tak-Wah; Edmunds, Scott C; Sansone, Susanna-Assunta; Rocca-Serra, Philippe

    2015-01-01

    Reproducing the results from a scientific paper can be challenging due to the absence of data and the computational tools required for their analysis. In addition, details relating to the procedures used to obtain the published results can be difficult to discern due to the use of natural language when reporting how experiments have been performed. The Investigation/Study/Assay (ISA), Nanopublications (NP), and Research Objects (RO) models are conceptual data modelling frameworks that can structure such information from scientific papers. Computational workflow platforms can also be used to reproduce analyses of data in a principled manner. We assessed the extent to which ISA, NP, and RO models, together with the Galaxy workflow system, can capture the experimental processes and reproduce the findings of a previously published paper reporting on the development of SOAPdenovo2, a de novo genome assembler. Executable workflows were developed using Galaxy, which reproduced results that were consistent with the published findings. A structured representation of the information in the SOAPdenovo2 paper was produced by combining the use of ISA, NP, and RO models. By structuring the information in the published paper using these data and scientific workflow modelling frameworks, it was possible to explicitly declare elements of experimental design, variables, and findings. The models served as guides in the curation of scientific information, and this led to the identification of inconsistencies in the original published paper, thereby allowing its authors to publish corrections in the form of errata. SOAPdenovo2 scripts, data, and results are available through the GigaScience Database: http://dx.doi.org/10.5524/100044; the workflows are available from GigaGalaxy: http://galaxy.cbiit.cuhk.edu.hk; and the representations using the ISA, NP, and RO models are available through the SOAPdenovo2 case study website http://isa-tools.github.io/soapdenovo2/. philippe

  14. Pro WF: Windows Workflow in .NET 4.0

    CERN Document Server

    Bukovics, Bruce

    2010-01-01

    Windows Workflow Foundation (WF) is a revolutionary part of the .NET 4 Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for workflow-based solutions has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been composed into a wor

  15. Integration of services into workflow applications

    CERN Document Server

    Czarnul, Pawel

    2015-01-01

    Describing state-of-the-art solutions in distributed system architectures, Integration of Services into Workflow Applications presents a concise approach to the integration of loosely coupled services into workflow applications. It discusses key challenges related to the integration of distributed systems and proposes solutions, both in terms of theoretical aspects such as models and workflow scheduling algorithms, and technical solutions such as software tools and APIs. The book provides an in-depth look at workflow scheduling and proposes a way to integrate several different types of services

  16. Decaf: Decoupled Dataflows for In Situ High-Performance Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Dreher, M.; Peterka, T.

    2017-07-31

    Decaf is a dataflow system for the parallel communication of coupled tasks in an HPC workflow. The dataflow can perform arbitrary data transformations ranging from simply forwarding data to complex data redistribution. Decaf does this by allowing the user to allocate resources and execute custom code in the dataflow. All communication through the dataflow is efficient parallel message passing over MPI. The runtime for calling tasks is entirely message-driven; Decaf executes a task when all messages for the task have been received. Such a message-driven runtime allows cyclic task dependencies in the workflow graph, for example, to enact computational steering based on the result of downstream tasks. Decaf includes a simple Python API for describing the workflow graph. This allows Decaf to stand alone as a complete workflow system, but Decaf can also be used as the dataflow layer by one or more other workflow systems to form a heterogeneous task-based computing environment. In one experiment, we couple a molecular dynamics code with a visualization tool using the FlowVR and Damaris workflow systems and Decaf for the dataflow. In another experiment, we test the coupling of a cosmology code with Voronoi tessellation and density estimation codes using MPI for the simulation, the DIY programming model for the two analysis codes, and Decaf for the dataflow. Such workflows consisting of heterogeneous software infrastructures exist because components are developed separately with different programming models and runtimes, and this is the first time that such heterogeneous coupling of diverse components has been demonstrated in situ on HPC systems.
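
    The abstract above mentions a simple Python API for describing the workflow graph. The sketch below shows, in plain Python, how such a coupled task graph with a cyclic steering edge might be declared; the helper functions, field names, and resource numbers are hypothetical stand-ins, not the actual Decaf API.

        # Illustrative sketch of declaring a coupled-task dataflow graph in
        # plain Python; cycles are allowed (steering). Helper names and fields
        # are hypothetical stand-ins, not the real Decaf API.
        workflow = {"nodes": {}, "links": []}

        def add_task(name, start_proc, nprocs, func):
            """Register a parallel task occupying nprocs MPI ranks."""
            workflow["nodes"][name] = dict(start_proc=start_proc,
                                           nprocs=nprocs, func=func)

        def add_dataflow(source, target, start_proc, nprocs):
            """Link two tasks through a dataflow with its own MPI resources."""
            workflow["links"].append(dict(source=source, target=target,
                                          start_proc=start_proc, nprocs=nprocs))

        add_task("simulation", start_proc=0,  nprocs=64, func="md_run")
        add_task("analysis",   start_proc=72, nprocs=16, func="density_estimate")
        add_dataflow("simulation", "analysis", start_proc=64, nprocs=8)
        # Cyclic edge: analysis output steers the simulation (message-driven).
        add_dataflow("analysis", "simulation", start_proc=88, nprocs=4)
        print(len(workflow["nodes"]), "tasks,", len(workflow["links"]), "links")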

  17. Multiscale cartilage biomechanics: technical challenges in realizing a high-throughput modelling and simulation workflow.

    Science.gov (United States)

    Erdemir, Ahmet; Bennetts, Craig; Davis, Sean; Reddy, Akhil; Sibole, Scott

    2015-04-06

    interpretation of the results. This study aims to summarize various strategies to address the technical challenges of post-processing-based simulations of cartilage and chondrocyte mechanics with the ultimate goal of establishing the foundations of a high-throughput multiscale analysis framework. At the joint-tissue scale, rapid development of regional models of articular contact is possible by automating the process of generating parametric representations of cartilage boundaries and depth-dependent zonal delineation with associated constitutive relationships. At the tissue-cell scale, models descriptive of multicellular and fibrillar architecture of cartilage zones can also be generated in an automated fashion. Through post-processing, scripts can extract biphasic mechanical metrics at a desired point in the cartilage to assign loading and boundary conditions to models at the lower spatial scale of cells. Cell deformation metrics can be extracted from simulation results to provide a simplified description of individual chondrocyte responses. Simulations at the tissue-cell scale can be parallelized owing to the loosely coupled nature of the feed-forward approach. Verification studies illustrated the necessity of a second-order data passing scheme between scales and evaluated the role that the microscale representative volume size plays in appropriately predicting the mechanical response of the chondrocytes. The tools summarized in this study collectively provide a framework for high-throughput exploration of cartilage biomechanics, which includes minimally supervised model generation, and prediction of multiscale biomechanical metrics across a range of spatial scales, from joint regions and cartilage zones, down to that of the chondrocytes.

  18. Research on Dynamic Binding Modeling Technology Based on Web Workflow

    Institute of Scientific and Technical Information of China (English)

    江进

    2014-01-01

    Workflow technology is one of the effective means of modeling, simulating, optimizing, and integrating business processes. In particular, Web-based dynamic workflow modeling is a powerful way to improve the interconnection of business processes across information systems and to make those processes dynamically configurable and open to optimization and reorganization. This paper first analyzes the workflow control model and then constructs a Web-based cooperative control architecture for multi-process workflows. An XML-based scheme provides the configuration framework for the initial process definitions, and the workflow engine invokes services described in the Web Services Description Language (WSDL) to achieve dynamic, multi-process binding of Web services, thereby separating the actual business process logic from the process organization logic.

  19. Phase Segmentation Methods for an Automatic Surgical Workflow Analysis

    Science.gov (United States)

    Sakurai, Ryuhei; Yamazoe, Hirotake

    2017-01-01

    In this paper, we present robust methods for automatically segmenting phases in a specified surgical workflow by using latent Dirichlet allocation (LDA) and hidden Markov model (HMM) approaches. More specifically, our goal is to output an appropriate phase label for each given time point of a surgical workflow in an operating room. The fundamental idea behind our work lies in constructing an HMM based on observed values obtained via an LDA topic model covering optical flow motion features of general working contexts, including medical staff, equipment, and materials. We obtain awareness of such working contexts by using multiple synchronized cameras to capture the surgical workflow. Further, we validate the robustness of our methods by conducting experiments involving up to 12 phases of surgical workflows, with the average length of each surgical workflow being 12.8 minutes. The maximum average accuracy achieved after applying leave-one-out cross-validation was 84.4%, which we found to be a very promising result. PMID:28408921
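
    As a toy illustration of the LDA-plus-HMM decoding step described above, the following self-contained sketch runs a hand-written Viterbi pass over per-frame topic observations. All probabilities are invented placeholders, not values learned from surgical video.

        import numpy as np

        # Minimal Viterbi decoder over LDA topic observations (a sketch;
        # the probabilities below are placeholders, not learned values).
        def viterbi(obs, pi, A, B):
            """obs: topic index per frame; pi: initial phase probs;
            A[i, j]: phase transition probs; B[i, k]: P(topic k | phase i)."""
            n_phases, T = len(pi), len(obs)
            logd = np.full((T, n_phases), -np.inf)  # best log-prob per phase
            back = np.zeros((T, n_phases), dtype=int)
            logd[0] = np.log(pi) + np.log(B[:, obs[0]])
            for t in range(1, T):
                scores = logd[t - 1][:, None] + np.log(A)  # (from, to)
                back[t] = scores.argmax(axis=0)
                logd[t] = scores.max(axis=0) + np.log(B[:, obs[t]])
            path = [int(logd[-1].argmax())]
            for t in range(T - 1, 0, -1):
                path.append(int(back[t, path[-1]]))
            return path[::-1]

        pi = np.array([0.9, 0.1])                         # two phases
        A = np.array([[0.95, 0.05], [0.05, 0.95]])        # sticky transitions
        B = np.array([[0.7, 0.2, 0.1], [0.1, 0.2, 0.7]])  # 3 LDA topics
        print(viterbi([0, 0, 1, 2, 2], pi, A, B))         # [0, 0, 0, 1, 1]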

  20. The application of workflows to digital heritage systems

    OpenAIRE

    Al-Barakati, Abdullah

    2012-01-01

    Digital heritage systems usually handle a rich and varied mix of digital objects, accompanied by complex and intersecting workflows and processes. However, they usually lack effective workflow management within their components, as evident in the lack of integrated solutions that include workflow components. There are a number of reasons for this limitation in workflow management utilization, including some technical challenges, the unique nature of each digital resource and the challenges impo...

  1. A Geometric Processing Workflow for Transforming Reality-Based 3D Models into Volumetric Meshes Suitable for FEA

    Science.gov (United States)

    Gonizzi Barsanti, S.; Guidi, G.

    2017-02-01

    Conservation of Cultural Heritage is a key issue, and structural changes and damages can influence the mechanical behaviour of artefacts and buildings. The use of Finite Element Methods (FEM) for mechanical analysis is widespread in modelling stress behaviour. The typical workflow involves the use of CAD 3D models made of Non-Uniform Rational B-Splines (NURBS) surfaces, representing the ideal shape of the object to be simulated. Nowadays, 3D documentation of CH has been widely developed through reality-based approaches, but the resulting models are not suitable for direct use in FEA: the mesh has to be converted to a volumetric one, and its density has to be reduced, since the computational complexity of an FEA grows exponentially with the number of nodes. The focus of this paper is to present a new method that aims to generate the most accurate 3D representation of a real artefact from highly accurate 3D digital models derived from reality-based techniques, maintaining the accuracy of the high-resolution polygonal models in the solid ones. The proposed approach is based on a judicious use of retopology procedures and a transformation of this model into a mathematical one made of NURBS surfaces, suitable for being processed by the volumetric meshers typically embedded in standard FEM packages. The strong simplification with little loss of consistency made possible by the retopology step maintains as much coherence as possible between the original acquired mesh and the simplified model, while creating a topology that is more favourable for the automatic NURBS conversion.

  2. The standard-based open workflow system in GeoBrain (Invited)

    Science.gov (United States)

    Di, L.; Yu, G.; Zhao, P.; Deng, M.

    2013-12-01

    GeoBrain is an Earth science Web-service system developed and operated by the Center for Spatial Information Science and Systems, George Mason University. In GeoBrain, a standard-based open workflow system has been implemented to accommodate the automated processing of geospatial data through a set of complex geo-processing functions for advanced product generation. GeoBrain models the complex geoprocessing at two levels, the conceptual and the concrete. At the conceptual level, the workflows exist in the form of data and service types defined by ontologies. The workflows at the conceptual level are called geo-processing models and are cataloged in GeoBrain as virtual product types. A conceptual workflow is instantiated into a concrete, executable workflow when a user requests a product that matches a virtual product type. Both conceptual and concrete workflows are encoded in the Business Process Execution Language (BPEL). A BPEL workflow engine, called BPELPower, has been implemented to execute the workflows for product generation. A provenance capturing service has been implemented to generate complete, ISO 19115-compliant product provenance metadata before and after the workflow execution. The generation of provenance metadata before the workflow execution allows users to examine the usability of the final product before the lengthy and expensive execution takes place. The three modes of workflow execution defined in ISO 19119, transparent, translucent, and opaque, are available in GeoBrain. A geoprocessing modeling portal has been developed to allow domain experts to develop geoprocessing models at the type level with the support of both data and service/processing ontologies. The geoprocessing models capture the knowledge of the domain experts and become the operational offerings of the products after a proper peer review of the models is conducted. Automated workflow composition has been experimented with successfully based on ontologies and artificial

  3. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  4. Automatically updating predictive modeling workflows support decision-making in drug design.

    Science.gov (United States)

    Muegge, Ingo; Bentzien, Jörg; Mukherjee, Prasenjit; Hughes, Robert O

    2016-09-01

    Using predictive models for early decision-making in drug discovery has become standard practice. We suggest that model building needs to be automated, with minimum input and low technical maintenance requirements. Models perform best when tailored to answering specific compound-optimization-related questions. If qualitative answers are required, 2-bin classification models are preferred. Integrating predictive modeling results with structural information stimulates better decision making. For in silico models supporting rapid structure-activity relationship cycles, the performance deteriorates within weeks. Frequent automated updates of predictive models ensure the best predictions. Consensus between multiple modeling approaches increases the prediction confidence. Combining qualified and nonqualified data optimally uses all available information. Dose predictions provide a holistic alternative to multiple individual property predictions for reaching complex decisions.
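
    A minimal sketch of the automated rebuild-and-consensus pattern the abstract describes, using scikit-learn. The random data stands in for descriptor features, and the model choices are placeholder assumptions, not the authors' production setup.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression

        # Sketch of a scheduled 2-bin rebuild with consensus scoring. The
        # synthetic data stands in for fingerprints and an activity cutoff.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 16))                        # placeholder descriptors
        y = (X[:, 0] + rng.normal(size=200) > 0).astype(int)  # 2-bin labels

        def retrain(X, y):
            """Rebuild all models from scratch on the latest data (run weekly)."""
            models = [RandomForestClassifier(n_estimators=100, random_state=0),
                      LogisticRegression(max_iter=1000)]
            return [m.fit(X, y) for m in models]

        def consensus(models, X):
            """Mean class-1 probability across models; low spread = agreement."""
            p = np.column_stack([m.predict_proba(X)[:, 1] for m in models])
            return p.mean(axis=1), p.std(axis=1)

        models = retrain(X, y)
        mean_p, spread = consensus(models, X[:5])
        print(np.round(mean_p, 2), np.round(spread, 2))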

  5. Understanding latent structures of clinical information logistics: A bottom-up approach for model building and validating the workflow composite score.

    Science.gov (United States)

    Esdar, Moritz; Hübner, Ursula; Liebe, Jan-David; Hüsers, Jens; Thye, Johannes

    2017-01-01

    Clinical information logistics is a construct that aims to describe and explain various phenomena of information provision to drive clinical processes. It can be measured by the workflow composite score, an aggregated indicator of the degree of IT support in clinical processes. This study primarily aimed to investigate the yet unknown empirical patterns constituting this construct. The second goal was to derive a data-driven weighting scheme for the constituents of the workflow composite score and to contrast this scheme with a literature-based, top-down procedure. This approach should finally test the validity and robustness of the workflow composite score. Based on secondary data from 183 German hospitals, a tiered factor analytic approach (confirmatory and subsequent exploratory factor analysis) was pursued. A weighting scheme, which was based on factor loadings obtained in the analyses, was put into practice. We were able to identify five statistically significant factors of clinical information logistics that accounted for 63% of the overall variance. These factors were "flow of data and information", "mobility", "clinical decision support and patient safety", "electronic patient record" and "integration and distribution". The system of weights derived from the factor loadings resulted in values for the workflow composite score that differed only slightly from the score values that had been previously published based on a top-down approach. Our findings give insight into the internal composition of clinical information logistics both in terms of factors and weights. They also allowed us to propose a coherent model of clinical information logistics from a technical perspective that joins empirical findings with theoretical knowledge. Despite the new scheme of weights applied to the calculation of the workflow composite score, the score behaved robustly, which is a further indication of its validity and therefore its usefulness.
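
    As a toy numeric illustration of the loading-based weighting idea, the sketch below normalizes invented factor loadings into item weights and aggregates binary IT-support indicators into a composite score; none of the numbers come from the study.

        import numpy as np

        # Toy illustration: derive item weights from factor loadings and
        # compute a weighted composite score. All values are invented.
        loadings = np.array([0.82, 0.74, 0.61, 0.55, 0.48])  # per-item loadings
        weights = loadings / loadings.sum()                  # normalise to 1
        item_scores = np.array([1.0, 0.0, 1.0, 1.0, 0.0])    # IT support yes/no
        composite = float(weights @ item_scores)
        print(f"workflow composite score: {composite:.2f}")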

  6. Dynamic Reusable Workflows for Ocean Science

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2016-10-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog searches and data access now make it possible to create catalog-driven workflows that automate, end to end, the search, analysis, and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused, and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enter the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased
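
    The catalog-search step described above can be sketched with OWSLib, a Python library commonly used for OGC CSW access. The endpoint URL and search term below are placeholders, and the calls reflect typical OWSLib usage rather than the workflow's actual notebooks.

        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike

        # Sketch of the catalog-search step: query a CSW endpoint for water
        # temperature records. Endpoint URL and search term are placeholders.
        csw = CatalogueServiceWeb("https://example.org/csw")  # hypothetical endpoint
        query = PropertyIsLike("csw:AnyText", "%sea_water_temperature%")
        csw.getrecords2(constraints=[query], maxrecords=10)

        for rec_id, rec in csw.records.items():
            # Each record carries service endpoints (SOS, OPeNDAP) in rec.references.
            print(rec_id, rec.title)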

  7. Dynamic reusable workflows for ocean science

    Science.gov (United States)

    Signell, Richard; Fernandez, Filipe; Wilcox, Kyle

    2016-01-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access now make it possible to create catalog-driven workflows that automate, end to end, the search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill-assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic

  8. Parametric Room Acoustic Workflows

    DEFF Research Database (Denmark)

    Parigi, Dario; Svidt, Kjeld; Molin, Erik

    2017-01-01

    The paper investigates and assesses different room acoustics software and the opportunities they offer to engage in parametric acoustics workflows and to influence architectural designs. The first step consists in the testing and benchmarking of different tools on the basis of accuracy, speed, and interoperability with Grasshopper 3D. The focus is placed on the benchmarking of three different acoustic analysis tools based on ray tracing. To compare the accuracy and speed of the acoustic evaluation across different tools, a homogeneous set of acoustic parameters is chosen. The room acoustics parameters included in the set are reverberation time (EDT, RT30), clarity (C50), loudness (G), and definition (D50). Scenarios are discussed for determining the most suitable acoustic tool at different design stages. Those scenarios are characterized by the use of less accurate but fast evaluation tools to be used…

  9. Workflow Security Access Model Based on Dynamic Control Mechanism

    Institute of Scientific and Technical Information of China (English)

    巫茜; 周庆

    2012-01-01

    To ensure that workflow systems operate securely and reliably, this paper introduces three relation elements, target case (TC), user management (UM), and target (T), into the conventional role-based access control model, designs a dynamic authorization mechanism, and builds a workflow security access model based on dynamic control. Basic constraint relations and dynamic constraint conditions guarantee the secure operation of the model, which is integrated with the workflow engine component to provide security authorization services for applications in independent security domains. Application results show that the model cleanly separates dynamic and reciprocal responsibilities, binds dynamic responsibilities, and provides technical support for secure access to system workflows.
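
    To make the dynamic-constraint idea concrete, here is a toy check of dynamic separation of duty within a target case, a sketch under assumed semantics rather than the paper's model; the task names are illustrative only.

        # Toy sketch of a dynamic constraint check in a role-based workflow
        # access model: a user who performed one task of a separated pair in
        # a target case may not perform the other. Names are illustrative.
        SEPARATED = {("approve_payment", "request_payment")}

        def may_execute(user, task, case_history):
            """case_history: list of (user, task) already executed in this case."""
            for done_user, done_task in case_history:
                if done_user != user:
                    continue
                if (task, done_task) in SEPARATED or (done_task, task) in SEPARATED:
                    return False  # dynamic separation of duty violated
            return True

        history = [("alice", "request_payment")]
        print(may_execute("alice", "approve_payment", history))  # False
        print(may_execute("bob", "approve_payment", history))    # True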

  10. Scientific workflows for bibliometrics

    NARCIS (Netherlands)

    Guler, A.T.; Waaijer, C.J.; Palmblad, M.

    2016-01-01

    Scientific workflows organize the assembly of specialized software into an overall data flow and are particularly well suited for multi-step analyses using different types of software tools. They are also favorable in terms of reusability, as previously designed workflows could be made publicly avai

  11. Optimizing Workflow Data Footprint

    Directory of Open Access Journals (Sweden)

    Gurmeet Singh

    2007-01-01

    In this paper we examine the issue of optimizing disk usage and scheduling large-scale scientific workflows onto distributed resources, where the workflows are data-intensive, requiring large amounts of data storage, and the resources have limited storage capacity. Our approach is two-fold: we minimize the amount of space a workflow requires during execution by removing data files at runtime when they are no longer needed, and we demonstrate that workflows may have to be restructured to reduce their overall data footprint. We show the results of our data management and workflow restructuring solutions using a Laser Interferometer Gravitational-Wave Observatory (LIGO) application and an astronomy application, Montage, running on a large-scale production grid, the Open Science Grid. We show that although reducing the data footprint of Montage by 48% can be achieved with dynamic data cleanup techniques, LIGO Scientific Collaboration workflows require additional restructuring to achieve a 56% reduction in data space usage. We also examine the cost of the workflow restructuring in terms of the application's runtime.
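
    The dynamic data cleanup technique described above can be sketched as simple reference counting over file consumers; the task and file names below are invented for the example.

        from collections import defaultdict

        # Sketch of runtime data cleanup by reference counting: when the last
        # consumer of a file finishes, the file can be deleted.
        consumers = defaultdict(set)          # file -> tasks that still need it
        consumers["raw.fits"] = {"calibrate", "plot"}
        consumers["cal.fits"] = {"mosaic"}

        def task_finished(task):
            """Release the task's inputs; report files that may now be removed."""
            removable = []
            for fname, tasks in list(consumers.items()):
                tasks.discard(task)
                if not tasks:
                    removable.append(fname)
                    del consumers[fname]
            return removable

        print(task_finished("calibrate"))  # []   ("plot" still needs raw.fits)
        print(task_finished("plot"))       # ['raw.fits']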

  12. From Workflow to Interworkflow

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Workflow management systems are being introduced in many organizations to automate the business process. The initial emphasis of introducing a workflow management system is on its application to the workflow in a given organization. The next step is to interconnect the workflow across organizations. We call it interworkflow, and the total support technologies, which are necessary for its realization, the interworkflow management mechanism. Interworkflow is expected as a supporting mechanism for business-to-business electronic commerce. We had proposed this management mechanism and confirmed its realization with the prototype. At the same time, the interface and the protocol for interconnecting heterogeneous workflow management systems have been standardized by the WfMC. So, we advance the project of the implementation of an interworkflow management system for practical use and its experimental proof.

  13. Benchmarking ETL Workflows

    Science.gov (United States)

    Simitsis, Alkis; Vassiliadis, Panos; Dayal, Umeshwar; Karagiannis, Anastasios; Tziovara, Vasiliki

    Extraction-Transform-Load (ETL) processes comprise complex data workflows, which are responsible for the maintenance of a Data Warehouse. A plethora of ETL tools is currently available, constituting a multi-million dollar market. Each ETL tool uses its own technique for the design and implementation of an ETL workflow, making the task of assessing ETL tools extremely difficult. In this paper, we identify common characteristics of ETL workflows in an effort to propose a unified evaluation method for ETL. We also identify the main points of interest in designing, implementing, and maintaining ETL workflows. Finally, we propose a principled organization of test suites based on the TPC-H schema for the problem of experimenting with ETL workflows.

  14. Toward Design, Modelling and Analysis of Dynamic Workflow Reconfigurations - A Process Algebra Perspective

    DEFF Research Database (Denmark)

    Mazzara, M.; Abouzaid, F.; Dragoni, Nicola

    2011-01-01

    This paper describes a case study involving the dynamic reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system's design in BPMN. We then use an asynchronous π-calculus and webπ∞ to model the design and to verify…

  15. Process capture and modeling via workflow for integrated human-based automated command and control processes

    Science.gov (United States)

    Green, David; Dunaway, Brad; Reaper, Jerome

    2005-05-01

    The Virtual Testbed for Advanced Command and Control Concepts (VTAC) program is performing research and development efforts leading to the creation of a testbed for new Command and Control (C2) processes, subprocesses and embedded automated systems and subsystems. This testbed will initially support the capture and modeling of existing C2 processes/subprocesses. Once these have been modeled at proper levels of abstraction, proposed revisions or replacements to processes, systems and subsystems can be evaluated within a virtual workspace that integrates human operators and automated systems in the context of a larger C2 process. By utilizing such a testbed early in the development cycle, expected improvements resulting from specific revisions or replacements can be quantitatively established. Crossover effects resulting from changes to one or more interrelated processes can also be measured. Quantified measures of improvement can then be provided to decision makers for use in cost-to-performance benefits analysis prior to implementing proposed revisions, replacements, or a sequence of planned enhancements. This paper first presents a high-level view of the VTAC project, followed by a discussion of an example C2 process that was captured, abstracted, and modeled. The abstraction approach, model implementation, and simulation results are covered in detail.

  16. Model-based workflows for optimal long-term reservoir management

    NARCIS (Netherlands)

    Leeuwenburgh, O.; Egberts, P.; Chitu, A.; Wilschut, F.

    2014-01-01

    Life-cycle optimization is the process of finding field operation strategies that aim to optimize recovery or economic value with a long-term (years to decades) horizon. A reservoir simulation model is therefore generally appropriate and sufficient to explore the impact of different recovery

  17. A workflow for mathematical modeling of subcellular metabolic pathways in leaf metabolism of Arabidopsis thaliana

    Directory of Open Access Journals (Sweden)

    Thomas Nägele

    2013-12-01

    During the last decade genome sequencing has experienced a rapid technological development, resulting in numerous sequencing projects and applications in life science. In plant molecular biology, the availability of sequence data on whole genomes has enabled the reconstruction of metabolic networks. Enzymatic reactions are predicted from the sequence information, and pathways arise due to the participation of chemical compounds as substrates and products in these reactions. Although several of these comprehensive networks have been reconstructed for the genetic model plant Arabidopsis thaliana, the integration of experimental data is still challenging. In particular, the analysis of the subcellular organization of plant cells limits the understanding of regulatory instances in these metabolic networks in vivo. In this study, we develop an approach for the functional integration of experimental high-throughput data into such large-scale networks. We present a subcellular metabolic network model comprising 524 metabolic intermediates and 548 metabolic interactions derived from a total of 2769 reactions. We demonstrate how to link the metabolite covariance matrix of different Arabidopsis thaliana accessions with the subcellular metabolic network model for the inverse calculation of the biochemical Jacobian, finally resulting in the calculation of a matrix which satisfies a Lyapunov equation involving a covariance matrix. In this way, differential strategies of metabolite compartmentation and the reactions involved were identified in the accessions when exposed to low temperature.
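
    For readers unfamiliar with the covariance-Jacobian link mentioned above, the sketch below checks the standard Lyapunov relation J C + C J^T + 2D = 0 for an invented two-metabolite system using SciPy. The paper itself solves the harder inverse problem for J, which is not reproduced here.

        import numpy as np
        from scipy.linalg import solve_continuous_lyapunov

        # For a linearised system near steady state, the metabolite covariance
        # C satisfies J C + C J^T + 2 D = 0 (D: fluctuation matrix). Values
        # below are invented; this is the forward solve, not the inversion.
        J = np.array([[-1.0, 0.3],
                      [0.2, -0.8]])        # hypothetical biochemical Jacobian
        D = 0.1 * np.eye(2)                # hypothetical fluctuation strengths
        C = solve_continuous_lyapunov(J, -2.0 * D)     # covariance matrix
        print(np.allclose(J @ C + C @ J.T, -2.0 * D))  # True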

  18. Research on a Web Services Composition Model Based on Workflow Templates

    Institute of Scientific and Technical Information of China (English)

    李顺新; 凌海洋; 江南

    2009-01-01

    Web services composition is an important research field in service applications. Exploiting the similarity between workflows and Web services composition, a new Web services composition model based on workflow templates is proposed. In this model, workflows and Web services can be matched more accurately by taking advantage of functional semantics, and agent technology is used to execute the composition flow. Finally, a publishing algorithm publishes the template flows and composed services to the registry.

  19. dfnWorks: A HPC Workflow for Discrete Fracture Network Modeling with Subsurface Flow and Transport Applications

    Science.gov (United States)

    Gable, C. W.; Hyman, J.; Karra, S.; Makedonska, N.; Painter, S. L.; Viswanathan, H. S.

    2015-12-01

    dfnWorks generates discrete fracture networks (DFN) of planar polygons, creates a high-quality conforming Delaunay triangulation of the intersecting DFN polygons, assigns properties (aperture, permeability) using geostatistics, sets boundary and initial conditions, solves pressure/flow in single or multi-phase fluids (water, air, CO2) using the parallel PFLOTRAN or serial FEHM, and solves for transport using Lagrangian particle tracking. We outline the dfnWorks workflow and present applications from a range of fractured rock systems. dfnWorks (http://www.lanl.gov/expertise/teams/view/dfnworks) is composed of three main components, all of which are freely available. dfnGen generates a distribution of fracture polygons from site characterization data (statistics or deterministic fractures) and utilizes the FRAM (Feature Rejection Algorithm for Meshing) to guarantee that the mesh generation package LaGriT (lagrit.lanl.gov) will generate a high-quality conforming Delaunay triangular mesh. dfnWorks links the mesh to either PFLOTRAN (pflotran.org) or FEHM (fehm.lanl.gov) for solving flow and transport. The various physics options available in FEHM and PFLOTRAN, such as single and multi-phase flow and reactive transport, are all available with appropriate initial and boundary conditions and material property models. dfnTrans utilizes explicit Lagrangian particle tracking on the DFN using a velocity field reconstructed from the steady state pressure/flow field solution obtained in PFLOTRAN or FEHM. Applications are demonstrated for a nuclear waste repository in fractured granite, CO2 sequestration, and extraction of unconventional hydrocarbon resources.

  20. Research on an SOA-Based BPO Workflow Model and Its Practical Application

    Institute of Scientific and Technical Information of China (English)

    秦凤梅; 张桂华; 邱玉辉

    2013-01-01

    With the development of enterprise information technology, enterprises pay more and more attention to their core business and outsource non-core business to specialized companies; business process outsourcing (BPO) has arisen as a result. Starting from an introduction to BPO workflows, this article analyzes the structural characteristics of current BPO workflows and, based on a service-oriented architecture (SOA), describes the methods and steps for encapsulating BPO business logic as services using a service-to-workflow mapping model. A workflow model for an SOA-based BPO complaint management system is then designed, characterized by loose coupling, extensibility, reusability, and ease of maintenance.

  1. Scientific workflows for bibliometrics.

    Science.gov (United States)

    Guler, Arzu Tugce; Waaijer, Cathelijn J F; Palmblad, Magnus

    Scientific workflows organize the assembly of specialized software into an overall data flow and are particularly well suited for multi-step analyses using different types of software tools. They are also favorable in terms of reusability, as previously designed workflows could be made publicly available through the myExperiment community and then used in other workflows. We here illustrate how scientific workflows, and the Taverna workbench in particular, can be used in bibliometrics. We discuss the specific capabilities of Taverna that make this software a powerful tool in this field, such as automated data import via Web services, data extraction from XML by XPaths, and statistical analysis and visualization with R. The support of the latter is particularly relevant, as it allows integration of a number of recently developed R packages specifically for bibliometrics. Examples are used to illustrate the possibilities of Taverna in the fields of bibliometrics and scientometrics.

  2. It's All About the Data: Workflow Systems and Weather

    Science.gov (United States)

    Plale, B.

    2009-05-01

    Digital data is fueling new advances in the computational sciences, particularly geospatial research, as environmental sensing grows more practical through reduced technology costs, broader network coverage, and better instruments. e-Science research (i.e., cyberinfrastructure research) has responded to data-intensive computing with tools, systems, and frameworks that support computationally oriented activities such as modeling, analysis, and data mining. Workflow systems support execution of sequences of tasks on behalf of a scientist. These systems, such as Taverna, Apache ODE, and Kepler, when built as part of a larger cyberinfrastructure framework, give the scientist tools to construct task graphs of execution sequences, often through a visual interface for connecting task boxes together with arcs representing control flow or data flow. Unlike business processing workflows, scientific workflows expose a high degree of detail and control during configuration and execution. Data-driven science imposes unique needs on workflow frameworks. Our research is focused on two issues. The first is the support for workflow-driven analysis over all kinds of data sets, including real-time streaming data and locally owned and hosted data. The second is the essential role metadata/provenance collection plays in data-driven science, for discovery, determining quality, for science reproducibility, and for long-term preservation. The research has been conducted over the last 6 years in the context of cyberinfrastructure for mesoscale weather research carried out as part of the Linked Environments for Atmospheric Discovery (LEAD) project. LEAD has pioneered new approaches for integrating complex weather data, assimilation, modeling, mining, and cyberinfrastructure systems. Workflow systems have the potential to generate huge volumes of data. Without some form of automated metadata capture, either metadata description becomes largely a manual task that is difficult if not impossible

  3. Structured Composition of Dataflow and Control-Flow for Reusable and Robust Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Bowers, S; Ludaescher, B; Ngu, A; Critchlow, T

    2005-09-07

    Data-centric scientific workflows are often modeled as dataflow process networks. The simplicity of the dataflow framework facilitates workflow design, analysis, and optimization. However, some workflow tasks are particularly "control-flow intensive", e.g., procedures to make workflows more fault-tolerant and adaptive in an unreliable, distributed computing environment. Modeling complex control-flow directly within a dataflow framework often leads to overly complicated workflows that are hard to comprehend, reuse, schedule, and maintain. In this paper, we develop a framework that allows a structured embedding of control-flow intensive subtasks within dataflow process networks. In this way, we can seamlessly handle complex control-flows without sacrificing the benefits of dataflow. We build upon a flexible actor-oriented modeling and design approach and extend it with (actor) frames and (workflow) templates. A frame is a placeholder for an (existing or planned) collection of components with similar function and signature. A template partially specifies the behavior of a subworkflow by leaving "holes" (i.e., frames) in the subworkflow definition. Taken together, these abstraction mechanisms facilitate the separation and structured re-combination of control-flow and dataflow in scientific workflow applications. We illustrate our approach with a real-world scientific workflow from the astrophysics domain. This data-intensive workflow requires remote execution and file transfer in a semi-reliable environment. For such workflows, we propose a 3-layered architecture: the top level, typically a dataflow process network, includes Generic Data Transfer (GDT) frames and Generic remote eXecution (GX) frames. At the second level, the user can specialize the behavior of these generic components by embedding a suitable template (here: transducer templates for control-flow intensive tasks). At the third level, frames inside the

  4. Climate Data Analytics Workflow Management

    Science.gov (United States)

    Zhang, J.; Lee, S.; Pan, L.; Mattmann, C. A.; Lee, T. J.

    2016-12-01

    In this project we aim to pave a novel path toward a sustainable building block for Earth science big data analytics and knowledge sharing. Closely studying how Earth scientists conduct data analytics research in their daily work, we have developed a provenance model to record their activities, and a technology to automatically generate workflows for scientists from the provenance. On top of this, we have built a prototype of a data-centric provenance repository, and established a PDSW (People, Data, Service, Workflow) knowledge network to support workflow recommendation. To ensure the scalability and performance of the expected recommendation system, we have leveraged the Apache OODT system technology. The community-approved, metrics-based performance evaluation web service will allow a user to select a metric from the list of several community-approved metrics and to evaluate model performance using the metric as well as the reference dataset. This service will facilitate the use of reference datasets that are generated in support of model-data intercomparison projects such as Obs4MIPs and Ana4MIPs. The data-centric repository infrastructure will allow us to capture richer provenance to further facilitate knowledge sharing and scientific collaboration in the Earth science community. This project is part of the Apache incubator CMDA project.

  5. Flexible workflow sharing and execution services for e-scientists

    Science.gov (United States)

    Kacsuk, Péter; Terstyanszky, Gábor; Kiss, Tamas; Sipos, Gergely

    2013-04-01

    The sequence of computational and data manipulation steps required to perform a specific scientific analysis is called a workflow. Workflows that orchestrate data and/or compute intensive applications on Distributed Computing Infrastructures (DCIs) recently became standard tools in e-science. At the same time, the broad and fragmented landscape of workflows and DCIs slows down the uptake of workflow-based work. The development, sharing, integration and execution of workflows is still a challenge for many scientists. The FP7 "Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs" (SHIWA) project significantly improved the situation with a simulation platform that connects different workflow systems, different workflow languages, different DCIs and workflows into a single, interoperable unit. The SHIWA Simulation Platform is a service package, already used by various scientific communities, and used as a tool by the recently started ER-flow FP7 project to expand the use of workflows among European scientists. The presentation will introduce the SHIWA Simulation Platform and the services that ER-flow provides, based on the platform, to space and earth science researchers. The SHIWA Simulation Platform includes: 1. SHIWA Repository: a database where workflows and metadata about workflows can be stored; the database is a central repository for discovering and sharing workflows within and among communities. 2. SHIWA Portal: a web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on various grid and cloud platforms. 3. SHIWA Desktop: a desktop environment that provides similar access capabilities to the SHIWA Portal, but runs on the users' desktops/laptops instead of a portal server. 4. Workflow engines: the ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflow engines are already

  6. A standard-enabled workflow for synthetic biology.

    Science.gov (United States)

    Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce these types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in its publications.

  7. A workflow for transferring heterogeneous complex geological models to consistent finite element models and application to a deep geothermal reservoir operation

    Science.gov (United States)

    Wang, Bo; Bauer, Sebastian

    2016-04-01

    Geological models are a prerequisite for exploring possible uses of the subsurface and evaluating induced impacts. Subsurface geological models often show strong complexity in geometry and hydraulic connectivity because of their heterogeneous nature. In order to model that complexity, the corner point grid approach has been applied by geologists for decades. The corner point grid utilizes a set of hexahedral blocks to represent geological formations. Due to the presence of eroded geological layers, some edges of those blocks may be collapsed and the blocks thus degenerate. This leads to inconsistencies and makes it impossible to use the corner point grid directly with a finite-element-based simulator. Therefore, in this study, we introduce a workflow for transferring heterogeneous geological models to consistent finite element models. In the corner point grid, the hexahedral blocks without collapsed edges are converted to hexahedral elements directly. If they degenerate, each block is divided into prism, pyramid and tetrahedral elements according to its individual degeneracy, as sketched below. This approach consistently converts any degenerated corner point grid to a consistent hybrid finite element mesh. Along with the above converting scheme, the corresponding heterogeneous geological data, e.g. permeability and porosity, can be transferred as well. Moreover, well trajectories designed in the corner point grid can be resampled to the nodes in the finite element mesh, which represent the locations for source terms along the well path. As a proof of concept, we implement the workflow in the framework of transferring models from Petrel to the finite element OpenGeoSys simulator. As an application scenario we choose a deep geothermal reservoir operation in the North German Basin. A well doublet is defined in a saline aquifer in the Rhaetian formation, which has a depth of roughly 4000 m. The geometric model shows all kinds of degenerated blocks due to eroded layers and the
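
    A simplified sketch of the degeneracy classification mentioned above: count the distinct corner nodes of a cell after collapsing degenerate edges and map the count to an element type. The real workflow handles each degenerate configuration individually; the mapping here is a coarse illustration.

        # Sketch of the element-splitting decision: classify a corner-point
        # cell by its number of distinct corner nodes after edge collapse.
        # Configurations outside this mapping need explicit subdivision.
        def classify_cell(corner_nodes):
            """corner_nodes: 8 node ids, repeated where edges are collapsed."""
            unique = len(set(corner_nodes))
            return {8: "hexahedron",
                    6: "prism",
                    5: "pyramid",
                    4: "tetrahedron"}.get(unique, "needs subdivision")

        print(classify_cell([0, 1, 2, 3, 4, 5, 6, 7]))  # hexahedron
        print(classify_cell([0, 1, 2, 3, 4, 4, 5, 5]))  # prism (two collapsed edges)
        print(classify_cell([0, 1, 2, 3, 4, 4, 4, 4]))  # pyramid (top face collapsed)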

  8. The Workflow Management Specification Overview

    Institute of Scientific and Technical Information of China (English)

    陈畅; 吴朝晖

    2000-01-01

    Workflow management is a fast-evolving technology, and many software vendors have WFM products available in the market today. To enable interoperability between heterogeneous workflow products and to improve the integration of workflow applications with other IT services, it is necessary to work out common specifications. The purpose of this paper is to provide a framework for the specifications developed by the WFM Coalition for implementation in workflow products. It provides a common "Reference Model" for workflow management systems.

  9. Dynamic hysteresis modeling including skin effect using diffusion equation model

    Science.gov (United States)

    Hamada, Souad; Louai, Fatima Zohra; Nait-Said, Nasreddine; Benabou, Abdelkader

    2016-07-01

    An improved dynamic hysteresis model is proposed for the prediction of hysteresis loop of electrical steel up to mean frequencies, taking into account the skin effect. In previous works, the analytical solution of the diffusion equation for low frequency (DELF) was coupled with the inverse static Jiles-Atherton (JA) model in order to represent the hysteresis behavior for a lamination. In the present paper, this approach is improved to ensure the reproducibility of measured hysteresis loops at mean frequency. The results of simulation are compared with the experimental ones. The selected results for frequencies 50 Hz, 100 Hz, 200 Hz and 400 Hz are presented and discussed.
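
    For reference, the low-frequency eddy-current diffusion problem underlying such models is typically stated as below (σ: electrical conductivity, z: position across the lamination thickness), with the static Jiles-Atherton law closing the system; this is the textbook formulation, not the paper's specific DELF solution.

        \frac{\partial^2 H(z,t)}{\partial z^2} \;=\; \sigma\,\frac{\partial B(z,t)}{\partial t},
        \qquad B(z,t) = f_{\mathrm{JA}}\!\left(H(z,t)\right)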

  10. Dynamic hysteresis modeling including skin effect using diffusion equation model

    Energy Technology Data Exchange (ETDEWEB)

    Hamada, Souad, E-mail: souadhamada@yahoo.fr [LSP-IE: Research Laboratory, Electrical Engineering Department, University of Batna, 05000 Batna (Algeria); Louai, Fatima Zohra, E-mail: fz_louai@yahoo.com [LSP-IE: Research Laboratory, Electrical Engineering Department, University of Batna, 05000 Batna (Algeria); Nait-Said, Nasreddine, E-mail: n_naitsaid@yahoo.com [LSP-IE: Research Laboratory, Electrical Engineering Department, University of Batna, 05000 Batna (Algeria); Benabou, Abdelkader, E-mail: Abdelkader.Benabou@univ-lille1.fr [L2EP, Université de Lille1, 59655 Villeneuve d’Ascq (France)

    2016-07-15

    An improved dynamic hysteresis model is proposed for the prediction of hysteresis loop of electrical steel up to mean frequencies, taking into account the skin effect. In previous works, the analytical solution of the diffusion equation for low frequency (DELF) was coupled with the inverse static Jiles-Atherton (JA) model in order to represent the hysteresis behavior for a lamination. In the present paper, this approach is improved to ensure the reproducibility of measured hysteresis loops at mean frequency. The results of simulation are compared with the experimental ones. The selected results for frequencies 50 Hz, 100 Hz, 200 Hz and 400 Hz are presented and discussed.

  11. Unsteady panel method for complex configurations including wake modeling

    CSIR Research Space (South Africa)

    Van Zyl, Lourens H

    2008-01-01

    Implementations of the DLM are, however, not very versatile in terms of the geometries that can be modeled. The ZONA6 code offers a versatile surface panel body model including a separated wake model, but uses a pressure panel method for lifting surfaces. This paper...

  12. Modeling Electric Double-Layers Including Chemical Reaction Effects

    DEFF Research Database (Denmark)

    Paz-Garcia, Juan Manuel; Johannesson, Björn; Ottosen, Lisbeth M.

    2014-01-01

    A physicochemical and numerical model for the transient formation of an electric double-layer between an electrolyte and a chemically-active flat surface is presented, based on a finite element integration of the nonlinear Nernst-Planck-Poisson model including chemical reactions. The model works...

  14. Circuit Modeling of a MEMS Varactor Including Dielectric Charging Dynamics

    Science.gov (United States)

    Giounanlis, P.; Andrade-Miceli, D.; Gorreta, S.; Pons-Nin, J.; Dominguez-Pumar, M.; Blokhina, E.

    2016-10-01

    Electrical models for MEMS varactors including the effect of dielectric charging dynamics are not available in commercial circuit simulators. In this paper, a circuit model using lumped ideal elements available in the Cadence libraries and a basic Verilog-A model has been implemented. The model has been used to simulate the dielectric charging as a function of time and its effect on the MEMS capacitance value.
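
    A toy sketch of the kind of first-order dielectric-charging dynamics such circuit models capture: trapped charge relaxes toward a bias-dependent steady state, shifting the C-V characteristic over time. The model form, time constant, and coefficient are illustrative assumptions, not the paper's Verilog-A implementation.

        import numpy as np

        # Sketch of first-order dielectric charging under a DC bias: trapped
        # charge relaxes toward a bias-dependent steady state and shifts the
        # varactor's C-V curve. Model form and numbers are illustrative only.
        def trapped_charge(v_bias, t, tau=50.0, k=1e-12):
            """Q(t) = Q_ss(V) * (1 - exp(-t/tau)), with Q_ss = k * V."""
            return k * v_bias * (1.0 - np.exp(-t / tau))

        t = np.linspace(0.0, 300.0, 7)          # seconds under 30 V stress
        print(trapped_charge(30.0, t))          # monotonic charge build-up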

  15. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

    Long-term energy market models can be used to examine investments in production technologies; however, with market liberalisation it is crucial that such models include investment risks and investor behaviour. This paper analyses how the effect of investment risk on production technology selection can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach we examined how uncertainty in demand and variable costs affects the optimal choice…
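
    The iterative interaction between the equilibrium model and the risk-adjustment module can be sketched as a fixed-point loop; both model functions below are hypothetical stand-ins for the actual modules.

        # Sketch of an iterative coupling between a market equilibrium model
        # and a risk-adjustment module; both callables are hypothetical.
        def iterate_risk_adjusted_equilibrium(solve_equilibrium, adjust_for_risk,
                                              costs, tol=1e-6, max_iter=100):
            """Alternate equilibrium solves and risk premia until costs converge."""
            for _ in range(max_iter):
                investments = solve_equilibrium(costs)           # partial equilibrium
                new_costs = adjust_for_risk(costs, investments)  # add risk premia
                if max(abs(new_costs[k] - costs[k]) for k in costs) < tol:
                    return investments, new_costs
                costs = new_costs
            raise RuntimeError("no fixed point within max_iter")

        # Demo with trivial stand-ins: capacity inversely proportional to cost;
        # the risk module adds a premium on the risky technology.
        solve = lambda c: {k: 100.0 / v for k, v in c.items()}
        adjust = lambda c, inv: {k: (12.0 if k == "gas" else c[k]) for k in c}
        print(iterate_risk_adjusted_equilibrium(solve, adjust,
                                                {"gas": 10.0, "wind": 15.0}))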

  16. Tavaxy: integrating Taverna and Galaxy workflows with cloud computing support.

    Science.gov (United States)

    Abouelhoda, Mohamed; Issa, Shadi Alaa; Ghanem, Moustafa

    2012-05-04

    Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: it allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis. The system can be accessed either through a

  17. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Directory of Open Access Journals (Sweden)

    Abouelhoda Mohamed

    2012-05-01

    Background: Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results: In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: it allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions: Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and

  18. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers the opportunity to scale out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high-performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
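    A back-of-the-envelope version of such a blackbox estimate, using only length (tasks on the critical path), width (maximum parallelism) and data size, might look like the sketch below. All platform numbers are invented for illustration and are not from the paper.

```python
# Hedged sketch of a blackbox cost-benefit estimate: only workflow length,
# width and data size feed a coarse time/cost comparison across platforms.
def blackbox_estimate(length, width, data_gb, platform):
    p = {
        # (cores, secs/task, $/core-hour, MB/s transfer, queue delay s)
        "desktop": (4,    60, 0.00,   0,    0),   # data assumed local
        "cluster": (64,   60, 0.00,  50,    0),
        "hpc":     (1024, 45, 0.00, 100, 3600),   # shared queue delay
        "cloud":   (256,  75, 0.09,  20,  120),   # slower I/O, pay per use
    }[platform]
    cores, secs, dollars, mbps, queue = p
    waves = -(-width // cores)                 # ceil: task waves per level
    compute = length * waves * secs + queue
    transfer = (data_gb * 1024 / mbps) if mbps else 0.0
    runtime = compute + transfer               # seconds
    cost = dollars * min(width, cores) * runtime / 3600
    return runtime, cost

for name in ("desktop", "cluster", "hpc", "cloud"):
    t, c = blackbox_estimate(length=10, width=200, data_gb=50, platform=name)
    print(f"{name:8s} runtime={t/3600:6.2f} h  cost=${c:7.2f}")
```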

  19. Pegasus Workflow Management System: Helping Applications From Earth and Space

    Science.gov (United States)

    Mehta, G.; Deelman, E.; Vahi, K.; Silva, F.

    2010-12-01

    Pegasus WMS is a Workflow Management System that can manage large-scale scientific workflows across Grid, local and Cloud resources simultaneously. Pegasus WMS provides a means for representing the workflow of an application in an abstract XML form, agnostic of the resources available to run it and the location of data and executables. It then compiles these workflows into concrete plans by querying catalogs and farming computations across local and distributed computing resources, as well as emerging commercial and community cloud environments, in an easy and reliable manner. Pegasus WMS optimizes execution as well as data movement by leveraging existing Grid and cloud technologies via a flexible pluggable interface, and provides advanced features like reusing existing data, automatic cleanup of generated data, and recursive workflows with deferred planning. It also captures all the provenance of the workflow from the planning stage to the execution of the generated data, helping scientists to accurately measure performance metrics of their workflow as well as data reproducibility issues. Pegasus WMS was initially developed as part of the GriPhyN project to support large-scale high-energy physics and astrophysics experiments. Direct funding from the NSF enabled support for a wide variety of applications from diverse domains including earthquake simulation, bacterial RNA studies, helioseismology and ocean modeling. Earthquake Simulation: Pegasus WMS was recently used in a large-scale production run in 2009 by the Southern California Earthquake Center to run 192 million loosely coupled tasks and about 2000 tightly coupled MPI-style tasks on national cyberinfrastructure for generating a probabilistic seismic hazard map of the Southern California region. SCEC ran 223 workflows over a period of eight weeks, using on average 4,420 cores, with a peak of 14,540 cores. A total of 192 million files were produced totaling about 165TB, out of which 11TB of data was saved.
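    The abstract, resource-agnostic workflow representation described above can be pictured as a small DAG of jobs plus file dependencies. The snippet below builds and serializes such a description; the XML vocabulary here is an illustrative stand-in that mimics the spirit of an abstract workflow but does not follow Pegasus's actual DAX schema.

```python
# Illustrative only: a resource-agnostic workflow as abstract jobs plus
# file dependencies, serialized to XML. Element and attribute names are
# invented; a planner would later bind them to concrete sites and paths.
import xml.etree.ElementTree as ET

dag = ET.Element("workflow", name="hazard-map")
jobs = [
    ("j1", "extract",  [],                 ["seismogram.dat"]),
    ("j2", "simulate", ["seismogram.dat"], ["hazard.grid"]),
    ("j3", "plot",     ["hazard.grid"],    ["map.png"]),
]
for jid, transform, uses_in, uses_out in jobs:
    job = ET.SubElement(dag, "job", id=jid, transformation=transform)
    for f in uses_in:
        ET.SubElement(job, "uses", file=f, link="input")
    for f in uses_out:
        ET.SubElement(job, "uses", file=f, link="output")

print(ET.tostring(dag, encoding="unicode"))
```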

  20. A Model of a Workflow-based Telecoms Service Development System

    Institute of Scientific and Technical Information of China (English)

    丁林花

    2009-01-01

    With the development of Web service technology, Web services have become the universal technology for integrating distributed, heterogeneous applications across the PSTN, mobile networks and the Internet. However, workflow-based Web service composition languages do not separate business rules from the process, and they cannot abstract and encapsulate the uncertain, dynamic parts of a process. This paper therefore proposes a service development system model that applies a rule engine within workflow systems, providing telecoms service providers with a fast and effective way to develop and generate new services.

  1. CaGrid Workflow Toolkit: A taverna based workflow tool for cancer grid

    Directory of Open Access Journals (Sweden)

    Sulakhe Dinanath

    2010-11-01

    Background In the biological and medical domain, the use of web services has made data and computation functionality accessible in a unified manner, which has helped automate data pipelines that were previously operated manually. Workflow technology is widely used in the orchestration of multiple services to facilitate in-silico research. The Cancer Biomedical Informatics Grid (caBIG) is an information network enabling the sharing of cancer research related resources, and caGrid is its underlying service-based computation infrastructure. caBIG requires that services are composed and orchestrated in a given sequence to realize data pipelines, which are often called scientific workflows. Results caGrid selected Taverna as its workflow execution system of choice due to its integration with web service technology, its support for a wide range of web services, and its plug-in architecture catering for easy integration of third-party extensions. The caGrid Workflow Toolkit (or the toolkit for short), an extension to the Taverna workflow system, is designed and implemented to ease building and running caGrid workflows. It provides users with support for various phases of using workflows: service discovery, composition and orchestration, data access, and secure service invocation, which have been identified by the caGrid community as challenging in a multi-institutional and cross-discipline domain. Conclusions By extending the Taverna Workbench, the caGrid Workflow Toolkit provides a comprehensive solution to compose and coordinate services in caGrid, which would otherwise remain isolated and disconnected from each other. Using it, users can access more than 140 services and are offered a rich set of features including discovery of data and analytical services, query and transfer of data, security protections for service invocations, state management in service interactions, and sharing of workflows, experiences and best practices. The proposed solution is

  2. CaGrid Workflow Toolkit: a Taverna based workflow tool for cancer grid.

    Science.gov (United States)

    Tan, Wei; Madduri, Ravi; Nenadic, Alexandra; Soiland-Reyes, Stian; Sulakhe, Dinanath; Foster, Ian; Goble, Carole A

    2010-11-02

    In the biological and medical domain, the use of web services has made data and computation functionality accessible in a unified manner, which has helped automate data pipelines that were previously operated manually. Workflow technology is widely used in the orchestration of multiple services to facilitate in-silico research. The Cancer Biomedical Informatics Grid (caBIG) is an information network enabling the sharing of cancer research related resources, and caGrid is its underlying service-based computation infrastructure. caBIG requires that services are composed and orchestrated in a given sequence to realize data pipelines, which are often called scientific workflows. caGrid selected Taverna as its workflow execution system of choice due to its integration with web service technology, its support for a wide range of web services, and its plug-in architecture catering for easy integration of third-party extensions. The caGrid Workflow Toolkit (or the toolkit for short), an extension to the Taverna workflow system, is designed and implemented to ease building and running caGrid workflows. It provides users with support for various phases of using workflows: service discovery, composition and orchestration, data access, and secure service invocation, which have been identified by the caGrid community as challenging in a multi-institutional and cross-discipline domain. By extending the Taverna Workbench, the caGrid Workflow Toolkit provides a comprehensive solution to compose and coordinate services in caGrid, which would otherwise remain isolated and disconnected from each other. Using it, users can access more than 140 services and are offered a rich set of features including discovery of data and analytical services, query and transfer of data, security protections for service invocations, state management in service interactions, and sharing of workflows, experiences and best practices. The proposed solution is general enough to be applicable and reusable within other

  3. Wildfire: distributed, Grid-enabled workflow construction and execution

    Directory of Open Access Journals (Sweden)

    Issac Praveen

    2005-03-01

    Background We observe two trends in bioinformatics: (i) analyses are increasing in complexity, often requiring several applications to be run as a workflow; and (ii) multiple CPU clusters and Grids are available to more scientists. The traditional solution to the problem of running workflows across multiple CPUs required programming, often in a scripting language such as Perl. Programming places such solutions beyond the reach of many bioinformatics consumers. Results We present Wildfire, a graphical user interface for constructing and running workflows. Wildfire borrows user interface features from Jemboss and adds a drag-and-drop interface allowing the user to compose EMBOSS (and other) programs into workflows. For execution, Wildfire uses GEL, the underlying workflow execution engine, which can exploit available parallelism on multi-CPU machines including Beowulf-class clusters and Grids. Conclusion Wildfire simplifies the tasks of constructing and executing bioinformatics workflows.

  4. Workflows for Full Waveform Inversions

    Science.gov (United States)

    Boehm, Christian; Krischer, Lion; Afanasiev, Michael; van Driel, Martin; May, Dave A.; Rietmann, Max; Fichtner, Andreas

    2017-04-01

    Despite many theoretical advances and the increasing availability of high-performance computing clusters, full seismic waveform inversions still face considerable challenges regarding data and workflow management. While the community has access to solvers which can harness modern heterogeneous computing architectures, the computational bottleneck has fallen to these often manpower-bounded issues that need to be overcome to facilitate further progress. Modern inversions involve huge amounts of data and require a tight integration between numerical PDE solvers, data acquisition and processing systems, nonlinear optimization libraries, and job orchestration frameworks. To this end we created a set of libraries and applications revolving around Salvus (http://salvus.io), a novel software package designed to solve large-scale full waveform inverse problems. This presentation focuses on solving passive source seismic full waveform inversions from local to global scales with Salvus. We discuss (i) design choices for the aforementioned components required for full waveform modeling and inversion, (ii) their implementation in the Salvus framework, and (iii) how it is all tied together by a usable workflow system. We combine state-of-the-art algorithms ranging from high-order finite-element solutions of the wave equation to quasi-Newton optimization algorithms using trust-region methods that can handle inexact derivatives. All is steered by an automated interactive graph-based workflow framework capable of orchestrating all necessary pieces. This naturally facilitates the creation of new Earth models and hopefully sparks new scientific insights. Additionally, and even more importantly, it enhances reproducibility and reliability of the final results.

  5. Insightful Workflow For Grid Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Charles Earl

    2008-10-09

    We developed a workflow adaptation and scheduling system for Grid workflows. The system currently interfaces with and uses the Karajan workflow system. We developed machine learning agents that provide the planner/scheduler with the information needed to make decisions about when and how to replan. Kubrick restructures workflows at runtime, making it unique among workflow scheduling systems. The existing Kubrick system provides a platform on which to integrate additional quality-of-service constraints and in which to explore the use of an ensemble of scheduling and planning algorithms. This will be the principal thrust of our Phase II work.

  6. Progressive IRP Models for Power Resources Including EPP

    Directory of Open Access Journals (Sweden)

    Yiping Zhu

    2017-01-01

    With a view to optimizing regional power supply and demand, this paper develops planning and scheduling of supply-side and demand-side resources, including the energy efficiency power plant (EPP), to meet benefit, cost, and environmental constraints. To highlight the characteristics of different supply and demand resources under economic, environmental, and carbon constraints, three planning models with progressively tighter constraints are constructed. Applying the three models to the same example shows that their optimal solutions differ. The planning model including EPP has clear advantages once pollutant and carbon emission constraints are considered, which confirms the low cost and low emissions of EPP. The construction of progressive IRP models for power resources considering EPP offers a reference for guiding the planning and layout of EPP among other power resources and for achieving cost and environmental objectives.
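    The progressive-constraint idea can be illustrated with a toy linear program: minimize supply cost subject to a demand constraint, then add an emission cap and observe the resource mix shift toward EPP. All coefficients below are invented for illustration.

```python
# Toy progressive-constraint plan over coal, gas and an efficiency power
# plant (EPP, i.e. saved energy). Numbers are illustrative only.
from scipy.optimize import linprog

cost = [40, 60, 30]          # $/MWh: coal, gas, EPP
emis = [0.9, 0.4, 0.0]       # tCO2/MWh
demand = 100.0               # MWh to cover
bounds = [(0, 80), (0, 80), (0, 30)]   # capacity limits; EPP potential small

def plan(emission_cap=None):
    A_ub = [[-1.0, -1.0, -1.0]]        # -(x1+x2+x3) <= -demand
    b_ub = [-demand]
    if emission_cap is not None:       # the "progressive" extra constraint
        A_ub.append(emis)
        b_ub.append(emission_cap)
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x, res.fun

print(plan())                 # cost-only model favors cheap, dirty capacity
print(plan(emission_cap=40))  # carbon cap shifts the mix toward gas and EPP
```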

  7. Beginning WF Windows Workflow in .NET 4.0

    CERN Document Server

    Collins, M

    2010-01-01

    Windows Workflow Foundation is a ground-breaking addition to the core of the .NET Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for workflow-based solutions has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been composed.

  8. The PBase Scientific Workflow Provenance Repository

    Directory of Open Access Journals (Sweden)

    Víctor Cuevas-Vicenttín

    2014-10-01

    Scientific workflows and their supporting systems are becoming increasingly popular for compute-intensive and data-intensive scientific experiments. The advantages scientific workflows offer include rapid and easy workflow design, software and data reuse, scalable execution, sharing and collaboration, and other advantages that altogether facilitate “reproducible science”. In this context, provenance – information about the origin, context, derivation, ownership, or history of some artifact – plays a key role, since scientists are interested in examining and auditing the results of scientific experiments. However, in order to perform such analyses on scientific results as part of extended research collaborations, an adequate environment and tools are required. Concretely, the need arises for a repository that will facilitate the sharing of scientific workflows and their associated execution traces in an interoperable manner, also enabling querying and visualization. Furthermore, such functionality should be supported while taking performance and scalability into account. With this purpose in mind, we introduce PBase: a scientific workflow provenance repository implementing the ProvONE proposed standard, which extends the emerging W3C PROV standard for provenance data with workflow-specific concepts. PBase is built on the Neo4j graph database, thus offering capabilities such as declarative and efficient querying. Our experiences demonstrate the power gained by supporting various types of queries for provenance data. In addition, PBase is equipped with a user-friendly interface tailored for the visualization of scientific workflow provenance data, making the specification of queries and the interpretation of their results easier and more effective.
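    Because PBase stores provenance as a graph, lineage questions become short declarative queries. The sketch below uses the standard Neo4j Python driver; the connection details, node labels and relationship types are hypothetical stand-ins rather than PBase's actual ProvONE schema.

```python
# Sketch of a declarative lineage query against a graph-backed provenance
# store. Labels/relationships (Data, Execution, USED, GENERATED) and the
# bolt URI are assumptions for illustration, not PBase's actual schema.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "secret"))

LINEAGE = """
MATCH path = (result:Data {id: $data_id})<-[:GENERATED]-(:Execution)
             -[:USED|GENERATED*0..]-(ancestor:Data)
RETURN DISTINCT ancestor.id AS id, length(path) AS depth
ORDER BY depth
"""

with driver.session() as session:
    # Walk backwards from a result to every data artifact it derives from.
    for record in session.run(LINEAGE, data_id="hazard-map-v3"):
        print(record["id"], record["depth"])
driver.close()
```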

  9. How Workflow Documentation Facilitates Curation Planning

    Science.gov (United States)

    Wickett, K.; Thomer, A. K.; Baker, K. S.; DiLauro, T.; Asangba, A. E.

    2013-12-01

    The description of the specific processes and artifacts that led to the creation of a data product provides a detailed picture of data provenance in the form of a workflow. The Site-Based Data Curation project, hosted by the Center for Informatics Research in Science and Scholarship at the University of Illinois, has been investigating how workflows can be used in developing curation processes and policies that move curation "upstream" in the research process. The team has documented an individual workflow for geobiology data collected during a single field trip to Yellowstone National Park. This specific workflow suggests a generalized three-part process for field data collection: a Planning Stage, a Fieldwork Stage, and a Processing and Analysis Stage. Beyond supplying an account of data provenance, the workflow has allowed the team to identify 1) points of intervention for curation processes and 2) data products that are likely candidates for sharing or deposit. Although these objects may be viewed by individual researchers as 'intermediate' data products, discussions with geobiology researchers have suggested that with appropriate packaging and description they may serve as valuable observational data for other researchers. Curation interventions may include the introduction of regularized data formats during the planning process, data description procedures, the identification and use of established controlled vocabularies, and data quality and validation procedures. We propose a poster that shows the individual workflow and our generalization into a three-stage process. We plan to discuss with attendees how well the three-stage view applies to other types of field-based research, likely points of intervention, and what kinds of interventions are appropriate and feasible in the example workflow.

  10. Structural and Parametric Models of the Załęcze and Żuchlów Gas Field Region, Fore-Sudetic Monocline, Poland – An Example of a General Static Modeling Workflow in Mature Petroleum Areas for CCS, EGR or EOR Purposes

    Directory of Open Access Journals (Sweden)

    Papiernik Bartosz

    2015-04-01

    Załęcze and Żuchlów are strongly depleted natural gas fields in aeolian sandstones of the Rotliegend, located in the central part of the Fore-Sudetic Monocline. A set of three static 3D models was generated to check the possibility of CO2 injection for Enhanced Gas Recovery (EGR) and to check the safety of storage by means of geomechanical modeling: one regional model (ZZA) and two local models – the first for the Załęcze (ZA) gas field and the second for the Żuchlów (ZU) gas field. The regional model is composed of 12 stratigraphic complexes (zones) from the base of the Rotliegend to the ground surface. The local models comprise only the three lowermost complexes: fluvial deposits of the Rotliegend, aeolian sandstones of the Rotliegend (Reservoir I), and the basal Zechstein limestone, Ca1. The key elements of the modeling procedure include quality control (QC) of the data, interpretation of missing parameters necessary for static modeling, and their integration within a geomodel. The processing workflow was elaborated to produce convergent regional and local models. The regional static model is a framework for a regional geomechanical model. The local models are the basis for dynamic simulations and local geomechanical modeling. The presented workflow could be used, with some changes, for geomodeling of many mature gas and oil fields.

  11. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations … of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow …

  12. Modeling heart rate variability including the effect of sleep stages

    Science.gov (United States)

    Soliński, Mateusz; Gierałtowski, Jan; Żebrowski, Jan

    2016-02-01

    We propose a model for heart rate variability (HRV) of a healthy individual during sleep with the assumption that the heart rate variability is predominantly a random process. Autonomic nervous system activity has different properties during different sleep stages, and this affects many physiological systems including the cardiovascular system. Different properties of HRV can be observed during each particular sleep stage. We believe that taking into account the sleep architecture is crucial for modeling human nighttime HRV. The stochastic model of HRV introduced by Kantelhardt et al. was used as the initial starting point. We studied the statistical properties of sleep in healthy adults, analyzing 30 polysomnographic recordings, which provided realistic information about sleep architecture. Next, we generated synthetic hypnograms and included them in the modeling of nighttime RR interval series. The results of standard HRV linear analysis and of nonlinear analysis (Shannon entropy, Poincaré plots, and multiscale multifractal analysis) show that—in comparison with real data—the HRV signals obtained from our model have very similar properties, in particular including the multifractal characteristics at different time scales. The model described in this paper is discussed in the context of normal sleep. However, its construction is such that it should allow modeling of heart rate variability in sleep disorders. This possibility is briefly discussed.
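    A drastically reduced sketch of the core idea (sleep stages modulating a stochastic RR-interval generator) is shown below. The stage parameters and the hypnogram are invented for illustration; they are not the paper's fitted values.

```python
# Toy stage-modulated RR-interval generator: each sleep stage contributes
# its own mean level, noise scale and AR(1) memory (illustrative numbers).
import numpy as np

rng = np.random.default_rng(0)
stage_params = {            # mean RR (s), noise s.d. (s), AR(1) memory
    "deep":  (1.10, 0.015, 0.90),
    "light": (1.00, 0.030, 0.85),
    "REM":   (0.90, 0.050, 0.80),
    "wake":  (0.80, 0.060, 0.70),
}
# A toy hypnogram: (stage, number of beats spent in that stage)
hypnogram = [("light", 600), ("deep", 900), ("light", 400),
             ("REM", 500), ("wake", 100)]

rr, x = [], 0.0
for stage, n_beats in hypnogram:
    mean, sd, phi = stage_params[stage]
    for _ in range(n_beats):
        x = phi * x + rng.normal(0.0, sd)   # correlated fluctuation
        rr.append(mean + x)

rr = np.array(rr)
print(f"{len(rr)} beats, mean RR {rr.mean():.3f} s, SDNN {rr.std()*1000:.1f} ms")
```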

  13. Complexity Metrics for Workflow Nets

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    … analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate

  14. Efficient workflows for 3D building full-color model reconstruction using LIDAR long-range laser and image-based modeling techniques

    Science.gov (United States)

    Shih, Chihhsiong

    2005-01-01

    Two efficient workflows are developed for the reconstruction of 3D full-color building models. One uses a point-wise sensing device to sample an unknown object densely and attaches color textures from a separate digital camera. The other uses an image-based approach that reconstructs the model with color texture attached automatically. The point-wise sensing device reconstructs the CAD model using a modified best-view algorithm that collects the maximum number of construction faces in one view. The partial views of the point-cloud data are then glued together using a common face between two consecutive views. Typical overlapping-mesh removal and coarsening procedures are applied to generate a unified 3D mesh shell structure. A post-processing step then combines the digital image content from a separate camera with the 3D mesh shell surfaces. An indirect uv-mapping procedure first divides the model faces into groups within which every face shares the same normal direction. The corresponding images of the faces in a group are then adjusted using the uv map as guidance. The final assembled image is glued back onto the 3D mesh to present a fully colored building model. The result is a virtual building that reflects the true dimensions and surface material conditions of a real-world campus building. The image-based modeling procedure uses a commercial photogrammetry package to reconstruct the 3D model. A novel view-planning algorithm is developed to guide the photo-taking procedure. This algorithm generates a minimal set of view angles guaranteeing that each model face appears in at least two, and no more than three, of the pictures. The 3D model can then be reconstructed with minimal labor spent correlating picture pairs. The finished model is compared with the original object in both topological and dimensional aspects. All the test cases show the exact same topology and

  15. Facilitating hydrological data analysis workflows in R: the RHydro package

    Science.gov (United States)

    Buytaert, Wouter; Moulds, Simon; Skoien, Jon; Pebesma, Edzer; Reusser, Dominik

    2015-04-01

    The advent of new technologies such as web-services and big data analytics holds great promise for hydrological data analysis and simulation. Driven by the need for better water management tools, it allows for the construction of much more complex workflows, which integrate more and potentially more heterogeneous data sources with longer tool chains of algorithms and models. With the scientific challenge of designing the most adequate processing workflow comes the technical challenge of implementing the workflow with a minimal risk for errors. A wide variety of new workbench technologies and other data handling systems are being developed. At the same time, the functionality of available data processing languages such as R and Python is increasing at an accelerating pace. Because of the large diversity of scientific questions and simulation needs in hydrology, it is unlikely that one single optimal method for constructing hydrological data analysis workflows will emerge. Nevertheless, languages such as R and Python are quickly gaining popularity because they combine a wide array of functionality with high flexibility and versatility. The object-oriented nature of high-level data processing languages makes them particularly suited for the handling of complex and potentially large datasets. In this paper, we explore how handling and processing of hydrological data in R can be facilitated further by designing and implementing a set of relevant classes and methods in the experimental R package RHydro. We build upon existing efforts such as the sp and raster packages for spatial data and the spacetime package for spatiotemporal data to define classes for hydrological data (HydroST). In order to handle simulation data from hydrological models conveniently, an HM class is defined. Relevant methods are implemented to allow for an optimal integration of the HM class with existing model fitting and simulation functionality in R. Lastly, we discuss some of the design challenges

  16. A Framework for Distributed Preservation Workflows

    Directory of Open Access Journals (Sweden)

    Rainer Schmidt

    2010-07-01

    The Planets Project is developing a service-oriented environment for the definition and evaluation of preservation strategies for human-centric data. It focuses on the question of logically preserving digital materials, as opposed to the physical preservation of content bit-streams. This includes the development of preservation tools for the automated characterisation, migration, and comparison of different types of Digital Objects, as well as the emulation of their original runtime environment in order to ensure long-term access and interpretability. The Planets integrated environment provides a number of end-user applications that allow data curators to execute and scientifically evaluate preservation experiments based on composable preservation services. In this paper, we focus on the middleware and programming model and show how it can be utilised to create complex preservation workflows.

  17. A data analysis workflow to enhance clay and organic carbon models using proximal Vis-NIR data

    DEFF Research Database (Denmark)

    Tabatabai, Salman; Knadel, Maria; Greve, Mogens Humlekrog

    Modelling proximal sensor data is becoming a norm in soil characterization and mapping. In many cases, these models still have low predictive capability and lack robustness due to the large amount of noise from several environmental factors. In this study we propose a combination of extensive data preprocessing (a preprocessing survey) and two variable selection methods to significantly increase visible near-infrared spectroscopy (Vis-NIRS) model performance and stability. Spectra of eight agricultural fields were measured in the range of 350-2200 nm using a mobile sensor platform (Veris…). Spectral data were preprocessed using several thousand combinations of methods/settings, including Savitzky-Golay smoothing/derivatives, multiplicative scatter correction, standard normal variate and generalized least squares weighting, and the optimum Partial Least Squares (PLS) models for clay

  18. Cost-Minimizing Scheduling of Workflows on a Cloud of Memory Managed Multicore Machines

    Science.gov (United States)

    Grounds, Nicolas G.; Antonio, John K.; Muehring, Jeff

    Workflows are modeled as hierarchically structured directed acyclic graphs in which vertices represent computational tasks, referred to as requests, and edges represent precedent constraints among requests. Associated with each workflow is a deadline that defines the time by which all computations of a workflow should be complete. Workflows are submitted by numerous clients to a scheduler that assigns workflow requests to a cloud of memory managed multicore machines for execution. A cost function is assumed to be associated with each workflow, which maps values of relative workflow tardiness to corresponding cost function values. A novel cost-minimizing scheduling framework is introduced to schedule requests of workflows so as to minimize the sum of cost function values for all workflows. The utility of the proposed scheduler is compared to another previously known scheduling policy.
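    A toy version of this cost-minimizing idea is sketched below: chains of requests with deadlines are dispatched onto a fixed pool of cores and the total tardiness cost is accumulated. The cost function, the earliest-deadline-first dispatch heuristic and the workloads are illustrative stand-ins for the paper's framework, not its actual scheduler.

```python
# Event-driven toy scheduler: workflows are chains of requests with a
# deadline; cost grows with relative tardiness (illustrative cost model).
import heapq

def tardiness_cost(finish, deadline):
    """Zero if on time, otherwise linear in relative tardiness."""
    return max(0.0, (finish - deadline) / deadline) * 100.0

# Each workflow: (deadline, list of request durations forming a chain).
workflows = {"wf1": (10.0, [4, 4, 4]), "wf2": (6.0, [3, 3]), "wf3": (20.0, [8])}
cores, now = 2, 0.0
running = []                                  # (finish_time, workflow, index)
ready = [(dl, wf, 0) for wf, (dl, _) in workflows.items()]
finish_times = {}

while ready or running:
    ready.sort()                              # most urgent deadline first
    while ready and len(running) < cores:     # dispatch onto free cores
        dl, wf, i = ready.pop(0)
        heapq.heappush(running, (now + workflows[wf][1][i], wf, i))
    now, wf, i = heapq.heappop(running)       # next completion event
    if i + 1 < len(workflows[wf][1]):
        ready.append((workflows[wf][0], wf, i + 1))
    else:
        finish_times[wf] = now

total = sum(tardiness_cost(t, workflows[wf][0]) for wf, t in finish_times.items())
print(finish_times, f"total cost {total:.1f}")
```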

  19. A Bottom-up Workflow Modeling Approach for Business Changes

    Institute of Scientific and Technical Information of China (English)

    严志民; 徐玮

    2011-01-01

    To meet the adaptability requirements of workflows in a complicated and rapidly changing business environment, a new data-centric, declarative modeling method named Declarative ARTifact-centric workflow (DART) is proposed. The business process is analyzed bottom-up so that its building blocks, such as atomic artifacts, activities and business policies, are extracted, and the descriptions of business elements and of business changes are separated into different layers. For its execution semantics, DART uses Finite State Automata (FSA) to describe the lifecycle of a single artifact and Labeled Transition Systems (LTS) to describe the workflow and the interactions among multiple artifacts. The path from a DART model to a deployable workflow is also discussed. The method was applied to the actual work processes of the Hangzhou real estate administration bureau, and this practical application is described.

  20. Automation of Flexible Migration Workflows

    Directory of Open Access Journals (Sweden)

    Dirk von Suchodoletz

    2011-03-01

    Many digital preservation scenarios are based on the migration strategy, which itself is heavily tool-dependent. For popular, well-defined and often open file formats – e.g., digital images such as PNG, GIF, JPEG – a wide range of tools exists. Migration workflows become more difficult with proprietary formats, as used by the many text processing applications that have appeared over the last two decades. If a certain file format cannot be rendered with current software, emulation of the original environment remains a valid option. For instance, within the original Lotus AmiPro or WordPerfect, it is not a problem to save an object of this type as ASCII text or Rich Text Format. In specific environments, it is even possible to send the file to a virtual printer, thereby producing a PDF as the migration output. Such manual migration tasks typically involve human interaction, which may be feasible for a small number of objects, but not for larger batches of files. We propose a novel approach using a software-operated VNC abstraction layer in order to replace human interaction with machine interaction. Emulators or virtualization tools equipped with a VNC interface are very well suited to this approach. But screen, keyboard and mouse interaction is just part of the setup: digital objects also need to be transferred into the original environment in order to be extracted after processing. Nevertheless, the complexity of the new generation of migration services is rising quickly; a preservation workflow now comprises not only the migration tool itself, but a complete software and virtual hardware stack with recorded workflows linked to every supported migration scenario. Thus the requirements of OAIS management must include proper software archiving, emulator selection, and system image and recording handling. The concept of view-paths could help either to automatically determine the proper pre-configured virtual environment or to set up system
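    The VNC-driven automation described above can be sketched with the vncdotool Python library. The host, password, coordinates and menu layout of the emulated application are assumptions for illustration; a real recorded workflow would replace them.

```python
# Sketch of machine-driven interaction over VNC using vncdotool. The host
# address, password, coordinates and the legacy app's menu geometry are
# all assumptions for illustration.
from vncdotool import api

client = api.connect("emulator-host::5900", password="secret")

client.mouseMove(20, 40)          # open the File menu of the legacy app
client.mousePress(1)
client.mouseMove(20, 120)         # "Print..." entry -> virtual PDF printer
client.mousePress(1)
for ch in "migrated.pdf":         # type the output file name, key by key
    client.keyPress(ch)
client.keyPress("enter")
client.captureScreen("after_print.png")   # evidence for a later QA step
api.shutdown()
```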

  1. Integrating configuration workflows with project management system

    Science.gov (United States)

    Nilsen, Dimitri; Weber, Pavel

    2014-06-01

    The complexity of the heterogeneous computing resources and services, together with recurring infrastructure changes at the GridKa WLCG Tier-1 computing center, requires a structured approach to configuration management and to optimization of the interplay between functional components of the whole system. A set of tools deployed at GridKa, including Puppet, Redmine, Foreman, SVN and Icinga, provides the administrative environment, making it possible to define and develop configuration workflows, reduce administrative effort and improve sustainable operation of the whole computing center. In this presentation we discuss the configuration scenarios implemented at GridKa, which we use for host installation, service deployment, change management procedures, service retirement, etc. The integration of Puppet with a project management tool like Redmine provides the opportunity to track problem issues, organize tasks and automate these workflows. The interaction between Puppet and Redmine results in automatic updates of the issues related to the executed workflow performed by different system components. The extensive configuration workflows require collaboration and interaction between different departments at GridKa, such as network, security and production. Redmine plugins developed at GridKa and integrated into its administrative environment provide an effective way of collaboration within the GridKa team. We present a structural overview of the software components, their connections and communication protocols, and show a few working examples of the workflows and their automation.

  2. Workflow patterns the definitive guide

    CERN Document Server

    Russell, Nick; ter Hofstede, Arthur H M

    2016-01-01

    The study of business processes has emerged as a highly effective approach to coordinating an organization's complex service- and knowledge-based activities. The growing field of business process management (BPM) focuses on methods and tools for designing, enacting, and analyzing business processes. This volume offers a definitive guide to the use of patterns, which synthesize the wide range of approaches to modeling business processes. It provides a unique and comprehensive introduction to the well-known workflow patterns collection -- recurrent, generic constructs describing common business process modeling and execution scenarios, presented in the form of problem-solution dialectics. The underlying principles of the patterns approach ensure that they are independent of any specific enabling technology, representational formalism, or modeling approach, and thus broadly applicable across the business process modeling and business process technology domains. The authors, drawing on extensive research done by...

  3. A hydrodynamic model for granular material flows including segregation effects

    Science.gov (United States)

    Gilberg, Dominik; Klar, Axel; Steiner, Konrad

    2017-06-01

    The simulation of granular flows including segregation effects in large industrial processes using particle methods is accurate but very time-consuming. To overcome the long computation times, a macroscopic model is a natural choice. We therefore couple a mixture-theory-based segregation model to a hydrodynamic model of Navier-Stokes type describing the flow behavior of the granular material. The granular flow model is a hybrid model derived from kinetic theory and a soil-mechanical approach, covering the regime of fast dilute flow as well as slow dense flow, where the density of the granular material is close to the maximum packing density. Originally, the segregation model was formulated by Thornton and Gray for idealized avalanches. It is modified and adapted to the form preferred for the coupling. In the final coupled model, the segregation process depends on the local state of the granular system; conversely, the granular system changes as differently mixed regions of the granular material differ, e.g., in packing density. The modeling focuses on dry granular flows of two particle types differing only in size, but it can easily be extended to arbitrary granular mixtures of different particle sizes and densities. A finite volume approach is used to solve the coupled system. To test the model, the rotational mixing of small and large particles in a tumbler is simulated.

  4. Synaptic channel model including effects of spike width variation

    OpenAIRE

    2015-01-01

    Synaptic Channel Model Including Effects of Spike Width Variation. Hamideh Ramezani and Ozgur B. Akan, Next-generation and Wireless Communications Laboratory (NWCL), Department of Electrical and Electronics Engineering, Koc University, Istanbul, Turkey. Abstract: An accu…

  5. A sonic boom propagation model including mean flow atmospheric effects

    Science.gov (United States)

    Salamone, Joe; Sparrow, Victor W.

    2012-09-01

    This paper presents a time-domain formulation of nonlinear lossy propagation in one dimension that also includes the effects of non-collinear mean flow in the acoustic medium. The model equation utilized is an augmented Burgers equation that includes the effects of nonlinearity, geometric spreading, atmospheric stratification, and absorption and dispersion due to thermoviscous and molecular relaxation effects. All elements of the propagation are implemented in the time domain, and the effects of non-collinear mean flow are accounted for in each term of the model equation. Previous authors have presented methods limited to showing the effects of wind on ray tracing and/or to using an effective speed of sound in their model equation. The present work includes the effects of mean flow for all terms of the augmented Burgers equation, with all calculations performed in the time domain. The capability to include the effects of mean flow in the acoustic medium allows one to make predictions more representative of real-world atmospheric conditions. Examples are presented for nonlinear propagation of N-waves and shaped sonic booms. [Work supported by Gulfstream Aerospace Corporation.]
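    For reference, an augmented Burgers equation of the kind described is commonly written in a form like the one below (one frequently used variant, without the paper's added mean-flow contributions; notation and coefficients differ between authors). Here p is acoustic pressure, s the propagation distance, tau the retarded time, A the ray-tube area, and rho_0, c_0 the ambient density and sound speed.

```latex
\frac{\partial p}{\partial s}
  = \frac{\beta p}{\rho_0 c_0^{3}} \frac{\partial p}{\partial \tau}            % nonlinearity
  + \frac{\delta}{2 c_0^{3}} \frac{\partial^{2} p}{\partial \tau^{2}}          % thermoviscous absorption
  + \sum_{\nu} \frac{(\Delta c)_{\nu}}{c_0^{2}} \frac{\partial}{\partial \tau}
      \int_{-\infty}^{\tau} e^{-(\tau-\tau')/t_{\nu}}
      \frac{\partial p}{\partial \tau'} \, d\tau'                              % molecular relaxation
  - \frac{1}{2A} \frac{dA}{ds}\, p                                             % geometric spreading
  + \frac{1}{2\rho_0 c_0} \frac{d(\rho_0 c_0)}{ds}\, p                         % atmospheric stratification
```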

  6. The Symbiotic Relationship between Scientific Workflow and Provenance (Invited)

    Science.gov (United States)

    Stephan, E.

    2010-12-01

    The purpose of this presentation is to describe the symbiotic nature of scientific workflows and provenance. We will also discuss the current trends and real-world challenges facing these two distinct research areas. Although motivated differently, the needs of the international science communities are the glue that binds this relationship together. Understanding and articulating the science drivers to these communities is paramount as these technologies evolve and mature. Originally conceived for managing business processes, workflows are now becoming invaluable assets in both computational and experimental sciences. These reconfigurable, automated systems provide essential technology to perform complex analyses by coupling together geographically distributed, disparate data sources and applications. As a result, workflows are capable of higher throughput in a shorter amount of time than performing the steps manually. Today many different workflow products exist; these include Kepler and Taverna, or similar products like MeDICI, developed at PNNL, that are standardized on the Business Process Execution Language (BPEL). Provenance, originating from the French term Provenir, "to come from", is used to describe the curation process of artwork as art is passed from owner to owner. The concept of provenance was adopted by digital libraries as a means to track the lineage of documents while standards such as the Dublin Core began to emerge. In recent years the systems science community has increasingly expressed the need to expand the concept of provenance to formally articulate the history of scientific data. Communities such as the International Provenance and Annotation Workshop (IPAW) have formalized a provenance data model, the Open Provenance Model, and the W3C is hosting a provenance incubator group featuring the Proof Markup Language. Although both workflows and provenance have risen from different communities and operate independently, their mutual

  7. The Design and Implementation of an Agent-based Workflow Model

    Institute of Scientific and Technical Information of China (English)

    陈善国; 高济

    2000-01-01

    In this paper, we present SaFlow, a workflow management system based on agents. After describing the architecture, we discuss in detail the composition and implementation of MSA, the major part of SaFlow. Finally, a product quotation workflow system is demonstrated as an application of SaFlow.

  8. Research on Petri Net Based Modeling and Analysis Methods for Workflow Processes

    Institute of Scientific and Technical Information of China (English)

    姜浩; 董逸生; 罗军舟

    2000-01-01

    Workflow management is currently an important aspect of CSCW. This paper introduces the fundamentals of workflow processes, provides a Petri-net-based process modeling methodology with basic definitions, and discusses the analysis and verification of the structural and behavioral correctness of workflow processes. Finally, an algorithm for the verification of process definitions is proposed.

  9. Verification of Timed Healthcare Workflows Using Component Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Bertolini, Cristiano; Liu, Zhiming; Srba, Jiri

    2013-01-01

    Workflows in modern healthcare systems are becoming increasingly complex and their execution involves concurrency and sharing of resources. The definition, analysis and management of collaborative healthcare workflows requires abstract model notations with a precisely defined semantics and a supp...

  10. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2016-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal of finding the maximum ETC value.

  11. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2017-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the Cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal of finding the maximum ETC value.
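    The ETC figure of merit itself is straightforward to compute; the sketch below applies it to a few made-up configurations (the throughput, core counts and prices are invented for illustration).

```python
# ETC = Events / Time / Cost, i.e. events per unit time per unit cost.
def etc(events, wall_time_h, cores, price_per_core_hour):
    cost = wall_time_h * cores * price_per_core_hour   # total $ spent
    return events / (wall_time_h * cost)

configs = {
    # (events processed, wall-clock hours, cores, $/core-hour) -- made up
    "8 cores, no overcommit": (4000, 10.0,  8, 0.05),
    "8 cores, 2x overcommit": (5200, 10.0,  8, 0.05),
    "16 cores, staggered":    (9000, 10.0, 16, 0.05),
}
for name, params in configs.items():
    print(f"{name:24s} ETC = {etc(*params):8.1f}")
```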

  12. From Paper Based Clinical Practice Guidelines to Declarative Workflow Management

    DEFF Research Database (Denmark)

    Lyng, Karen Marie; Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2009-01-01

    … a sub-workflow can be described in a declarative workflow management system: the Resultmaker Online Consultant (ROC). The example demonstrates that declarative primitives allow one to naturally extend the paper-based flowchart to an executable model without introducing a complex cyclic control flow graph.

  13. Design decisions in workflow management and quality of work.

    NARCIS (Netherlands)

    Waal, B.M.E. de; Batenburg, R.

    2009-01-01

    In this paper, the design and implementation of a workflow management (WFM) system in a large Dutch social insurance organisation is described. The effect of workflow design decisions on the quality of work is explored theoretically and empirically, using the model of Zur Mühlen as a frame of reference.

  15. Cloud workflow model construction based on reflection and supporting multi-tenant dynamic configuration

    Institute of Scientific and Technical Information of China (English)

    黄华; 彭蓉; 冯在文

    2013-01-01

    To address the dynamic adaptability of multi-tenant requirement changes in cloud workflows, a cloud workflow model based on reflection and supporting multi-tenant dynamic configuration is proposed. Using reflective concepts, the cloud workflow is described and controlled from four aspects: the process model, role model, interface model and status model. Adaptive adjustment to tenants' changing requirements is realized through the controlled meta-objects for topology, role, task and state, together with their corresponding Meta Object Protocols (MOPs). A case study shows that the proposed model can accommodate the adaptive adjustment of cloud workflows caused by business process changes, personnel changes and abnormal service resources.

  16. An Extended Policy Language for Role Resolution in Project-Oriented Workflow

    Institute of Scientific and Technical Information of China (English)

    张晓光; 曹健; 张申生; 牟玉洁

    2004-01-01

    HP defines an SQL-like language to specify organizational policies (or constraints) in workflow systems. Three types of policies were studied, including qualification, requirement and substitution policies, which cannot handle complex role resolution, such as Separation of Roles and Binding of Roles, or several exception situations, such as Role Delegation and Role Unavailability. From the perspective of project-oriented workflow, a project and its sub-projects can be placed under the charge of teams (or virtual teams), and the teams should satisfy the role resolution of the projects they manage. To support these requirements, this paper extends HP's policy language, based on a team-enabled organization model, to support role resolution in project-oriented workflow, and provides its modeling and enforcement mechanism.

  17. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes representing the events that can happen and arrows representing four relations between events: condition, response, include, and exclude. Distributed DCR Graphs are then obtained by assigning roles to events and principals. We give a graphical notation inspired by related work by van der Aalst et al. We exemplify the use of distributed DCR Graphs on a simple workflow taken from a field study at a Danish hospital, pointing out their flexibility compared to imperative workflow models. Finally we provide a mapping from DCR Graphs to Büchi automata.
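    The four relations have a compact executable semantics, which the toy implementation below makes concrete: an event is enabled when it is included and all of its included conditions have been executed; executing it adds its responses to the pending set and applies its include/exclude effects. The hospital-flavored event names are invented for illustration.

```python
# Executable toy of DCR Graph semantics as summarized above.
class DCRGraph:
    def __init__(self, events, condition=(), response=(), exclude=(), include=()):
        self.condition = set(condition)   # (a, b): a must precede b
        self.response = set(response)     # (a, b): doing a obliges b
        self.exclude = set(exclude)       # (a, b): doing a excludes b
        self.include = set(include)       # (a, b): doing a (re)includes b
        self.executed, self.pending = set(), set()
        self.included = set(events)       # all events included initially

    def enabled(self, e):
        if e not in self.included:
            return False
        # conditions from currently excluded events are ignored
        return all(a in self.executed or a not in self.included
                   for (a, b) in self.condition if b == e)

    def execute(self, e):
        assert self.enabled(e), f"{e} is not enabled"
        self.executed.add(e)
        self.pending.discard(e)
        self.pending |= {b for (a, b) in self.response if a == e}
        self.included -= {b for (a, b) in self.exclude if a == e}
        self.included |= {b for (a, b) in self.include if a == e}

# 'prescribe' obliges 'sign'; 'remove' (the patient) excludes 'give'.
g = DCRGraph(
    events={"prescribe", "sign", "give", "remove"},
    condition={("prescribe", "give")},
    response={("prescribe", "sign")},
    exclude={("remove", "give")},
)
g.execute("prescribe")
print(g.enabled("give"), g.pending)   # True {'sign'}
g.execute("remove")
print(g.enabled("give"))              # False: 'give' was excluded
```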

  18. Designing Flexible E-Business Workflow Systems

    Directory of Open Access Journals (Sweden)

    Cătălin Silvestru

    2010-01-01

    In today’s business environment, organizations must cope with complex interactions between actors, adapt quickly to frequent market changes, and be innovative. In this context, integrating knowledge with processes and Business Intelligence is a major step towards improving organizational agility. Therefore, traditional environments for workflow design have been adapted to answer the new business models and current requirements in the field of collaborative processes. This paper approaches the design of flexible and dynamic workflow management systems for electronic businesses that can lead to agility.

  19. A model of Barchan dunes including lateral shear stress.

    Science.gov (United States)

    Schwämmle, V; Herrmann, H J

    2005-01-01

    Barchan dunes are found where sand availability is low and wind direction quite constant. The two-dimensional shear stress of the wind field and the sand movement by saltation and avalanches over a barchan dune are simulated. The model with one-dimensional shear stress is extended to include surface diffusion and lateral shear stress. The resulting final shape is compared to the results of the one-dimensional shear stress model and checked against measurements; we find agreement with, and improvements over, the one-dimensional model. Additionally, a characteristic edge at the center of the windward side is discovered, which is also observed for big barchans. Diffusion reduces this feature for small dunes.

  20. Soundness of Timed-Arc Workflow Nets

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2014-01-01

    … we demonstrate the usability of our theory on the case studies of a Brake System Control Unit used in aircraft certification, the MPEG2 encoding algorithm, and a blood transfusion workflow. The implementation of the algorithms is freely available as a part of the model checker TAPAAL.

  1. A Reference Architecture for Workflow Management Systems

    NARCIS (Netherlands)

    Grefen, Paul; Remmerts de Vries, Remmert

    1998-01-01

    In the workflow management field, fast developments are taking place. A growing number of systems is currently under development, both in academic and commercial environments. Consequently, a wide variety of ad hoc architectures has come into existence. Reference models are necessary, however, to al

  2. Goldilocks Models of Higher-Dimensional Inflation (including modulus stabilization)

    CERN Document Server

    Burgess, C P; Hayman, Peter; Patil, Subodh P

    2016-01-01

    We explore the mechanics of inflation in simplified extra-dimensional models involving an inflaton interacting with the Einstein-Maxwell system in two extra dimensions. The models are Goldilocks-like in that they are just complicated enough to include a mechanism to stabilize the extra-dimensional size, yet simple enough to solve the full 6D field equations using basic tools. The solutions are not limited to the effective 4D regime with H < m_KK, but when they are not in this regime, standard 4D fluctuation calculations need not apply. When in a 4D regime the solutions predict eta ~ 0, hence n_s ~ 0.96 and r ~ 0.096, and so are ruled out if tensor modes remain unseen. Analysis of general parameters is difficult without a full 6D fluctuation calculation.

  3. Kinetic models of gene expression including non-coding RNAs

    Science.gov (United States)

    Zhdanov, Vladimir P.

    2011-03-01

    In cells, genes are transcribed into mRNAs, and the latter are translated into proteins. Due to the feedbacks between these processes, the kinetics of gene expression may be complex even in the simplest genetic networks. The corresponding models have already been reviewed in the literature. A new avenue in this field is related to the recognition that the conventional scenario of gene expression is fully applicable only to prokaryotes whose genomes consist of tightly packed protein-coding sequences. In eukaryotic cells, in contrast, such sequences are relatively rare, and the rest of the genome includes numerous transcript units representing non-coding RNAs (ncRNAs). During the past decade, it has become clear that such RNAs play a crucial role in gene expression and accordingly influence a multitude of cellular processes both in the normal state and during diseases. The numerous biological functions of ncRNAs are based primarily on their abilities to silence genes via pairing with a target mRNA and subsequently preventing its translation or facilitating degradation of the mRNA-ncRNA complex. Many other abilities of ncRNAs have been discovered as well. Our review is focused on the available kinetic models describing the mRNA, ncRNA and protein interplay. In particular, we systematically present the simplest models without kinetic feedbacks, models containing feedbacks and predicting bistability and oscillations in simple genetic networks, and models describing the effect of ncRNAs on complex genetic networks. Mathematically, the presentation is based primarily on temporal mean-field kinetic equations. The stochastic and spatio-temporal effects are also briefly discussed.
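    The simplest silencing motif described above, an ncRNA pairing with its target mRNA and the complex being degraded, has a compact mean-field form; the sketch below integrates it numerically. All rate constants are illustrative, not taken from the review.

```python
# Minimal mean-field sketch of ncRNA-mediated silencing: the ncRNA pairs
# with its target mRNA and the complex is degraded (illustrative rates).
from scipy.integrate import solve_ivp

k_m, k_s, k_p = 1.0, 0.8, 5.0      # mRNA / ncRNA synthesis, translation
g_m, g_s, g_p = 0.1, 0.1, 0.05     # first-order degradation rates
k_pair = 2.0                        # mRNA-ncRNA pairing (silencing) rate

def rhs(t, y):
    m, s, p = y                     # mRNA, ncRNA, protein concentrations
    pairing = k_pair * m * s        # loss of both species to the complex
    return [k_m - g_m * m - pairing,
            k_s - g_s * s - pairing,
            k_p * m - g_p * p]

sol = solve_ivp(rhs, (0, 200), [0.0, 0.0, 0.0])
m, s, p = sol.y[:, -1]
print(f"steady state: mRNA={m:.3f}, ncRNA={s:.3f}, protein={p:.3f}")
```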

  4. Workflow Automation with Lotus Notes for the Governmental Administrative Information System

    OpenAIRE

    Maskeliunas, Saulius

    1999-01-01

    The paper presents an introductory overview of the workflow automation area, outlining the main types, basic technologies, the essential features of workflow applications. Two sorts of process models for the definition of workflows (according to the conversation-based and activity-based methodologies) are sketched. Later on, the nature of Lotus Notes and its capabilities (as an environment for workflow management systems development) are indicated. Concluding, the experience of automating adm...

  5. Text mining for the biocuration workflow.

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A P C; Krallinger, Martin; Arighi, Cecilia; Cohen, K Bretonnel; Valencia, Alfonso; Wu, Cathy H; Chatr-Aryamontri, Andrew; Dowell, Karen G; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on 'Text Mining for the BioCuration Workflow' at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community.

  6. Kronos: a workflow assembler for genome analytics and informatics.

    Science.gov (United States)

    Taghiyar, M Jafar; Rosner, Jamie; Grewal, Diljot; Grande, Bruno M; Aniba, Radhouane; Grewal, Jasleen; Boutros, Paul C; Morin, Ryan D; Bashashati, Ali; Shah, Sohrab P

    2017-07-01

    The field of next-generation sequencing informatics has matured to a point where algorithmic advances in sequence alignment and individual feature detection methods have stabilized. Practical and robust implementation of complex analytical workflows (where such tools are structured into "best practices" for automated analysis of next-generation sequencing datasets) still requires significant programming investment and expertise. We present Kronos, a software platform for facilitating the development and execution of modular, auditable, and distributable bioinformatics workflows. Kronos obviates the need for explicit coding of workflows by compiling a text configuration file into executable Python applications. Making analysis modules would still require programming. The framework of each workflow includes a run manager to execute the encoded workflows locally (or on a cluster or cloud), parallelize tasks, and log all runtime events. The resulting workflows are highly modular and configurable by construction, facilitating flexible and extensible meta-applications that can be modified easily through configuration file editing. The workflows are fully encoded for ease of distribution and can be instantiated on external systems, a step toward reproducible research and comparative analyses. We introduce a framework for building Kronos components that function as shareable, modular nodes in Kronos workflows. The Kronos platform provides a standard framework for developers to implement custom tools, reuse existing tools, and contribute to the community at large. Kronos is shipped with both Docker and Amazon Web Services Machine Images. It is free, open source, and available through the Python Package Index and at https://github.com/jtaghiyar/kronos.
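
    To make the configuration-to-workflow idea concrete, here is a schematic sketch of compiling a small text configuration into executable tasks. This illustrates the general mechanism only; Kronos's actual configuration format and generated code differ, and the task names and fields below are invented.

        # Hypothetical two-step configuration: alignment followed by variant calling.
        config = """
        align: cmd=bwa_mem in=reads.fq out=aln.bam
        call:  cmd=call_snvs in=aln.bam out=variants.vcf
        """

        tasks = {}
        for line in filter(None, (l.strip() for l in config.splitlines())):
            name, spec = line.split(":", 1)
            tasks[name.strip()] = dict(field.split("=") for field in spec.split())

        # "Compile" the declarations into an executable chain, in declaration order.
        for name, task in tasks.items():
            print(f"[{name}] {task['cmd']}: {task['in']} -> {task['out']}")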

  7. Progress Towards an LES Wall Model Including Unresolved Roughness

    Science.gov (United States)

    Craft, Kyle; Redman, Andrew; Aikens, Kurt

    2015-11-01

    Wall models used in large eddy simulations (LES) are often based on theories for hydraulically smooth walls. While this is reasonable for many applications, there are also many where the impact of surface roughness is important. A previously developed wall model has been used primarily for jet engine aeroacoustics. However, jet simulations have not accurately captured thick initial shear layers found in some experimental data. This may partly be due to nozzle wall roughness used in the experiments to promote turbulent boundary layers. As a result, the wall model is extended to include the effects of unresolved wall roughness through appropriate alterations to the log-law. The methodology is tested for incompressible flat plate boundary layers with different surface roughness. Correct trends are noted for the impact of surface roughness on the velocity profile. However, velocity deficit profiles and the Reynolds stresses do not collapse as well as expected. Possible reasons for the discrepancies as well as future work will be presented. This work used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant number ACI-1053575. Computational resources on TACC Stampede were provided under XSEDE allocation ENG150001.
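
    The abstract does not spell out the alteration, but a common way to fold unresolved roughness into the log-law is a downward shift ΔU⁺ that grows with the equivalent sand-grain roughness height; a standard form (assumed here, not necessarily the paper's exact one) is

        u^+ = \frac{1}{\kappa}\ln y^+ + B - \Delta U^+(k_s^+),
        \qquad
        \Delta U^+ \;\to\; \frac{1}{\kappa}\ln k_s^+ + B - 8.5
        \quad\text{(fully rough limit)},

    so that the velocity profile is displaced downward relative to the smooth-wall law while retaining its logarithmic slope.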

  8. Monitoring of Grid scientific workflows

    NARCIS (Netherlands)

    Balis, B.; Bubak, M.; Łabno, B.

    2008-01-01

    Scientific workflows are a means of conducting in silico experiments in modern computing infrastructures for e-Science, often built on top of Grids. Monitoring of Grid scientific workflows is essential not only for performance analysis but also to collect provenance data and gather feedback useful

  9. Digital workflows in contemporary orthodontics

    Directory of Open Access Journals (Sweden)

    Lars R Christensen

    2017-01-01

    Full Text Available Digital workflows are now increasingly possible in orthodontic practice. Workflows designed to improve the customization of orthodontic appliances are now available through laboratories and orthodontic manufacturing facilities in many parts of the world. These now have the potential to improve certain aspects of patient care.

  10. Workflow Management in Electronic Commerce

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Spaccapietra, S.; March, S.T.; Kambayashi, Y.

    2002-01-01

    This tutorial addresses the application of workflow management (WFM) for process support in both these cases. The tutorial is organized into three parts. In the first part, we pay attention to (classical) workflow management in the context of a single organization. In the second part, we extend this

  11. Automating Workflow using Dialectical Argumentation

    NARCIS (Netherlands)

    Urovi, Visara; Bromuri, Stefano; McGinnis, Jarred; Stathis, Kostas; Omicini, Andrea

    2008-01-01

    This paper presents a multi-agent framework based on argumentative agent technology for the automation of the workflow selection and execution. In this framework, workflow selection is coordinated by agent interactions governed by the rules of a dialogue game whose purpose is to evaluate the workflo

  12. A workflow for digitalization projects

    OpenAIRE

    De Mulder, Tom

    2005-01-01

    More and more institutions want to convert their traditional content to digital formats. In such projects the digitalization and metadata stages often happen asynchronously. This paper identifies the importance of frequent cross-verification of both. We suggest a workflow to formalise this process, and a possible technical implementation to automate this workflow.

  13. Constructing Workflows from Script Applications

    Directory of Open Access Journals (Sweden)

    Mikołaj Baranowski

    2012-01-01

    Full Text Available For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as a convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment which is based on a scripting approach and extends the Ruby language with a high-level API for invoking operations on remote resources. In this paper we describe a tool which converts GridSpace application source code into a workflow representation which, in turn, may be used for scheduling, provenance, or visualization. We describe how we addressed the issues of analyzing Ruby source code, resolving variable and method dependencies, and building the workflow representation. The solutions to these problems have been developed and evaluated by testing them on complex grid application workflows such as CyberShake, Epigenomics and Montage. The evaluation is enriched by representing typical workflow control-flow patterns.
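
    A toy illustration of the dependency-extraction step described above, written in Python rather than Ruby: each assignment is scanned for the variables it reads, yielding workflow edges. This is a drastically simplified stand-in for the paper's actual Ruby source analysis; the script and single-letter variable convention are invented for the example.

        import re

        script = """
        a = load('reads.fq')
        b = align(a)
        c = call_variants(b)
        """

        deps = {}   # variable -> set of earlier variables it depends on
        for line in filter(None, (l.strip() for l in script.splitlines())):
            target, expr = (s.strip() for s in line.split("=", 1))
            deps[target] = set(re.findall(r"\b[a-z]\b", expr)) & set(deps)

        for target, sources in deps.items():
            for src in sources:
                print(f"workflow edge: {src} -> {target}")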

  14. Numerical Modeling of Electroacoustic Logging Including Joule Heating

    Science.gov (United States)

    Plyushchenkov, Boris D.; Nikitin, Anatoly A.; Turchaninov, Victor I.

    It is well known that an electromagnetic field excites an acoustic wave in a porous elastic medium saturated with a fluid electrolyte, due to the electrokinetic conversion effect. Pride's equations describing this process are written in the isothermal approximation. An update of these equations, which allows the influence of Joule heating on acoustic wave propagation to be taken into account, is proposed here. This update includes terms describing the initiation of additional acoustic waves excited by thermoelastic stresses, and the heat conduction equation with a right-hand side defined by Joule heating. Results of numerical modeling of several problems of propagation of acoustic waves excited by an electric field source, with and without consideration of the Joule heating effect, are presented. From these results it follows that the influence of Joule heating should be taken into account in the numerical simulation of electroacoustic logging and in the interpretation of its log data.
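
    In schematic form (with notation assumed here, not taken from the paper), the added heat conduction equation has the Joule dissipation of the electric field E as its source term, and the resulting temperature rise enters the stress as a thermoelastic contribution:

        \rho C_p \frac{\partial T}{\partial t}
          = \nabla\cdot\left(\lambda\,\nabla T\right) + \sigma_e\,|\mathbf{E}|^2,
        \qquad
        \sigma^{\mathrm{thermo}}_{ij} = -\,3K\alpha\,(T - T_0)\,\delta_{ij},

    where σ_e is the electric conductivity, λ the thermal conductivity, K the bulk modulus and α the linear thermal expansion coefficient.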

  15. Overview of workflow technology in scientific process

    Institute of Scientific and Technical Information of China (English)

    肖飞; 张为华; 王东辉

    2011-01-01

    This paper introduces the origin and development of scientific workflow technology and analyzes the life-cycle components of scientific workflows together with their key technologies, including process modeling and description, process mapping, process execution and scheduling, and data provenance management. It then reviews the state of the art of scientific workflow technology in terms of the frameworks of scientific workflow management systems, collaboration technology, and current applications. Finally, it discusses the shortcomings of existing scientific workflow technology and offers suggestions on future trends.

  16. CA-PLAN, a Service-Oriented Workflow

    Institute of Scientific and Technical Information of China (English)

    Shung-Bin Yan; Feng-Jian Wang

    2005-01-01

    Workflow management systems (WfMSs) are accepted worldwide due to their ability to model and control business processes. Previously, we defined an intra-organizational workflow specification model, Process LANguage (PLAN). PLAN, with associated tools, allows a user to describe a graph specification for processes, artifacts, and participants in an organization. PLAN has been successfully implemented in Agentflow to support workflow (Agentflow) applications. PLAN, and most current WfMSs, are designed around a centralized architecture so that they can be applied within a single organization. In such a structure, however, participants in Agentflow applications in different organizations cannot serve each other with workflows. In this paper, a service-oriented cooperative workflow model, Cooperative Agentflow Process LANguage (CA-PLAN), is presented. CA-PLAN proposes a workflow component model to model inter-organizational processes. In CA-PLAN, an inter-organizational process is partitioned into several intra-organizational processes. Each workflow system inside an organization is modeled as an Integrated Workflow Component (IWC). Each IWC contains a process service interface, specifying the process services provided by an organization, together with a remote process interface specifying which remote process services provided by other organizations are used, as well as intra-organizational processes. An IWC is both a workflow node and a participant. An inter-organizational process is made up of connections among these process services and remote processes with respect to different IWCs. In this paper, the related service techniques and supporting tools provided in Agentflow systems are presented.

  17. KNIME-CDK: Workflow-driven cheminformatics.

    Science.gov (United States)

    Beisken, Stephan; Meinl, Thorsten; Wiswedel, Bernd; de Figueiredo, Luis F; Berthold, Michael; Steinbeck, Christoph

    2013-08-22

    Cheminformaticians have to routinely process and analyse libraries of small molecules. Among other things, that includes the standardization of molecules, calculation of various descriptors, visualisation of molecular structures, and downstream analysis. For this purpose, scientific workflow platforms such as the Konstanz Information Miner can be used if provided with the right plug-in. A workflow-based cheminformatics tool provides the advantage of ease-of-use and interoperability between complementary cheminformatics packages within the same framework, hence facilitating the analysis process. KNIME-CDK comprises functions for molecule conversion to/from common formats, generation of signatures, fingerprints, and molecular properties. It is based on the Chemistry Development Kit and uses the Chemical Markup Language for persistence. A comparison with the cheminformatics plug-in RDKit shows that KNIME-CDK supports a similar range of chemical classes and adds new functionality to the framework. We describe the design and integration of the plug-in, and demonstrate the usage of the nodes on ChEBI, a library of small molecules of biological interest. KNIME-CDK is an open-source plug-in for the Konstanz Information Miner, a free workflow platform. KNIME-CDK is built on top of the open-source Chemistry Development Kit and allows for efficient cross-vendor structural cheminformatics. Its ease-of-use and modularity enable researchers to automate routine tasks and data analysis, bringing complementary cheminformatics functionality to the workflow environment.

  18. Goldilocks models of higher-dimensional inflation (including modulus stabilization)

    Science.gov (United States)

    Burgess, C. P.; Enns, Jared J. H.; Hayman, Peter; Patil, Subodh P.

    2016-08-01

    We explore the mechanics of inflation within simplified extra-dimensional models involving an inflaton interacting with the Einstein-Maxwell system in two extra dimensions. The models are Goldilocks-like inasmuch as they are just complicated enough to include a mechanism to stabilize the extra-dimensional size (or modulus), yet simple enough to solve explicitly the full extra-dimensional field equations using only simple tools. The solutions are not restricted to the effective 4D regime with H ≪ m_KK (the latter referring to the characteristic mass splitting of the Kaluza-Klein excitations) because the full extra-dimensional Einstein equations are solved. This allows an exploration of inflationary physics in a controlled calculational regime away from the usual four-dimensional lamp-post. The inclusion of modulus stabilization is important because experience with string models teaches that this is usually what makes models fail: stabilization energies easily dominate the shallow potentials required by slow roll and so open up directions to evolve that are steeper than those of the putative inflationary direction. We explore (numerically and analytically) three representative kinds of inflationary scenarios within this simple setup. In one the radion is trapped in an inflaton-dependent local minimum whose non-zero energy drives inflation. Inflation ends as this energy relaxes to zero when the inflaton finds its own minimum. The other two involve power-law scaling solutions during inflation. One of these is a dynamical attractor whose features are relatively insensitive to initial conditions but whose slow-roll parameters cannot be arbitrarily small; the other is not an attractor but can roll much more slowly, until eventually transitioning to the attractor. The scaling solutions can satisfy H > m_KK, but when they do standard 4D fluctuation calculations need not apply. When in a 4D regime the solutions predict η ≃ 0 and so r ≃ 0.11 when n_s ≃ 0.96, and so are ruled out if tensor modes remain unseen.
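
    As a consistency check on the quoted numbers, using the standard single-field slow-roll relations (not a derivation from the paper itself):

        n_s - 1 = 2\eta - 6\epsilon, \qquad r = 16\epsilon
        \;\;\Longrightarrow\;\;
        \epsilon \simeq \frac{1 - n_s}{6} = \frac{0.04}{6} \approx 0.0067,
        \qquad r \approx 16 \times 0.0067 \approx 0.11 .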

  19. Managing and Documenting Legacy Scientific Workflows.

    Science.gov (United States)

    Acuña, Ruben; Chomilier, Jacques; Lacroix, Zoé

    2015-10-06

    Scientific legacy workflows are often developed over many years, poorly documented and implemented with scripting languages. In the context of our cross-disciplinary projects we face the problem of maintaining such scientific workflows. This paper presents the Workflow Instrumentation for Structure Extraction (WISE) method used to process several ad-hoc legacy workflows written in Python and automatically produce their workflow structural skeleton. Unlike many existing methods, WISE does not assume input workflows to be preprocessed in a known workflow formalism. It is also able to identify and analyze calls to external tools. We present the method and report its results on several scientific workflows.

  1. Reputation-controlled business process workflows

    OpenAIRE

    Aziz, Benjamin; Hamilton, G

    2013-01-01

    This paper presents a model solution for controlling the execution of BPEL business processes based on reputation constraints at the level of the services, the service providers and the BPEL workflow. The reputation constraints are expressed as part of a service level agreement and are then enforced at runtime by a reputation monitoring system. We use our model to demonstrate how trust requirements based on such reputation constraints can be upheld in a real world example of a distributed map...

  2. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronnie S.; van der Aalst, Wil M.P.; Bakker, Piet J.M.

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and a Colored Workflow Net (CWN) are used to close the gap between the given requirements specification and the realization of these requirements with the help of a workflow system. A care process of the Academic Medical Center (AMC) hospital is used as reference process. The process consists of hundreds of activities. These have been modeled and analyzed using an EUC and a CWN. Moreover, based on the CWN, the process has been implemented using four different workflow systems.

  3. Concurrency & Asynchrony in Declarative Workflows

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    Declarative or constraint-based business process and workflow notations have received increasing interest in the last decade as possible means of addressing the challenge of supporting, at the same time, flexibility in execution, adaptability and compliance. However, the definition of concurrency for declarative workflows has received little attention. We show how a notion of concurrency in DCR Graphs admits asynchronous execution of declarative workflows, both conceptually and by reporting on a prototype implementation of a distributed declarative workflow engine. Both the theoretical development and the implementation are supported by an extended example.

  4. Commentary on the integration of model sharing and reproducibility analysis to scholarly publishing workflow in computational biomechanics.

    Science.gov (United States)

    Erdemir, Ahmet; Guess, Trent M; Halloran, Jason P; Modenese, Luca; Reinbolt, Jeffrey A; Thelen, Darryl G; Umberger, Brian R

    2016-10-01

    The overall goal of this paper is to demonstrate that dissemination of models and analyses for assessing the reproducibility of simulation results can be incorporated in the scientific review process in biomechanics. As part of a special issue on model sharing and reproducibility in the IEEE Transactions on Biomedical Engineering, two manuscripts on computational biomechanics were submitted: Rajagopal et al., IEEE Trans. Biomed. Eng., 2016 and Schmitz and Piovesan, IEEE Trans. Biomed. Eng., 2016. Models used in these studies were shared with the scientific reviewers and the public. In addition to the standard review of the manuscripts, the reviewers downloaded the models and performed simulations that reproduced results reported in the studies. There was general agreement between the simulation results of the authors and those of the reviewers. Discrepancies were resolved during the necessary revisions. The manuscripts and instructions for download and simulation were updated in response to the reviewers' feedback; changes that may otherwise have been missed if explicit model sharing and simulation reproducibility analysis had not been conducted in the review process. The increased burden on the authors and the reviewers, of facilitating model sharing and repeating simulations, was noted. When the authors of computational biomechanics studies provide access to models and data, the scientific reviewers can download and thoroughly explore the model, perform simulations, and evaluate simulation reproducibility beyond the traditional manuscript-only review process. Model sharing and reproducibility analysis in scholarly publishing will result in a more rigorous review process, which will enhance the quality of modeling and simulation studies and inform future users of computational models.

  5. Including spatial data in nutrient balance modelling on dairy farms

    Science.gov (United States)

    van Leeuwen, Maricke; van Middelaar, Corina; Stoof, Cathelijne; Oenema, Jouke; Stoorvogel, Jetse; de Boer, Imke

    2017-04-01

    The Annual Nutrient Cycle Assessment (ANCA) calculates the nitrogen (N) and phosphorus (P) balance at a dairy farm, while taking into account the subsequent nutrient cycles of the herd, manure, soil and crop components. Since January 2016, Dutch dairy farmers are required to use ANCA in order to increase understanding of nutrient flows and to minimize nutrient losses to the environment. A nutrient balance calculates the difference between nutrient inputs and outputs. Nutrients enter the farm via purchased feed, fertilizers, deposition and fixation by legumes (nitrogen), and leave the farm via milk, livestock, manure, and roughages. A positive balance indicates the extent to which N and/or P are lost to the environment via gaseous emissions (N), leaching, run-off and accumulation in soil. A negative balance indicates that N and/or P are depleted from soil. ANCA was designed to calculate average nutrient flows at farm level (for the herd, manure, soil and crop components). ANCA was not designed to perform calculations of nutrient flows at the field level, as it uses nutrient inputs and outputs averaged across all fields, and it does not include field-specific soil characteristics. Land management decisions, however, such as the level of N and P application, are typically taken at the field level given the specific crop and soil characteristics. Therefore the information that ANCA provides is likely not sufficient to support farmers' decisions on land management to minimize nutrient losses to the environment. This is particularly a problem when land management and soils vary between fields. For an accurate estimate of nutrient flows in a given farming system that can be used to optimize land management, the spatial scale of nutrient inputs and outputs (and thus the effect of land management and soil variation) could be essential. Our aim was to determine the effect of the spatial scale of nutrient inputs and outputs on modelled nutrient flows and nutrient use efficiencies
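
    A minimal sketch of the farm-gate balance described above, using the input and output terms listed in the abstract (all figures are illustrative, in kg N per year; this is not the ANCA implementation):

        inputs = {"purchased_feed": 5200, "fertilizer": 4100,
                  "deposition": 600, "fixation_legumes": 300}
        outputs = {"milk": 3400, "livestock": 450,
                   "manure_exported": 900, "roughage_sold": 250}

        balance = sum(inputs.values()) - sum(outputs.values())
        # A positive surplus flags potential losses to the environment;
        # a negative value flags depletion of the soil pool.
        print(f"N balance: {balance:+d} kg N/yr")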

  6. Workflow logs analysis system for enterprise performance measurement

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Workflow logs that record the execution of business processes offer a very valuable data resource for real-time enterprise performance measurement. In this paper, a novel scheme that uses data warehouse and OLAP technology to explore workflow logs and create complex analysis reports for enterprise performance measurement is proposed. Three key points of this scheme are studied: 1) the measure set; 2) the open and flexible architecture of the workflow logs analysis system; 3) the data models in the WFMS and the data warehouse. A case study that shows the validity of the scheme is also provided.
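
    A minimal sketch of the kind of measure such a scheme derives from workflow logs; the column names and the activity dimension below are assumptions for illustration, not the paper's schema:

        import pandas as pd

        log = pd.DataFrame({
            "case_id":  [1, 1, 2, 2],
            "activity": ["approve", "ship", "approve", "ship"],
            "start": pd.to_datetime(["2005-01-03 09:00", "2005-01-03 11:00",
                                     "2005-01-04 09:30", "2005-01-04 15:00"]),
            "end":   pd.to_datetime(["2005-01-03 10:15", "2005-01-03 12:00",
                                     "2005-01-04 11:00", "2005-01-04 16:30"]),
        })
        log["duration_h"] = (log["end"] - log["start"]).dt.total_seconds() / 3600

        # Roll a simple performance measure up along the 'activity' dimension,
        # as an OLAP cube would for enterprise performance reports.
        print(log.groupby("activity")["duration_h"].agg(["mean", "max"]))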

  7. A Discrete Velocity Traffic Kinetic Model Including Desired Speed

    Directory of Open Access Journals (Sweden)

    Shoufeng Lu

    2013-05-01

    Full Text Available We introduce the desired speed variable into the table of games and formulate a new table of games and the corresponding discrete-velocity traffic kinetic model. We use the hybrid programming technique of VB and MATLAB to develop the program. Finally, we compare the results of the proposed model with detector data. The comparison shows that the proposed model can describe the evolution of traffic flow.

  8. Semi-holographic model including the radiation component

    CERN Document Server

    del Campo, Sergio; Magaña, Juan; Villanueva, J R

    2014-01-01

    In this letter we study the semi-holographic model, which corresponds to the radiative version of the model proposed by Zhang et al. (Phys. Lett. B 694 (2010) 177) and revisited by Cárdenas et al. (Mon. Not. Roy. Astron. Soc. 438 (2014) 3603). This inclusion makes the model more realistic and allows us to test it with current observational data, and then to answer whether the inconsistency reported by Cárdenas et al. is relaxed.

  9. Summer Student Report - AV Workflow

    CERN Document Server

    Abramson, Jessie

    2014-01-01

    The AV Workflow is a web application which allows CERN users to publish, update and delete videos from CDS. During my summer internship I implemented the backend of the new version of the AV Workflow in Python using the Django framework.

  10. A Fault Evolution Model Including the Rupture Dynamic Simulation

    Science.gov (United States)

    Wu, Y.; Chen, X.

    2011-12-01

    We perform a preliminary numerical simulation of seismicity and stress evolution along a strike-slip fault in a 3D elastic half space. Following the work of Ben-Zion (1996), the fault geometry is devised as a vertical plane about 70 km long and 17 km wide, comparable to the size of the San Andreas Fault around Parkfield. The loading mechanism is described by the "backslip" method. Fault failure is governed by a static/kinetic friction law, and induced stress transfer is calculated with Okada's static solution. In order to track rupture propagation in detail, we allow induced stress to propagate through the medium at the shear wave velocity by introducing a distance-dependent time delay in the response to stress changes. The current simulation produces small to moderate earthquakes following the Gutenberg-Richter law and quasi-periodic characteristic large earthquakes, consistent with previous work by others. Next we will consider introducing a more realistic friction law, namely the laboratory-derived rate- and state-dependent law, which can simulate more realistic and complicated sliding behavior such as stable and unstable slip, aseismic sliding and the slip nucleation process. In addition, the long duration of aftershock sequences is expected to be reproduced with this time-dependent friction law, which is not available in the current seismicity simulation. The other difference from previous work is that we are trying to include dynamic ruptures in this study. Most previous work on seismicity simulation is based on static solutions when dealing with failure-induced stress changes. However, numerical simulations of rupture dynamics have revealed many important details that are missing in quasi-static/quasi-dynamic simulations. For example, dynamic simulations indicate that the slip on the ground surface becomes larger if the dynamic rupture process reaches the free surface. The concentration of stress on the propagating crack
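
    A highly simplified sketch of the quasi-static ingredients described above (uniform "backslip" loading, static/kinetic friction thresholds, and stress redistribution after failure); the distance-dependent time delay and the dynamic-rupture extension are omitted, and all parameter values are illustrative:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200
        tau = rng.uniform(0.0, 1.0, n)      # shear stress on fault cells
        tau_s, tau_k = 1.0, 0.6             # static (failure) and kinetic (arrest) levels

        sizes = []                          # number of failed cells per event
        for step in range(10_000):
            tau += 1e-3                     # tectonic "backslip" loading
            event = 0
            failed = tau >= tau_s
            while failed.any():             # cascade: drops reload the whole fault
                drop = (tau[failed] - tau_k).sum()
                tau[failed] = tau_k
                event += int(failed.sum())
                tau += 0.8 * drop / n       # crude, partly dissipative stress transfer
                failed = tau >= tau_s
            if event:
                sizes.append(event)

        print(f"{len(sizes)} events; largest ruptured {max(sizes)} of {n} cells")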

  11. Chang'E-3 data pre-processing system based on scientific workflow

    Science.gov (United States)

    Tan, Xu; Liu, Jianjun; Wang, Yuanyuan; Yan, Wei; Zhang, Xiaoxia; Li, Chunlai

    2016-04-01

    The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, a Chang'E-3 data pre-processing system (CEDPS) based on scientific workflow is proposed, with the purpose of making scientists more flexible and productive by automating data-driven processing. The system should allow the planning, conduct and control of the data processing procedure, with two capabilities: describing a data processing task, including (1) definition of input and output data, (2) definition of the data relationships, (3) definition of the sequence of tasks, (4) definition of the communication between tasks, (5) definition of mathematical formulas, and (6) definition of the relationships between tasks and data; and automatic processing of tasks. Accordingly, describing a task is the key point in whether the system is flexible. We design a workflow designer, a visual environment for capturing processes as workflows, and discuss its three-level model: (1) the data relationships are established through a product tree; (2) the process model is constructed based on a directed acyclic graph (DAG), in which a set of process workflow constructs, including Sequence, Loop, Merge and Fork, are compositional with one another; (3) to reduce the modeling complexity of the mathematical formulas in the DAG, semantic modeling based on MathML is adopted. On top of that, we present how the CE3 data are processed with CEDPS.
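
    A minimal sketch of executing such a DAG-based process model: each task declares its predecessors, and the engine runs a task once all of its inputs are available. The task names are illustrative only, not CEDPS's actual processing steps:

        from graphlib import TopologicalSorter  # standard library, Python 3.9+

        dag = {                                  # task -> set of prerequisite tasks
            "read_raw": set(),
            "radiometric_correction": {"read_raw"},
            "geometric_correction": {"radiometric_correction"},
            "mosaic": {"geometric_correction"},
        }

        # static_order() yields tasks so that every prerequisite runs first.
        for task in TopologicalSorter(dag).static_order():
            print(f"executing {task}")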

  12. From chart tracking to workflow management.

    Science.gov (United States)

    Srinivasan, P; Vignes, G; Venable, C; Hazelwood, A; Cade, T

    1994-01-01

    The current interest in system-wide integration appears to be based on the assumption that an organization, by digitizing information and accepting a common standard for the exchange of such information, will improve the accessibility of this information and automatically experience benefits resulting from its more productive use. We do not dispute this reasoning, but assert that an organization's capacity for effective change is proportional to the understanding of the current structure among its personnel. Our workflow manager is based on the use of a Parameterized Petri Net (PPN) model which can be configured to represent an arbitrarily detailed picture of an organization. The PPN model can be animated to observe the model organization in action, and the results of the animation analyzed. This simulation is a dynamic ongoing process which changes with the system and allows members of the organization to pose "what if" questions as a means of exploring opportunities for change. We present the "workflow management system" as the natural successor to the tracking program, incorporating modeling, scheduling, reactive planning, performance evaluation, and simulation. This workflow management system is more than adequate for meeting the needs of a paper chart tracking system and, as the patient record is computerized, will serve as a planning and evaluation tool in converting the paper-based health information system into a computer-based system.
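
    A toy Petri-net interpreter in the spirit of the PPN animation described above; the places and transitions model a single chart request and are illustrative, not the paper's configuration:

        marking = {"chart_requested": 1, "chart_in_transit": 0, "chart_delivered": 0}

        # transition name -> (places consumed, places produced)
        transitions = {
            "dispatch": ({"chart_requested"}, {"chart_in_transit"}),
            "receive":  ({"chart_in_transit"}, {"chart_delivered"}),
        }

        def fire(name):
            consume, produce = transitions[name]
            if all(marking[p] > 0 for p in consume):   # transition enabled?
                for p in consume: marking[p] -= 1
                for p in produce: marking[p] += 1
                return True
            return False

        # Animate the model: fire whatever is enabled until quiescence.
        while any(fire(t) for t in transitions):
            pass
        print(marking)  # {'chart_requested': 0, 'chart_in_transit': 0, 'chart_delivered': 1}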

  13. Evacuation modeling including traveler information and compliance behavior

    NARCIS (Netherlands)

    Pel, A.J.; Hoogendoorn, S.P.; Bliemer, M.C.J.

    2010-01-01

    Traffic simulation models are often used to support decisions when planning an evacuation. Scenario analyses based on these models then typically focus on traffic dynamics and the effect of traffic control measures in order to locate possible bottlenecks and predict evacuation times. A clear approac

  14. Agile parallel bioinformatics workflow management using Pwrake

    Directory of Open Access Journals (Sweden)

    Tanaka Masahiro

    2011-09-01

    Full Text Available Background: In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment, are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of the scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings: We implemented Pwrake workflows to process next-generation sequencing data using the Genome Analysis Toolkit (GATK) and Dindel. The GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that, in practice, actual scientific workflow development iterates over two phases: the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate the modularity of the GATK and Dindel workflows. Conclusions: Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain-specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows

  15. Optimization of tomographic reconstruction workflows on geographically distributed resources.

    Science.gov (United States)

    Bicer, Tekin; Gürsoy, Doğa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T

    2016-07-01

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can
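
    A sketch of the three-stage performance model described above: the estimated makespan for each site is transfer time plus queue time plus compute time, and the workflow is placed on the site minimizing the total. The site names and all numbers are illustrative, not measured values from the paper:

        def makespan(data_gb, bw_gbps, queue_s, work_tflop, perf_tflops):
            transfer = data_gb * 8 / bw_gbps      # seconds to move the dataset
            compute = work_tflop / perf_tflops    # seconds of reconstruction
            return transfer + queue_s + compute

        sites = {
            "cluster_A": dict(bw_gbps=10, queue_s=1200, perf_tflops=50),
            "cluster_B": dict(bw_gbps=1,  queue_s=60,   perf_tflops=200),
        }
        best = min(sites, key=lambda s: makespan(500, work_tflop=4e4, **sites[s]))
        print("run reconstruction on", best)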

  16. Tailored business solutions by workflow technologies

    Directory of Open Access Journals (Sweden)

    Alexandra Fortiş

    2006-01-01

    Full Text Available VISP (Virtual Internet Service Provider) is an IST-STREP project conducting research in the field of workflow technologies, targeted at telecom/ISP companies. One of the first tasks of the VISP project is to identify the most appropriate technologies for constructing the VISP platform. This paper presents the most significant results in the field of choreography and orchestration, two key domains that must accompany process modeling in the construction of a workflow environment.

  17. Global atmospheric model for mercury including oxidation by bromine atoms

    Directory of Open Access Journals (Sweden)

    C. D. Holmes

    2010-12-01

    Full Text Available Global models of atmospheric mercury generally assume that gas-phase OH and ozone are the main oxidants converting Hg(0) to Hg(II) and thus driving mercury deposition to ecosystems. However, thermodynamic considerations argue against the importance of these reactions. We demonstrate here the viability of atomic bromine (Br) as an alternative Hg(0) oxidant. We conduct a global 3-D simulation with the GEOS-Chem model assuming gas-phase Br to be the sole Hg(0) oxidant (Hg + Br model) and compare to the previous version of the model with OH and ozone as the sole oxidants (Hg + OH/O3 model). We specify global 3-D Br concentration fields based on our best understanding of tropospheric and stratospheric Br chemistry. In both the Hg + Br and Hg + OH/O3 models, we add an aqueous photochemical reduction of Hg(II) in cloud to impose a tropospheric lifetime for mercury of 6.5 months against deposition, as needed to reconcile observed total gaseous mercury (TGM) concentrations with current estimates of anthropogenic emissions. This added reduction would not be necessary in the Hg + Br model if we adjusted the Br oxidation kinetics downward within their range of uncertainty. We find that the Hg + Br and Hg + OH/O3 models are equally capable of reproducing the spatial distribution of TGM and its seasonal cycle at northern mid-latitudes. The Hg + Br model shows a steeper decline of TGM concentrations from the tropics to southern mid-latitudes. Only the Hg + Br model can reproduce the springtime depletion and summer rebound of TGM observed at polar sites; the snowpack component of GEOS-Chem suggests that 40% of Hg(II) deposited to snow in the Arctic is transferred to the ocean and land reservoirs, amounting to a net deposition flux to the Arctic of 60 Mg a⁻¹. Summertime events of depleted Hg(0) at Antarctic sites due to subsidence are much better simulated by

  18. Pilot Wave model that includes creation and annihilation of particles

    CERN Document Server

    Sverdlov, Roman

    2010-01-01

    The purpose of this paper is to come up with a Pilot Wave model of quantum field theory that incorporates particle creation and annihilation without sacrificing determinism. This has been previously attempted in an article by the same author titled "Incorporating particle creation and annihilation in Pilot Wave model", in a much less satisfactory way. In this paper I would like to "clean up" some of the things. In particular, I would like to get rid of a very unnatural concept of "visibility" of particles, which makes the model much simpler. On the other hand, I would like to add a mechanism for decoherence, which was absent in the previous version.

  19. Global atmospheric model for mercury including oxidation by bromine atoms

    Directory of Open Access Journals (Sweden)

    C. D. Holmes

    2010-08-01

    Full Text Available Global models of atmospheric mercury generally assume that OH and ozone are the main oxidants converting Hg(0) to Hg(II) and thus driving mercury deposition to ecosystems. However, thermodynamic considerations argue against the importance of these reactions. We demonstrate here the viability of atomic bromine (Br) as an alternative Hg(0) oxidant. We conduct a global 3-D simulation with the GEOS-Chem model assuming Br to be the sole Hg(0) oxidant (Hg + Br model) and compare to the previous version of the model with OH and ozone as the sole oxidants (Hg + OH/O3 model). We specify global 3-D Br concentration fields based on our best understanding of tropospheric and stratospheric Br chemistry. In both the Hg + Br and Hg + OH/O3 models, we add an aqueous photochemical reduction of Hg(II) in cloud to impose a tropospheric lifetime for mercury of 6.5 months against deposition, as needed to reconcile observed total gaseous mercury (TGM) concentrations with current estimates of anthropogenic emissions. This added reduction would not be necessary in the Hg + Br model if we adjusted the Br oxidation kinetics downward within their range of uncertainty. We find that the Hg + Br and Hg + OH/O3 models are equally capable of reproducing the spatial distribution of TGM and its seasonal cycle at northern mid-latitudes. The Hg + Br model shows a steeper decline of TGM concentrations from the tropics to southern mid-latitudes. Only the Hg + Br model can reproduce the springtime depletion and summer rebound of TGM observed at polar sites; the snowpack component of GEOS-Chem suggests that 40% of Hg(II) deposited to snow in the Arctic is transferred to the ocean and land reservoirs, amounting to a net deposition flux of 60 Mg a⁻¹. Summertime events of depleted Hg(0) at Antarctic sites due to subsidence are much better simulated by the Hg + Br model.

  20. An Intracellular Calcium Oscillations Model Including Mitochondrial Calcium Cycling

    Institute of Scientific and Technical Information of China (English)

    SHI Xiao-Min; LIU Zeng-Rong

    2005-01-01

    Calcium is a ubiquitous second messenger. Mitochondria contribute significantly to intracellular Ca²⁺ dynamics. The experiment of Kaftan et al. [J. Biol. Chem. 275 (2000) 25465] demonstrated that inhibiting mitochondrial Ca²⁺ uptake can reduce the frequency of cytosolic Ca²⁺ concentration oscillations in gonadotropes. By considering mitochondrial Ca²⁺ cycling, we develop a three-variable model of intracellular Ca²⁺ oscillations based on the models of Atri et al. [Biophys. J. 65 (1993) 1727] and Falcke et al. [Biophys. J. 77 (1999) 37]. The model reproduces the fact that mitochondrial Ca²⁺ cycling increases the frequency of cytosolic Ca²⁺ oscillations, which accords with Kaftan's results. Moreover, the model predicts that when the mitochondria are overloaded with Ca²⁺, the cytosolic Ca²⁺ oscillations vanish, which may trigger apoptosis.

  1. Provenance-Powered Automatic Workflow Generation and Composition

    Science.gov (United States)

    Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.

    2015-12-01

    In recent years, scientists have learned how to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this is oftentimes contradictory to a domain scientist's daily routine of conducting research and exploration. We hope to resolve this dispute. Imagine this: an Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototyping system to realize the aforementioned vision, in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from user behavior, supporting the adaptation and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure is established to capture richer provenance and further facilitate collaboration in the science community. We have also established a Petri-net-based verification instrument for provenance-based automatic workflow generation and recommendation.

  2. Use of contextual inquiry to understand anatomic pathology workflow: Implications for digital pathology adoption

    Directory of Open Access Journals (Sweden)

    Jonhan Ho

    2012-01-01

    Full Text Available Background: For decades, anatomic pathology (AP) workflow has been a highly manual process based on the use of an optical microscope and glass slides. Recent innovations in scanning and digitizing of entire glass slides are accelerating a move toward widespread adoption and implementation of a workflow based on digital slides and their supporting information management software. To support the design of digital pathology systems and ensure their adoption into pathology practice, the needs of the main users within the AP workflow, the pathologists, should be identified. Contextual inquiry is a qualitative, user-centered, social method designed to identify and understand users' needs, and is utilized for collecting, interpreting, and aggregating in-detail aspects of work. Objective: Contextual inquiry was utilized to document current AP workflow, identify processes that may benefit from the introduction of digital pathology systems, and establish design requirements for digital pathology systems that will meet pathologists' needs. Materials and Methods: Pathologists were observed and interviewed at a large academic medical center according to contextual inquiry guidelines established by Holtzblatt et al. (1998). Notes representing user-provided data were documented during observation sessions. An affinity diagram, a hierarchical organization of the notes based on common themes in the data, was created. Five graphical models were developed to help visualize the data, including sequence, flow, artifact, physical, and cultural models. Results: A total of six pathologists were observed by a team of two researchers. A total of 254 affinity notes were documented and organized using a system based on topical hierarchy, including 75 third-level, 24 second-level, and five main-level categories: technology, communication, synthesis/preparation, organization, and workflow. Current AP workflow was labor intensive and lacked scalability. A large number

  3. Using Workflow Modeling to Identify Areas to Improve Genetic Test Processes in the University of Maryland Translational Pharmacogenomics Project.

    Science.gov (United States)

    Cutting, Elizabeth M; Overby, Casey L; Banchero, Meghan; Pollin, Toni; Kelemen, Mark; Shuldiner, Alan R; Beitelshees, Amber L

    Delivering genetic test results to clinicians is a complex process. It involves many actors and multiple steps, requiring all of these to work together in order to create an optimal course of treatment for the patient. We used information gained from focus groups to illustrate the current process of delivering genetic test results to clinicians. We propose a Business Process Model and Notation (BPMN) representation of this process for a Translational Pharmacogenomics Project being implemented at the University of Maryland Medical Center, so that personalized medicine program implementers can identify areas to improve genetic testing processes. We found that the current process could be improved to reduce input errors, better inform and notify clinicians about the implications of certain genetic tests, and make results more easily understood. We demonstrate our use of BPMN to improve this important clinical process for CYP2C19 genetic testing in patients undergoing invasive treatment of coronary heart disease.

  4. A scheduling framework applied to digital publishing workflows

    Science.gov (United States)

    Lozano, Wilson; Rivera, Wilson

    2006-02-01

    This paper presents the advances in developing a dynamic scheduling technique suitable for automating digital publishing workflows. Traditionally, scheduling in digital publishing has been limited to timing criteria. The proposed scheduling strategy takes into account contingency and priority fluctuations. The new scheduling algorithm, referred to as QB-MUF, gives high priority to jobs with a low probability of failing according to artifact recognition and workflow modeling criteria. The experimental results show the suitability and efficiency of the scheduling strategy.
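
    A sketch of the scheduling idea described above: jobs with a low estimated probability of failing are dispatched first. The actual QB-MUF priority function is not specified in the abstract, so the ordering rule and job names below are assumptions for illustration:

        import heapq

        jobs = [("cover_proof", 0.30), ("page_layout", 0.05), ("imposition", 0.12)]

        queue = [(p_fail, name) for name, p_fail in jobs]
        heapq.heapify(queue)                 # min-heap: lowest failure risk first
        while queue:
            p_fail, name = heapq.heappop(queue)
            print(f"dispatch {name} (p_fail={p_fail:.2f})")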

  5. Cement-aggregate compatibility and structure property relationships including modelling

    Energy Technology Data Exchange (ETDEWEB)

    Jennings, H.M.; Xi, Y.

    1993-07-15

    The role of aggregate, and its interface with cement paste, is discussed with a view toward establishing models that relate structure to properties. Both short (nm) and long (mm) range structure must be considered. The short range structure of the interface depends not only on the physical distribution of the various phases, but also on moisture content and reactivity of aggregate. Changes that occur on drying, i.e. shrinkage, may alter the structure which, in turn, feeds back to alter further drying and shrinkage. The interaction is dynamic, even without further hydration of cement paste, and the dynamic characteristic must be considered in order to fully understand and model its contribution to properties. Microstructure and properties are two subjects which have been pursued somewhat separately. This review discusses both disciplines with a view toward finding common research goals in the future. Finally, comment is made on possible chemical reactions which may occur between aggregate and cement paste.

  6. Including lateral interactions into microkinetic models of catalytic reactions

    DEFF Research Database (Denmark)

    Hellman, Anders; Honkala, Johanna Karoliina

    2007-01-01

    In many catalytic reactions lateral interactions between adsorbates are believed to have a strong influence on the reaction rates. We apply a microkinetic model to explore the effect of lateral interactions and how to efficiently take them into account in a simple catalytic reaction. Three different approximations are investigated: site, mean-field, and quasichemical approximations. The obtained results are compared to accurate Monte Carlo numbers. In the end, we apply the approximations to a real catalytic reaction, namely, ammonia synthesis.
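
    A common generic way to fold such lateral interactions into a microkinetic model (the notation here is assumed, not the paper's) is to let the activation energy of an elementary step shift with the coverages θ_j of coadsorbed species,

        E_a(\boldsymbol{\theta}) = E_a^0 + \sum_j \varepsilon_j\,\theta_j,
        \qquad
        k = \nu\, e^{-E_a(\boldsymbol{\theta})/k_B T},

    which the site, mean-field and quasichemical approximations then evaluate at different levels of statistical detail.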

  7. Comparison of Joint Modeling Approaches Including Eulerian Sliding Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Lomov, I; Antoun, T; Vorobiev, O

    2009-12-16

    Accurate representation of discontinuities such as joints and faults is a key ingredient for high fidelity modeling of shock propagation in geologic media. The following study was done to improve treatment of discontinuities (joints) in the Eulerian hydrocode GEODYN (Lomov and Liu 2005). Lagrangian methods with conforming meshes and explicit inclusion of joints in the geologic model are well suited for such an analysis. Unfortunately, current meshing tools are unable to automatically generate adequate hexahedral meshes for large numbers of irregular polyhedra. Another concern is that joint stiffness in such explicit computations requires significantly reduced time steps, with negative implications for both the efficiency and quality of the numerical solution. An alternative approach is to use non-conforming meshes and embed joint information into regular computational elements. However, once slip displacement on the joints become comparable to the zone size, Lagrangian (even non-conforming) meshes could suffer from tangling and decreased time step problems. The use of non-conforming meshes in an Eulerian solver may alleviate these difficulties and provide a viable numerical approach for modeling the effects of faults on the dynamic response of geologic materials. We studied shock propagation in jointed/faulted media using a Lagrangian and two Eulerian approaches. To investigate the accuracy of this joint treatment the GEODYN calculations have been compared with results from the Lagrangian code GEODYN-L which uses an explicit treatment of joints via common plane contact. We explore two approaches to joint treatment in the code, one for joints with finite thickness and the other for tight joints. In all cases the sliding interfaces are tracked explicitly without homogenization or blending the joint and block response into an average response. In general, rock joints will introduce an increase in normal compliance in addition to a reduction in shear strength. In the

  8. Neighboring extremal optimal control design including model mismatch errors

    Energy Technology Data Exchange (ETDEWEB)

    Kim, T.J. [Sandia National Labs., Albuquerque, NM (United States); Hull, D.G. [Texas Univ., Austin, TX (United States). Dept. of Aerospace Engineering and Engineering Mechanics

    1994-11-01

    The mismatch control technique that is used to simplify model equations of motion in order to determine analytic optimal control laws is extended using neighboring extremal theory. The first variation optimal control equations are linearized about the extremal path to account for perturbations in the initial state and the final constraint manifold. A numerical example demonstrates that the tuning procedure inherent in the mismatch control method increases the performance of the controls to the level of a numerically-determined piecewise-linear controller.

  9. Double pendulum model for tennis stroke including a collision process

    CERN Document Server

    Youn, Sun-Hyun

    2015-01-01

    By adding a collision process between the ball and racket to the double pendulum model, we analyzed the tennis stroke. The speed of the rebound ball does not depend simply on the angular velocity of the racket; a higher angular velocity sometimes gives a lower ball speed. We numerically showed that properly time-lagged racket rotation increases the speed of the rebound ball by 20%. We also showed that the elbow should move in order to add to the angular velocity of the racket.
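
    The two ingredients the abstract combines can be sketched in a few lines. The Python below is a hedged illustration with invented parameters (segment lengths, restitution coefficient), not the paper's double pendulum equations: the racket-head speed follows from two-link arm kinematics, and the collision is reduced to a one-dimensional restitution model against a heavy racket:

        L_UPPER, L_FORE = 0.35, 0.65     # upper arm, forearm+racket lengths (m)
        E_EFF = 0.4                      # effective ball-racket restitution

        def racket_head_speed(w_shoulder, w_elbow):
            # planar two-link kinematics: the elbow's angular velocity adds to
            # the shoulder's, which is why moving the elbow raises head speed
            return L_UPPER * w_shoulder + L_FORE * (w_shoulder + w_elbow)

        def rebound_speed(v_racket, v_ball_in):
            # 1-D impact against a much heavier racket:
            # v_out = (1 + e) * v_racket - e * v_in
            return (1 + E_EFF) * v_racket - E_EFF * v_ball_in

        v_in = -10.0                     # ball arriving at 10 m/s
        for w_elbow in (0.0, 10.0):      # rad/s
            v_r = racket_head_speed(15.0, w_elbow)
            print(f"elbow {w_elbow:4.1f} rad/s: racket {v_r:5.2f} m/s, "
                  f"rebound {rebound_speed(v_r, v_in):5.2f} m/s")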

  10. The CESM Workflow Re-Engineering Project

    Science.gov (United States)

    Strand, G.

    2015-12-01

    The Community Earth System Model (CESM) Workflow Re-Engineering Project is a collaborative project between the CESM Software Engineering Group (CSEG) and the NCAR Computation and Information Systems Lab (CISL) Application Scalability and Performance (ASAP) Group to revamp how CESM saves its output. The CMIP3 and particularly CMIP5 experiences in submitting CESM data to those intercomparison projects revealed that the output format of the CESM is not well-suited for the data requirements common to model intercomparison projects. CESM, for efficiency reasons, creates output files containing all fields for each model time sampling, but MIPs require individual files for each field comprising all model time samples. This transposition of model output can be very time-consuming; depending on the volume of data written by the specific simulation, the time to re-orient the data can be comparable to the time required for the simulation to complete. Previous strategies included using serial tools to perform this transposition, but they are now far too inefficient to deal with the many terabytes of output a single simulation can generate. A new set of Python tools, using data parallelism, has been written to enable this re-orientation, and has achieved markedly improved I/O performance. The perspective of a data manager/data producer in the use of these new tools is presented, and likely future work on their development and use will be shown. These tools are a critical part of the NCAR CESM submission to the upcoming CMIP6, with the intention that a much more timely and efficient submission of the expected petabytes of data will be accomplished in the given time frame.
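
    The core re-orientation is easy to state in code. Below is a toy Python/NumPy illustration (not the NCAR tools themselves; the field names are invented) of transposing time-slice output, one file per time step containing all fields, into one time series per field:

        import numpy as np

        FIELDS = ["TS", "PRECT", "PSL"]          # invented field names

        def history_file(t, shape=(4, 8)):
            # stands in for one time-sample output file holding every field
            rng = np.random.default_rng(t)
            return {f: rng.standard_normal(shape) for f in FIELDS}

        time_slices = [history_file(t) for t in range(120)]   # 120 time samples

        # the transposition: field-major time series instead of time-major slices
        time_series = {f: np.stack([s[f] for s in time_slices]) for f in FIELDS}

        for f, arr in time_series.items():
            print(f, arr.shape)                  # (120, 4, 8) per field

    The expensive part at scale is not the reshaping itself but reading many large files and writing many new ones, which is why the production tools parallelize over fields.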

  11. Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources

    Science.gov (United States)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2014-12-01

    The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent CyberShake study.

  12. Modelling of Dual-Junction Solar Cells including Tunnel Junction

    Directory of Open Access Journals (Sweden)

    Abdelaziz Amine

    2013-01-01

    Monolithically stacked multijunction solar cells based on III–V semiconductor materials are the state-of-the-art approach for high efficiency photovoltaic energy conversion, in particular for space applications. The individual subcells of the multi-junction structure are interconnected via tunnel diodes, which must be optically transparent and connect the component cells with a minimum electrical resistance. The quality of these diodes determines the output performance of the solar cell. The purpose of this work is to contribute to the investigation of the tunnel electrical resistance of such a multi-junction cell through the analysis of the current-voltage (J-V) characteristics under illumination. Our approach is based on an equivalent circuit model with a diode for each subcell. We examine the effect of tunnel resistance on the performance of a multi-junction cell using least-squares minimization.
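
    A minimal sketch of the equivalent-circuit idea, assuming two series-connected single-diode subcells and a lumped tunnel-junction series resistance R_t; all device parameters below are invented, and the fit simply recovers R_t from synthetic noisy J-V data with SciPy:

        import numpy as np
        from scipy.optimize import least_squares

        VT = 0.02585                      # thermal voltage at 300 K (V)
        SUBCELLS = [dict(jl=14.0, j0=1e-19, n=1.0),   # top subcell (illustrative)
                    dict(jl=14.5, j0=1e-10, n=1.0)]   # bottom subcell

        def v_of_j(j, r_t):
            # Series stack: subcell voltages add at a common current density,
            # and the tunnel junction contributes an ohmic drop -j*r_t.
            v = -j * r_t
            for c in SUBCELLS:
                v = v + c["n"] * VT * np.log((c["jl"] - j) / c["j0"] + 1.0)
            return v

        j = np.linspace(0.0, 13.5, 60)    # current density grid (units illustrative)
        rng = np.random.default_rng(0)
        v_meas = v_of_j(j, r_t=0.05) + rng.normal(0.0, 2e-3, j.size)

        fit = least_squares(lambda p: v_of_j(j, p[0]) - v_meas, x0=[0.01])
        print("fitted tunnel resistance:", fit.x[0])   # close to the true 0.05

    Writing V as an explicit function of J works here because the subcells share one current; a larger tunnel resistance mainly steepens the ohmic drop near the maximum power point.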

  13. Human sperm chromatin stabilization: a proposed model including zinc bridges.

    Science.gov (United States)

    Björndahl, Lars; Kvist, Ulrik

    2010-01-01

    The primary focus of this review is to challenge the current concepts on sperm chromatin stability. The observations (i) that zinc depletion at ejaculation allows a rapid and total sperm chromatin decondensation without the addition of exogenous disulfide cleaving agents and (ii) that the human sperm chromatin contains one zinc for every protamine for every turn of the DNA helix suggest an alternative model for sperm chromatin structure may be plausible. An alternative model is therefore proposed, that the human spermatozoon could at ejaculation have a rapidly reversible zinc dependent chromatin stability: Zn(2+) stabilizes the structure and prevents the formation of excess disulfide bridges by a single mechanism, the formation of zinc bridges with protamine thiols of cysteine and potentially imidazole groups of histidine. Extraction of zinc enables two biologically totally different outcomes: immediate decondensation if chromatin fibers are concomitantly induced to repel (e.g. by phosphorylation in the ooplasm); otherwise freed thiols become committed into disulfide bridges creating a superstabilized chromatin. Spermatozoa in the zinc rich prostatic fluid (normally the first expelled ejaculate fraction) represent the physiological situation. Extraction of chromatin zinc can be accomplished by the seminal vesicular fluid. Collection of the ejaculate in one single container causes abnormal contact between spermatozoa and seminal vesicular fluid affecting the sperm chromatin stability. There are men in infertile couples with low content of sperm chromatin zinc due to loss of zinc during ejaculation and liquefaction. Tests for sperm DNA integrity may give false negative results due to decreased access for the assay to the DNA in superstabilized chromatin.

  14. Global model including multistep ionizations in helium plasmas

    Science.gov (United States)

    Oh, Seung-Ju; Lee, Hyo-Chang; Chung, Chin-Wook

    2016-12-01

    Particle and power balance equations including stepwise ionization are derived and solved for helium plasmas. In the balance equations, two metastable states (the 2¹S singlet and the 2³S triplet) are considered, and the following results are obtained. The plasma density increases linearly with absorbed power while the electron temperature remains nearly constant. It is also found that the contribution of multi-step ionization relative to single-step ionization is in the range of 8%-23% as the gas pressure increases from 10 mTorr to 100 mTorr. Compared to the results in argon plasma, there is little variation in the collisional energy loss per electron-ion pair created (εc) with absorbed power and gas pressure, due to the small collision cross section and the higher inelastic collision threshold energy.
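
    Schematically (our notation, not necessarily the paper's), the two coupled steady-state balances solved in such global models take the form

        \[ k_{\mathrm{iz}}^{g}(T_e)\, n_g n_e V \;+\; k_{\mathrm{iz}}^{m}(T_e)\, n_m n_e V \;=\; n_e u_B(T_e) A_{\mathrm{eff}} \quad \text{(particle balance)} \]
        \[ P_{\mathrm{abs}} \;=\; e\, n_e u_B(T_e) A_{\mathrm{eff}}\,(\varepsilon_c + \varepsilon_e + \varepsilon_i) \quad \text{(power balance)} \]

    where the first term is single-step ionization from the ground state, the second is the multi-step channel via the metastable density n_m (which obeys its own rate balance), u_B is the Bohm velocity and A_eff the effective loss area. The particle balance fixes T_e almost independently of power, and the power balance then fixes n_e, which is consistent with the nearly constant electron temperature and linearly increasing density reported above.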

  15. Modelization of a water tank including a PCM module

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez, Manuel [Dept. de Medi Ambient i Ciencies del Sol, Universitat de Lleida, Rovira Roure 191, 25198 Lleida (Spain); Cabeza, Luisa F.; Sole, Cristian; Roca, Joan; Nogues, Miquel [Dept. d' Informatica i Eng. Industrial, Universitat de Lleida, Jaume II 69, 25001 Lleida (Spain)

    2006-08-15

    The reduction of CO2 emissions is a key concern for today's governments. Therefore, implementation of more and more systems with renewable energies is necessary. Solar systems for single family houses or residential buildings need a big water tank that is often not easy to accommodate. This paper studies the modelization of a new technology where PCM modules are implemented in domestic hot water tanks to reduce their size without reducing the energy stored. A new TRNSYS component, based on the already existing TYPE 60, was developed, called TYPE 60PCM. After tuning the new component with experimental results, two further experiments, cooldown and reheating, were performed to validate the simulation of a water tank with two cylindrical PCM modules using TYPE 60PCM. Concordance between experimental and simulated data was very good. Since the new TRNSYS component was developed to simulate full solar systems, a comparison of experimental results from a pilot plant solar system with simulations was performed, confirming that TYPE 60PCM is a powerful tool to evaluate the performance of PCM modules in water tanks.

  16. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience.

    Science.gov (United States)

    Stockton, David B; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  17. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology.

    Science.gov (United States)

    Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).

  19. Scientific Process Automation and Workflow Management

    Energy Technology Data Exchange (ETDEWEB)

    Ludaescher, Bertram T.; Altintas, Ilkay; Bowers, Shawn; Cummings, J.; Critchlow, Terence J.; Deelman, Ewa; De Roure, D.; Freire, Juliana; Goble, Carole; Jones, Matt; Klasky, S.; McPhillips, Timothy; Podhorszki, Norbert; Silva, C.; Taylor, I.; Vouk, M.

    2010-01-01

    We introduce and describe scientific workflows, i.e., executable descriptions of automatable scientific processes such as computational science simulations and data analyses. Scientific workflows are often expressed in terms of tasks and their (dataflow) dependencies. This chapter first provides an overview of the characteristic features of scientific workflows and outlines their life cycle. A detailed case study highlights workflow challenges and solutions in simulation management. We then provide a brief overview of how some concrete systems support the various phases of the workflow life cycle, i.e., design, resource management, execution, and provenance management. We conclude with a discussion on community-based workflow sharing.

  20. Office 2010 Workflow Developing Collaborative Solutions

    CERN Document Server

    Mann, David; Enterprises, Creative

    2010-01-01

    Workflow is the glue that binds information worker processes, users, and artifacts. Without workflow, information workers are just islands of data and potential. Office 2010 Workflow details how to implement workflow in SharePoint 2010 and the client Microsoft Office 2010 suite to help information workers share data, enforce processes and business rules, and work more efficiently together or solo. This book covers everything you need to know, from what workflow is all about to creating new activities; from the SharePoint Designer to Visual Studio 2010; from out-of-the-box workflows to state machine workflows.

  1. The potential of Petri nets in the modeling and analysis of business process workflows

    Directory of Open Access Journals (Sweden)

    Sílvia Inês Dallavalle de Pádua

    2004-04-01

    Workflow management technology seeks to offer a flexible solution in support of business processes by facilitating modifications and the creation of new processes. However, the lack of a well-formalized definition of the syntax and semantics of such techniques hinders more complex analyses of the models. Here Petri nets have excellent potential, since they have a graphical representation, are easy to learn, serve as a communication language among specialists from several areas, make it possible to describe static and dynamic aspects of the system to be represented, and possess the mathematical formalism needed for well-established analysis methods. The primary objective of this work is to offer an up-to-date view of the state of the art in Petri-net-based workflow modeling and to present an example of a workflow model with time and cost parameters associated with the transitions of the net. The secondary objective is to present important concepts of Petri nets, workflow, and process routes.
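
    As a minimal sketch of the kind of model the paper describes (our own construction, with invented place and transition names), the Python below fires a toy Petri net whose transitions carry time and cost parameters, accumulating both along the executed route:

        NET = {
            # transition: (input places, output places, time, cost)
            "receive_order": (["start"],    ["received"], 0.5, 1.0),
            "check_credit":  (["received"], ["checked"],  1.0, 3.0),
            "ship_goods":    (["checked"],  ["done"],     2.0, 8.0),
        }
        marking = {"start": 1, "received": 0, "checked": 0, "done": 0}

        def enabled(t):
            return all(marking[p] >= 1 for p in NET[t][0])

        def fire(t):
            ins, outs, dt, dc = NET[t]
            for p in ins:
                marking[p] -= 1
            for p in outs:
                marking[p] += 1
            return dt, dc

        total_time = total_cost = 0.0
        progress = True
        while progress:                  # fire until no transition is enabled
            progress = False
            for t in NET:
                if enabled(t):
                    dt, dc = fire(t)
                    total_time += dt
                    total_cost += dc
                    progress = True
        print(marking, total_time, total_cost)   # done=1, 3.5, 12.0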

  2. Workflow processes in the cloud

    OpenAIRE

    Peralta,Mario; Salgado, Carlos Humberto; Baigorria, Lorena; Montejano, Germán Antonio; Riesco, Daniel Eduardo

    2014-01-01

    Given the globalization of information, organizations tend to virtualize their businesses: to move their business to the Cloud. From the perspective of the complexity of business processes, one of the most significant technologies for supporting their automation are Workflow Management Systems, which provide computational support for defining, synchronizing and executing process activities using workflows. To support and add flexibility to such systems, it is essential to have tools...

  3. Analysing scientific workflows: why workflows not only connect web services

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; Wolstencroft, K.; Neerincx, P.B.T.; Roos, M.; Rauwerda, H.; Breit, T.M.; Zhang, LJ.

    2009-01-01

    Life science workflow systems are developed to help life scientists to conveniently connect various programs and web services. In practice however, much time is spent on data conversion, because web services provided by different organisations use different data formats. We have analysed all the...

  5. Workflows for microarray data processing in the Kepler environment

    Directory of Open Access Journals (Sweden)

    Stropp Thomas

    2012-05-01

    Background: Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results: We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to...

  6. A scientific workflow framework for (13)C metabolic flux analysis.

    Science.gov (United States)

    Dalman, Tolga; Wiechert, Wolfgang; Nöh, Katharina

    2016-08-20

    Metabolic flux analysis (MFA) with (13)C labeling data is a high-precision technique to quantify intracellular reaction rates (fluxes). One of the major challenges of (13)C MFA is the interactivity of the computational workflow according to which the fluxes are determined from the input data (metabolic network model, labeling data, and physiological rates). Here, the workflow assembly is inevitably determined by the scientist who has to consider interacting biological, experimental, and computational aspects. Decision-making is context dependent and requires expertise, rendering an automated evaluation process hardly possible. Here, we present a scientific workflow framework (SWF) for creating, executing, and controlling on demand (13)C MFA workflows. (13)C MFA-specific tools and libraries, such as the high-performance simulation toolbox 13CFLUX2, are wrapped as web services and thereby integrated into a service-oriented architecture. Besides workflow steering, the SWF features transparent provenance collection and enables full flexibility for ad hoc scripting solutions. To handle compute-intensive tasks, cloud computing is supported. We demonstrate how the challenges posed by (13)C MFA workflows can be solved with our approach on the basis of two proof-of-concept use cases.

  7. CO2 Storage Feasibility: A Workflow for Site Characterisation

    Directory of Open Access Journals (Sweden)

    Nepveu Manuel

    2015-04-01

    In this paper, we present an overview of the SiteChar workflow model for site characterisation and assessment for CO2 storage. Site characterisation and assessment is required when permits are requested from the legal authorities in the process of starting CO2 storage at a given site. The goal is to assess whether a proposed CO2 storage site can indeed be used for permanent storage while meeting the safety requirements demanded by the European Commission Storage Directive (2009/31/EC). Many issues have to be scrutinised, and the workflow presented here is put forward to help organise this complex task efficiently. Three issues are highlighted: communication within the working team and with the authorities; interdependencies in the workflow and feedback loops; and the risk-based character of the workflow. A general overview (helicopter view) of the workflow is given; the issues involved in communication and the risk assessment process are described in more detail. The workflow as described has been tested within the SiteChar project on five potential storage sites throughout Europe. This resulted in a list of key aspects of site characterisation which can help prepare and focus new site characterisation studies.

  8. Implementation Recommendations for MOSAIC: A Workflow Architecture for Analytic Enrichment. Analysis and Recommendations for the Implementation of a Cohesive Method for Orchestrating Analytics in a Distributed Model

    Science.gov (United States)

    2011-02-01

    ...recognition and genre detection. This analytic work must happen before any of the active workflows execute on the incoming document, so it is... documents of a particular language or genre, and this information must be made available to all of them up front. Any further or deeper analysis should... of the structure of an Item to us, but still leaves the location of the document text a mystery. We did some exploring by using the GUI to create a...

  9. Big data analytics workflow management for eScience

    Science.gov (United States)

    Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni

    2015-04-01

    In many domains such as climate and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these tools fail at large scale since they: (i) are desktop based, rely on local computing capabilities and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks; and (iv) do not provide newer or more scalable storage models to better support data multidimensionality. Additionally, most of them (v) are domain-specific, which also means they support a limited set of data formats, and (vi) do not provide workflow support to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, and (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (i.e. max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. Defining processing chains and workflows with tens or hundreds of data analytics operators is the...
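
    The array-style operators listed above are easy to mimic on a small scale. The snippet below gives plain NumPy analogues (a toy sketch, not the Ophidia implementation) on a (time, lat, lon) data cube:

        import numpy as np

        cube = np.random.default_rng(0).standard_normal((365, 90, 180))

        sliced  = cube[0]                  # slicing: fix (and drop) one dimension
        diced   = cube[:30, 40:50, 0:90]   # dicing: keep a sub-range on each axis
        monthly = cube[:360].reshape(12, 30, 90, 180).mean(axis=1)   # aggregation
        maxima  = cube.max(axis=0)         # array primitive applied per grid cell
        pivoted = cube.transpose(1, 2, 0)  # pivoting: reorder the dimensions

        for name, a in [("sliced", sliced), ("diced", diced),
                        ("monthly", monthly), ("maxima", maxima),
                        ("pivoted", pivoted)]:
            print(name, a.shape)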

  10. Concurrency & Asynchrony in Declarative Workflows

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    Declarative or constraint-based business process and workflow notations have received increasing interest in the last decade as possible means of addressing the challenge of supporting at the same time flexibility in execution, adaptability and compliance. However, the definition of concurrent se... ...development has been verified correct in the Isabelle/HOL interactive theorem prover.

  12. New Interactions with Workflow Systems

    NARCIS (Netherlands)

    Wassink, I.; Vet, van der P.E.; Veer, van der G.C.; Roos, M.; Dijk, van E.M.A.G.; Norros, L.; Koskinen, H.; Salo, L.; Savioja, P.

    2009-01-01

    This paper describes the evaluation of our early design ideas for an ad-hoc workflow system. Using the teach-back technique, we have performed a hermeneutic analysis of the mockup implementation named NIWS to get corrective and creative feedback at the functional, dialogue and representation levels.

  14. Constructing workflows from script applications

    NARCIS (Netherlands)

    Baranowski, M.; Belloum, A.; Bubak, M.; Malawski, M.

    2012-01-01

    For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as a convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment...

  15. Adobe Photoshop Lightroom and Photoshop workflow bible

    CERN Document Server

    Fitzgerald, Mark

    2013-01-01

    The digital photographer's workflow is divided into two distinct parts - the Production Workflow and the Creative Workflow. The Production workflow is used to import and organize large numbers of images, and prepare them for presentation via proof printing, Web, or slideshow. Increasingly, photographers are turning to Adobe's acclaimed new Lightroom software to manage this part of the workflow. After the best images are identified, photographers move to the second part of the workflow, the Creative Workflow, to fine-tune special images using a variety of advanced digital tools so that the creative vision is realized. An overwhelming majority of digital photographers use Photoshop for this advanced editing. Adobe Photoshop Lightroom & Photoshop Workflow Bible effectively guides digital photographers through both parts of this process. Author Mark Fitzgerald, an Adobe Certified Expert and Adobe Certified Instructor in Photoshop CS3 offers readers a clear path to using both Lightroom 2 and Photoshop CS3 to c...

  16. Hierarchical and structured audit business workflow modeling based on Petri net

    Institute of Scientific and Technical Information of China (English)

    王卫东; 周国祥

    2012-01-01

    Based on an analysis of the audit business process of enterprises, and using Petri nets as the workflow modeling tool, a hierarchical and structured modeling approach for audit business workflows is proposed, introducing the concepts of abstract transitions and audit models. The model overcomes the limitation of traditional workflow modeling methods, which lack process suppleness and system flexibility in handling audit business, and describes the audit process more intuitively and clearly. Finally, an example is presented to illustrate the modeling method.

  17. Design and implementation of a secure workflow system based on PKI/PMI

    Science.gov (United States)

    Yan, Kai; Jiang, Chao-hui

    2013-03-01

    Traditional workflow systems have several weaknesses in privilege management: low privilege management efficiency, an overburdened administrator, and the lack of a trusted authority. A secure workflow model based on PKI/PMI is therefore proposed after an in-depth study of the security requirements of workflow systems. This model achieves static and dynamic authorization by verifying the user's identity through a PKC and validating the user's privilege information using an AC in the workflow system. Practice shows that this system can meet the security requirements of a WfMS. Moreover, it not only improves system security but also ensures the integrity, confidentiality, availability and non-repudiation of the data in the system.

  18. An Administrative Model for Task-Role Based Access Control in Workflow Systems

    Institute of Scientific and Technical Information of China (English)

    张晶; 杨国林; 萨智海

    2011-01-01

    To overcome the security weaknesses of existing access control models for workflow systems, a new model called the Administrative Model for Task-Role Based Access Control (ATRBAC) is presented in this paper. In this model the administrative idea of ARBAC (Administrative Model for Role Based Access Control) is integrated into TRBAC (Task-Role Based Access Control) by adding administrators and administrative privileges. By applying hierarchical management to administrators, the hidden risk caused by excessive system-administrator privileges is resolved, and the security of the workflow system is further strengthened.
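
    A hedged Python sketch of the core idea follows, with invented roles, tasks and administrator scopes rather than the paper's formal definitions: users act through roles, roles authorize tasks, and each administrator may grant only roles within its own scope, so no single administrator is over-privileged:

        ROLE_TASKS = {"clerk": {"submit_form"}, "reviewer": {"approve_form"}}
        USER_ROLES = {"alice": {"clerk"}, "bob": {"reviewer"}}

        # hierarchical administration: each administrator manages certain roles only
        ADMIN_SCOPE = {"dept_admin": {"clerk"}, "audit_admin": {"reviewer"}}

        def can_perform(user, task):
            return any(task in ROLE_TASKS[r] for r in USER_ROLES.get(user, ()))

        def grant(admin, user, role):
            if role not in ADMIN_SCOPE.get(admin, ()):
                raise PermissionError(f"{admin} may not administer {role!r}")
            USER_ROLES.setdefault(user, set()).add(role)

        print(can_perform("alice", "approve_form"))   # False
        grant("audit_admin", "alice", "reviewer")     # inside this admin's scope
        print(can_perform("alice", "approve_form"))   # True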

  19. The Workflow Specification of Process Definition

    Institute of Scientific and Technical Information of China (English)

    缪晓阳; 石文俊; 吴朝晖

    2000-01-01

    This paper discusses how to represent a business process in a formal way. There are three basic aspects: the concept of workflow process definition, from which the idea of process definition interchange is raised; the meta-model of workflow, which is used to describe the entities and the attributes of entities within the process definition; and the workflow process definition language (WPDL), which is used to implement the process definition.

  20. Execution Time Estimation for Workflow Scheduling

    NARCIS (Netherlands)

    Chirkin, A.M.; Belloum, A..S.Z.; Kovalchuk, S.V.; Makkes, M.X.

    2014-01-01

    Estimation of the execution time is an important part of the workflow scheduling problem. The aim of this paper is to highlight common problems in estimating the workflow execution time and propose a solution that takes into account the complexity and the randomness of the workflow components and the...

  1. PRODUCT-ORIENTED WORKFLOW MANAGEMENT IN CAPP

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A product-oriented process workflow management model is proposed based on multi-agent technology. The autonomy, inter-operability, scalability and flexibility of agents are used to coordinate the whole process planning and achieve full sharing of resources and information. Thus, unnecessary waste of human labor, time and work is reduced, and the adaptability and stability of the computer-aided process planning (CAPP) system are improved. In the detailed implementation, according to the product's BOM (bill of materials) from structural design, task assignment, management control, automatic process generation, process examination and process sanction are combined into a unified management scheme to make adjustment, control and management convenient.

  2. CMS data and workflow management system

    CERN Document Server

    Fanfani, A; Bacchi, W; Codispoti, G; De Filippis, N; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Silvestris, L; Calzolari, F; Sarkar, S; Spiga, D; Cinquili, M; Lacaprara, S; Biasotto, M; Farina, F; Merlo, M; Belforte, S; Kavka, C; Sala, L; Harvey, J; Hufnagel, D; Fanzago, F; Corvo, M; Magini, N; Rehn, J; Toteva, Z; Feichtinger, D; Tuura, L; Eulisse, G; Bockelman, B; Lundstedt, C; Egeland, R; Evans, D; Mason, D; Gutsche, O; Sexton-Kennedy, L; Dagenhart, D W; Afaq, A; Guo, Y; Kosyakov, S; Lueking, L; Sekhri, V; Fisk, I; McBride, P; Bauerdick, L; Bakken, J; Rossman, P; Wicklund, E; Wu, Y; Jones, C; Kuznetsov, V; Riley, D; Dolgert, A; van Lingen, F; Narsky, I; Paus, C; Klute, M; Gomez-Ceballos, G; Piedra-Gomez, J; Miller, M; Mohapatra, A; Lazaridis, C; Bradley, D; Elmer, P; Wildish, T; Wuerthwein, F; Letts, J; Bourilkov, D; Kim, B; Smith, P; Hernandez, J M; Caballero, J; Delgado, A; Flix, J; Cabrillo-Bartolome, I; Kasemann, M; Flossdorf, A; Stadie, H; Kreuzer, P; Khomitch, A; Hof, C; Zeidler, C; Kalini, S; Trunov, A; Saout, C; Felzmann, U; Metson, S; Newbold, D; Geddes, N; Brew, C; Jackson, J; Wakefield, S; De Weirdt, S; Adler, V; Maes, J; Van Mulders, P; Villella, I; Hammad, G; Pukhaeva, N; Kurca, T; Semneniouk, I; Guan, W; Lajas, J A; Teodoro, D; Gregores, E; Baquero, M; Shehzad, A; Kadastik, M; Kodolova, O; Chao, Y; Ming Kuo, C; Filippidis, C; Walzel, G; Han, D; Kalinowski, A; Giro de Almeida, N M; Panyam, N

    CMS expects to manage many tens of petabytes of data to be distributed over several computing centers around the world. The CMS distributed computing and analysis model is designed to serve, process and archive the large number of events that will be generated when the CMS detector starts taking data. The underlying concepts and the overall architecture of the CMS data and workflow management system will be presented. In addition, the experience in using the system for MC production, initial detector commissioning activities and data analysis will be summarized.

  3. Nursing medication administration and workflow using computerized physician order entry.

    Science.gov (United States)

    Tschannen, Dana; Talsma, Akkeneel; Reinemeyer, Nicholas; Belt, Christine; Schoville, Rhonda

    2011-07-01

    The benefits of computerized physician order entry systems have been described widely; however, the impact of computerized physician order entry on nursing workflow and its potential for error are unclear. The purpose of this study was to determine the impact of a computerized physician order entry system on nursing workflow. Using an exploratory design, nurses employed on an adult ICU (n = 36) and a general pediatric unit (n = 50) involved in computerized physician order entry-based medication delivery were observed. Nurses were also asked questions regarding the impact of computerized physician order entry on nursing workflow. Observations revealed total time required for administering medications averaged 8.45 minutes in the ICU and 9.93 minutes in the pediatric unit. Several additional steps were required in the process for pediatric patients, including preparing the medications and communicating with patients and family, which resulted in greater time associated with the delivery of medications. Frequent barriers to workflow were noted by nurses across settings, including system issues (ie, inefficient medication reconciliation processes, long order sets requiring more time to determine medication dosage), less frequent interaction between the healthcare team, and greater use of informal communication modes. Areas for nursing workflow improvement include (1) medication reconciliation/order duplication, (2) strategies to improve communication, and (3) evaluation of the impact of computerized physician order entry on practice standards.

  4. Information Issues and Contexts that Impair Team Based Communication Workflow: A Palliative Sedation Case Study.

    Science.gov (United States)

    Cornett, Alex; Kuziemsky, Craig

    2015-01-01

    Implementing team based workflows can be complex because of the scope of providers involved and the extent of information exchange and communication that needs to occur. While a workflow may represent the ideal structure of communication that needs to occur, information issues and contextual factors may impact how the workflow is implemented in practice. Understanding these issues will help us better design systems to support team based workflows. In this paper we use a case study of palliative sedation therapy (PST) to model a PST workflow and then use it to identify purposes of communication, information issues and contextual factors that impact them. We then suggest how our findings could inform health information technology (HIT) design to support team based communication workflows.

  5. Workflow-based approaches to neuroimaging analysis.

    Science.gov (United States)

    Fissell, Kate

    2007-01-01

    Analysis of functional and structural magnetic resonance imaging (MRI) brain images requires a complex sequence of data processing steps to proceed from raw image data to the final statistical tests. Neuroimaging researchers have begun to apply workflow-based computing techniques to automate data analysis tasks. This chapter discusses eight major components of workflow management systems (WFMSs): the workflow description language, editor, task modules, data access, verification, client, engine, and provenance, and their implementation in the Fiswidgets neuroimaging workflow system. Neuroinformatics challenges involved in applying workflow techniques in the domain of neuroimaging are discussed.
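
    To make the "engine" component concrete, the sketch below (generic Python, not Fiswidgets code; the step names are invented neuroimaging-style stages) executes task modules in dependency order and records a simple execution trace:

        from graphlib import TopologicalSorter   # Python 3.9+

        def realign(img): return img + ["realigned"]
        def smooth(img):  return img + ["smoothed"]
        def glm(img):     return img + ["stats"]

        TASKS = {"realign": realign, "smooth": smooth, "glm": glm}
        DEPS  = {"realign": set(), "smooth": {"realign"}, "glm": {"smooth"}}

        def run_workflow(raw):
            artefacts, trace = {}, []
            for step in TopologicalSorter(DEPS).static_order():
                deps = list(DEPS[step])
                source = artefacts[deps[0]] if deps else raw   # single-input chain
                artefacts[step] = TASKS[step](source)
                trace.append(step)        # a minimal provenance record
            return artefacts, trace

        arts, trace = run_workflow(["raw_image"])
        print(trace)        # ['realign', 'smooth', 'glm']
        print(arts["glm"])  # ['raw_image', 'realigned', 'smoothed', 'stats']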

  6. Integrated Sensitivity Analysis Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Friedman-Hill, Ernest J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoffman, Edward L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gibson, Marcus J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Clay, Robert L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-08-01

    Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.

  7. Development of the workflow line systems for support on KAIZEN.

    Science.gov (United States)

    Mizuno, Yuki; Ito, Toshihiko; Yoshikawa, Toru; Yomogida, Satoshi; Morio, Koji; Sakai, Kazuhiro

    2012-01-01

    In this paper, we introduce a new workflow line system consisting of location and image recording, which enables the acquisition of workflow information and its analysis and display. From the results of the workflow line investigation, we considered the anticipated effects of, and the problems with, KAIZEN. Workflow line information includes location information and action content information. These technologies suggest viewpoints that help improvement, for example, the exclusion of useless movement, the redesign of layout and the review of work procedures. In a manufacturing factory, it was clear that there was much movement away from the standard operation place and accumulated residence time. As a concrete result of this investigation, an efficient layout was suggested by this system. In the case of a hospital, similarly, it was pointed out that the workflow has problems of layout and setup operations, based on the effective movement patterns of experts. This system can adapt to routine as well as non-routine work. Through the development of this system, which can fit and adapt to industrial diversification, more effective "visual management" (visualization of work) is expected in the future.

  8. How to plan workflow changes: a practical quality improvement tool used in an outpatient hospital pharmacy.

    Science.gov (United States)

    Aguilar, Christine; Chau, Connie; Giridharan, Neha; Huh, Youchin; Cooley, Janet; Warholak, Terri L

    2013-06-01

    A quality improvement tool is provided to improve pharmacy workflow with the goal of minimizing errors caused by workflow issues. This study involved workflow evaluation and reorganization, and staff opinions of these proposed changes. The study pharmacy was an outpatient pharmacy in the Tucson area. However, the quality improvement tool may be applied in all pharmacy settings, including but not limited to community, hospital, and independent pharmacies. This tool can help the user to identify potential workflow problem spots, such as high-traffic areas through the creation of current and proposed workflow diagrams. Creating a visual representation can help the user to identify problem spots and to propose changes to optimize workflow. It may also be helpful to assess employees' opinions of these changes. The workflow improvement tool can be used to assess where improvements are needed in a pharmacy's floor plan and workflow. Suggestions for improvements in the study pharmacy included increasing the number of verification points and decreasing high traffic areas in the workflow. The employees of the study pharmacy felt that the proposed changes displayed greater continuity, sufficiency, accessibility, and space within the pharmacy.

  9. An Overview of Workflow Management on Mobile Agent Technology

    Directory of Open Access Journals (Sweden)

    Anup Patnaik

    2014-07-01

    Mobile agent workflow management/plugins are well suited to handling control flows in open distributed systems; this emerging technology allows process-oriented tasks from diverse frameworks to run as a single unit. Workflow technology offers organizations the opportunity to reshape business processes beyond the boundaries of their own organizations, so that instead of static models, the modern era brings dynamic workflows that can respond to changes during execution, provide the necessary security measures, offer a great degree of adaptivity, troubleshoot running processes, and recover lost states through fault tolerance. The prototype that we are planning to design is intended to ensure reliability, security, robustness and scalability without being forced to trade off performance. This paper is concerned with the design, implementation and performance evaluation of the improved methods of the proposed prototype models based on current research in this domain.

  10. BALANCED SCORECARDS EVALUATION MODEL THAT INCLUDES ELEMENTS OF ENVIRONMENTAL MANAGEMENT SYSTEM USING AHP MODEL

    Directory of Open Access Journals (Sweden)

    Jelena Jovanović

    2010-03-01

    The research is oriented towards improvement of an environmental management system (EMS) using the BSC (Balanced Scorecard) model, which presents a strategic model for the measurement and improvement of organisational performance. The research presents an approach for involving environmental management objectives and metrics (proposed by a literature review) in a conventional BSC in the "AD Barska plovidba" organisation. Further, we test the creation of an ECO-BSC model based on the business activities of non-profit organisations in order to improve the environmental management system in parallel with other management systems. Using this approach we may obtain four BSC models that include elements of the environmental management system for AD "Barska plovidba". Taking into account that implementation and evaluation need a long period of time in AD "Barska plovidba", the final choice will be based on ISO/IEC 14598 (Information technology - Software product evaluation) and ISO 9126 (Software engineering - Product quality) using the AHP method. Those standards are usually used for the evaluation of software product quality and of computer programs that serve in an organisation as support and factors for development. The AHP model will be based on evaluation criteria following the suggestions of the ISO 9126 standard and on the assessments of two evaluation teams. Members of team 1 will be experts in BSC and environmental management systems who are not employed in the AD "Barska plovidba" organisation. The members of team 2 will be managers of the AD "Barska plovidba" organisation (including managers from the environmental department). By merging the results of the two previously created AHP models, one can obtain the most appropriate BSC that includes elements of the environmental management system. The chosen model will at the same time suggest an approach for including ecological metrics in a conventional BSC model for a firm that has at least one ECO strategic orientation.
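
    For reference, the AHP step can be sketched in a few lines of Python: derive priority weights as the principal eigenvector of a pairwise-comparison matrix and check Saaty's consistency ratio. The judgement matrix below is illustrative, not one elicited from the evaluation teams:

        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])          # reciprocal judgement matrix

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                             # priority vector

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index
        print("weights:", np.round(w, 3))
        print("consistency ratio:", round(ci / ri, 3))   # below 0.1 is acceptable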

  11. CamBAfx: Workflow Design, Implementation and Application for Neuroimaging.

    Science.gov (United States)

    Ooi, Cinly; Bullmore, Edward T; Wink, Alle-Meije; Sendur, Levent; Barnes, Anna; Achard, Sophie; Aspden, John; Abbott, Sanja; Yue, Shigang; Kitzbichler, Manfred; Meunier, David; Maxim, Voichita; Salvador, Raymond; Henty, Julian; Tait, Roger; Subramaniam, Naresh; Suckling, John

    2009-01-01

    CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platform application, CamBAfx's pipelines and functions can be bundled with the software or downloaded post-installation. The user interface contains all the workflow facilities expected by consumers. Using the Eclipse Extension Mechanism designers are encouraged to customize CamBAfx for their own pipelines. CamBAfx wraps a workflow facility around neuroinformatics software without modification. CamBAfx's design, licensing and Eclipse Branding Mechanism allow it to be used as the user interface for other software, facilitating exchange of innovative computational tools between originating labs.

  13. Scheduling Multilevel Deadline-Constrained Scientific Workflows on Clouds Based on Cost Optimization

    Directory of Open Access Journals (Sweden)

    Maciej Malawski

    2015-01-01

    This paper presents a cost optimization model for scheduling scientific workflows on IaaS clouds such as Amazon EC2 or RackSpace. We assume multiple IaaS clouds with heterogeneous virtual machine instances, with a limited number of instances per cloud and hourly billing. Input and output data are stored on a cloud object store such as Amazon S3. Applications are scientific workflows modeled as DAGs, as in the Pegasus Workflow Management System. We assume that tasks in the workflows are grouped into levels of identical tasks. Our model is specified using mathematical programming languages (AMPL and CMPL) and allows us to minimize the cost of workflow execution under deadline constraints. We present results obtained using our model and benchmark workflows representing real scientific applications in a variety of domains. The data used for evaluation come from synthetic workflows and from general-purpose cloud benchmarks, as well as from data measured in our own experiments with Montage, an astronomical application, executed on the Amazon EC2 cloud. We indicate how this model can be used for scenarios that require resource planning for scientific workflows and their ensembles.
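
    As a toy illustration of the optimization at the heart of such a model (our own LP relaxation with invented numbers, not the paper's AMPL/CMPL formulation), the snippet below chooses instance-hours per VM type so that one workflow level finishes before the deadline at minimum cost:

        import numpy as np
        from scipy.optimize import linprog

        price = np.array([0.10, 0.45, 1.20])   # $ per instance-hour, per type
        speed = np.array([1.0, 4.0, 10.0])     # tasks completed per instance-hour
        limit = np.array([20, 10, 4])          # instance limit per type
        deadline, work = 6.0, 300.0            # hours, tasks in this level

        # minimize price.h  s.t.  speed.h >= work,  0 <= h_i <= limit_i * deadline
        res = linprog(c=price,
                      A_ub=-speed[None, :], b_ub=[-work],
                      bounds=[(0, l * deadline) for l in limit],
                      method="highs")
        print("instance-hours per type:", np.round(res.x, 2))
        print("minimum cost ($):", round(res.fun, 2))

    The real model additionally has to respect hourly (integer) billing and per-cloud instance counts over time, which is what pushes it into mixed-integer territory.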

  14. Scientific Workflows and the Sensor Web for Virtual Environmental Observatories

    Science.gov (United States)

    Simonis, I.; Vahed, A.

    2008-12-01

    ...interfaces. All data sets and sensor communication follow well-defined abstract models and corresponding encodings, mostly developed by the OGC Sensor Web Enablement initiative. Scientific progress is currently accelerated by an emerging new concept called scientific workflows, which organize and manage complex distributed computations. A scientific workflow represents and records the highly complex processes that a domain scientist typically would follow in exploration, discovery and, ultimately, transformation of raw data to publishable results. The challenge is now to integrate the benefits of scientific workflows with those provided by the Sensor Web in order to leverage all resources for scientific exploration, problem solving, and knowledge generation. Scientific workflows for the Sensor Web represent the next evolutionary step towards efficient, powerful, and flexible earth observation frameworks and platforms. Those platforms support the entire process, from capturing data, through sharing and integrating, to requesting additional observations. Multiple sites and organizations will participate on single platforms, and scientists from different countries and organizations will interact and contribute to large-scale research projects. Simultaneously, the data and information overload becomes manageable, as multiple layers of abstraction will free scientists from dealing with underlying data, processing or storage peculiarities. The vision is automated investigation and discovery mechanisms that allow scientists to pose queries to the system, which in turn identifies potentially related resources, schedules processing tasks and assembles all parts into workflows that may satisfy the query.

  15. Scientific Workflows + Provenance = Better (Meta-)Data Management

    Science.gov (United States)

    Ludaescher, B.; Cuevas-Vicenttín, V.; Missier, P.; Dey, S.; Kianmajd, P.; Wei, Y.; Koop, D.; Chirigati, F.; Altintas, I.; Belhajjame, K.; Bowers, S.

    2013-12-01

    The origin and processing history of an artifact is known as its provenance. Data provenance is an important form of metadata that explains how a particular data product came about, e.g., how and when it was derived in a computational process, which parameter settings and input data were used, etc. Provenance information provides transparency and helps to explain and interpret data products. Other common uses and applications of provenance include quality control, data curation, result debugging, and, more generally, 'reproducible science'. Scientific workflow systems (e.g. Kepler, Taverna, VisTrails, and others) provide controlled environments for developing computational pipelines with built-in provenance support. Workflow results can then be explained in terms of workflow steps, parameter settings, input data, etc. using provenance that is automatically captured by the system. Scientific workflows themselves provide a user-friendly abstraction of the computational process and are thus a form of ('prospective') provenance in their own right. The full potential of provenance information is realized when combining workflow-level information (prospective provenance) with trace-level information (retrospective provenance). To this end, the DataONE Provenance Working Group (ProvWG) has developed an extension of the W3C PROV standard, called D-PROV. Whereas PROV provides a 'least common denominator' for exchanging and integrating provenance information, D-PROV adds new 'observables' that describe workflow-level information (e.g., the functional steps in a pipeline), as well as workflow-specific trace-level information (e.g., timestamps for each workflow step executed, the inputs and outputs used, etc.). Using examples, we will demonstrate how the combination of prospective and retrospective provenance provides added value in managing scientific data. The DataONE ProvWG is also developing tools based on D-PROV that allow scientists to get more mileage from provenance metadata.
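
    A library-free sketch of the core idea, with invented step and field names rather than the D-PROV vocabulary: the declared pipeline is the prospective provenance, the recorded trace is the retrospective provenance, and joining the two explains a data product.

```python
# Minimal sketch of combining prospective and retrospective provenance.
# Step and field names are illustrative, not the D-PROV schema.
import time

PIPELINE = [                      # prospective provenance: declared steps
    {"step": "clean",  "params": {"dropna": True}},
    {"step": "derive", "params": {"window": 7}},
]

trace = []                        # retrospective provenance: what actually ran

def run_step(decl, data):
    start = time.time()
    result = f"{data}->{decl['step']}"      # stand-in for real processing
    trace.append({
        "step": decl["step"], "params": decl["params"],
        "input": data, "output": result,
        "started": start, "ended": time.time(),
    })
    return result

data = "raw.csv"
for decl in PIPELINE:
    data = run_step(decl, data)

# A data product can now be explained by joining both views:
for d, t in zip(PIPELINE, trace):
    print(f"{t['output']!r} produced by declared step {d['step']!r} "
          f"with params {t['params']} from {t['input']!r}")
```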

  16. Research on a Custom Job Descriptive Model for Digital Printing Workflow

    Institute of Scientific and Technical Information of China (English)

    朱明; 李晓春

    2012-01-01

    A custom printing job descriptive model was designed based on the "resource link process" design thought of the JDF model, using existing low-level function modules. A digital printing workflow software platform based on the new model was developed, and the applicability of the model was tested. The model is described in XML. Compared with the JDF model, the logical structure of the custom printing job descriptive model is simple; it can not only meet the need for function customization of enterprise production processes, but also greatly reduce the development costs of a printing workflow software platform.
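
    A rough sketch of what a "resource link process" style job ticket could look like when serialized with Python's standard XML tooling; all element and attribute names below are hypothetical and belong neither to the paper's schema nor to JDF.

```python
# Minimal sketch of a "resource link process" style job ticket in XML
# (element and attribute names are invented for illustration).
import xml.etree.ElementTree as ET

job = ET.Element("Job", id="J001")
res = ET.SubElement(job, "Resources")
ET.SubElement(res, "Resource", id="pdf-in", type="RunList")
ET.SubElement(res, "Resource", id="plates", type="ExposedMedia")

proc = ET.SubElement(job, "Processes")
rip = ET.SubElement(proc, "Process", name="RIP")
ET.SubElement(rip, "Input",  ref="pdf-in")   # link process to resources
ET.SubElement(rip, "Output", ref="plates")

print(ET.tostring(job, encoding="unicode"))
```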

  17. Assessment of the Nurse Medication Administration Workflow Process

    Directory of Open Access Journals (Sweden)

    Nathan Huynh

    2016-01-01

    Full Text Available This paper presents findings of an observational study of the Registered Nurse (RN) Medication Administration Process (MAP) conducted on two comparable medical units in a large urban tertiary care medical center in Columbia, South Carolina. A total of 305 individual MAP observations were recorded over a 6-week period, with an average of 5 MAP observations per RN participant for both clinical units. A key MAP variation was identified in terms of unbundled versus bundled MAP performance. In the unbundled workflow, an RN engages in the MAP by performing only MAP tasks during a care episode. In the bundled workflow, an RN completes medication administration along with other patient care responsibilities during the care episode. Using a discrete-event simulation model, this paper addresses the difference between unbundled and bundled workflows and their effects on simulated redesign interventions.

  18. Assessment of the Nurse Medication Administration Workflow Process

    Science.gov (United States)

    Snyder, Rita; Vidal, José M.; Sharif, Omor; Cai, Bo; Parsons, Bridgette; Bennett, Kevin

    2016-01-01

    This paper presents findings of an observational study of the Registered Nurse (RN) Medication Administration Process (MAP) conducted on two comparable medical units in a large urban tertiary care medical center in Columbia, South Carolina. A total of 305 individual MAP observations were recorded over a 6-week period with an average of 5 MAP observations per RN participant for both clinical units. A key MAP variation was identified in terms of unbundled versus bundled MAP performance. In the unbundled workflow, an RN engages in the MAP by performing only MAP tasks during a care episode. In the bundled workflow, an RN completes medication administration along with other patient care responsibilities during the care episode. Using a discrete-event simulation model, this paper addresses the difference between unbundled and bundled workflow and their effects on simulated redesign interventions.
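
    A minimal discrete-event sketch of the bundled/unbundled contrast using SimPy, as a stand-in for the study's simulation model; the task durations and interleaving scheme are invented, not the observed data.

```python
# Toy DES: in bundled mode, medication tasks interleave with other care,
# so medications finish later within the episode. Numbers are made up.
import random
import simpy

MED, CARE = 4.0, 6.0          # mean minutes per episode (hypothetical)
CHUNKS = 4                    # interleaving granularity in bundled mode

def episode(env, bundled, med_done_times):
    start = env.now
    if bundled:               # meds interleaved with other patient care
        for _ in range(CHUNKS):
            yield env.timeout(random.expovariate(CHUNKS / CARE))
            yield env.timeout(random.expovariate(CHUNKS / MED))
        med_done_times.append(env.now - start)
    else:                     # unbundled: dedicated medication pass first
        yield env.timeout(random.expovariate(1 / MED))
        med_done_times.append(env.now - start)
        yield env.timeout(random.expovariate(1 / CARE))

def simulate(bundled, n=2000):
    random.seed(42)
    env, times = simpy.Environment(), []
    def shift(env):
        for _ in range(n):                # one RN, episodes in sequence
            yield from episode(env, bundled, times)
    env.process(shift(env))
    env.run()
    return sum(times) / len(times)

print("mean minutes until meds given, unbundled:", round(simulate(False), 2))
print("mean minutes until meds given, bundled  :", round(simulate(True), 2))
```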

  19. Analog to digital workflow improvement: a quantitative study.

    Science.gov (United States)

    Wideman, Catherine; Gallet, Jacqueline

    2006-01-01

    This study tracked a radiology department's conversion from a Kodak Amber analog system to a Kodak DirectView DR 5100 digital system. Through the use of ProModel Optimization Suite, a workflow simulation software package, significant quantitative information was derived from workflow process data measured before and after the change to a digital system. Once the digital room was fully operational and the radiology staff comfortable with the new system, average patient examination time was reduced from 9.24 to 5.28 min, indicating that a higher patient throughput could be achieved. Compared to the analog system, chest examination time for modality-specific activities was reduced by 43%. The percentage of repeat examinations also decreased, to 8% with the digital system versus 9.5% with the analog system. The study indicated that it is possible to quantitatively study clinical workflow and productivity by using commercially available software.

  20. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

    Full Text Available Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.
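
    A toy version of the static-analysis idea, with invented actor and type names: actors declare typed ports, and mismatched connections are reported before anything runs.

```python
# Minimal sketch of pre-execution static analysis on a declarative
# workflow: typed ports are checked along every wire. Names are made up.
ACTORS = {
    "ReadCSV": {"out": "table"},
    "Dedupe":  {"in": "table", "out": "table"},
    "PlotMap": {"in": "geojson", "out": "image"},
}
WIRES = [("ReadCSV", "Dedupe"), ("Dedupe", "PlotMap")]

def check(actors, wires):
    problems = []
    for src, dst in wires:
        produced = actors[src].get("out")
        expected = actors[dst].get("in")
        if produced != expected:
            problems.append(
                f"{src} -> {dst}: produces {produced!r}, needs {expected!r}")
    return problems

for p in check(ACTORS, WIRES):
    print("design problem:", p)   # flags Dedupe -> PlotMap (table vs geojson)
```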

  1. Soundness of Timed-Arc Workflow Nets in Discrete and Continuous-Time Semantics

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2015-01-01

    Analysis of workflow processes with quantitative aspects like timing is of interest in numerous time-critical applications. We suggest a workflow model based on timed-arc Petri nets and study the foundational problems of soundness and strong (time-bounded) soundness. We first consider the discrete-t...

  2. Nexus: A modular workflow management system for quantum simulation codes

    Science.gov (United States)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.

  3. A prototype of workflow management system for construction design projects

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Substantial benefits can be achieved if information and processes are integrated within a building design project. This paper aims to establish a prototype workflow management system for construction design projects through the application of workflow technology. The composition and functions of the prototype are presented to satisfy the needs of information sharing and process integration. By integrating all subsystems and modules of the prototype, the whole system can handle design information-flow modeling, simulation and optimization, task planning and distribution, and automatic tracking and monitoring, as well as network services. In this way, a collaborative design environment for building design projects is brought into being.

  4. Flash Builder and Flash Catalyst The New Workflow

    CERN Document Server

    Peeters, Steven

    2010-01-01

    The Flash Platform is changing. Flash Builder and Flash Catalyst have brought a new separation of design and coding to web development that enables a much more efficient and streamlined workflow. For designers and developers used to the close confines of Flash, this is a hugely liberating but at first alien concept. This book teaches you the new workflow for the Flash platform. It gives an overview of the technologies involved and provides you with real-world project examples and best-practice guidelines to get from design to implementation with the tools at hand. * Includes many examples* Foc

  5. Scalable Scientific Workflows Management System SWFMS

    Directory of Open Access Journals (Sweden)

    M. Abdul Rahman

    2016-11-01

    Full Text Available In today's electronic world, conducting scientific experiments, especially in the natural sciences domain, has become more and more challenging for domain scientists, since "science" today has turned out to be more complex due to a two-dimensional intricacy: first, assorted and complex computational (analytical) applications, and second, the increasingly large volume and heterogeneity of the scientific data products processed by these applications. Furthermore, the involvement of an increasingly large number of scientific instruments, such as sensors and machines, makes scientific data management even more challenging, since the data generated by such instruments are highly complex. To reduce the complexity of conducting scientific experiments as much as possible, an integrated framework that transparently implements the conceptual separation between the two dimensions is direly needed. To facilitate scientific experiments, 'workflow' technology has in recent years emerged in scientific disciplines such as biology, bioinformatics, geology, environmental science, and eco-informatics. Much research work has been done to develop scientific workflow systems. However, our analysis of these existing systems shows that they lack a well-structured conceptual modeling methodology to deal with the two complex dimensions in a transparent manner. This paper presents a scientific workflow framework that properly addresses these two-dimensional complexities.

  6. Toward Exascale Seismic Imaging: Taming Workflow and I/O Issues

    Science.gov (United States)

    Lefebvre, M. P.; Bozdag, E.; Lei, W.; Rusmanugroho, H.; Smith, J. A.; Tromp, J.; Yuan, Y.

    2013-12-01

    Providing a better understanding of the physics and chemistry of Earth's interior through numerical simulations has always required tremendous computational resources. Post-petascale supercomputers are now available to solve complex scientific problems that were thought unreachable a few decades ago. They also bring a cohort of concerns on how to obtain optimum performance. Several issues are currently being investigated by the HPC community. To name a few, we can list energy consumption, fault resilience, scalability of the current parallel paradigms, large workflow management, I/O performance and feature extraction with large datasets. For this presentation, we focus on the last three issues. In the context of seismic imaging, in particular for simulations based on adjoint methods, workflows are well defined. They consist of a few collective steps (e.g., mesh generation or model updates) and of a large number of independent steps (e.g., forward and adjoint simulations of each seismic event, pre- and postprocessing of seismic traces). The greater goal is to reduce the time to solution, that is, obtaining a more precise representation of the subsurface as fast as possible. This brings us to consider both the workflow in its entirety and the parts composing it. The usual approach is to speed up the purely computational parts by code tuning in order to reach higher FLOPS and better memory usage. This remains an important concern, but larger-scale experiments show that the imaging workflow suffers from a severe I/O bottleneck. This limitation occurs both for purely computational data and for seismic time series. The latter are dealt with by the introduction of a new Adaptable Seismic Data Format (ASDF). In both cases, a parallel I/O library, ORNL's ADIOS, is used to drastically lessen the weight of disk access. Moreover, parallel visualization tools, such as VisIt, are able to take advantage of the metadata included in our ADIOS outputs to extract features and

  7. Smart Voyage Planning Model Sensitivity Analysis Using Ocean and Atmospheric Models Including Ensemble Methods

    Science.gov (United States)

    2012-09-01

    [Only thesis front matter survives for this record: author Scott E. Miller, Lieutenant Commander, United States Navy (B.S., University of South Carolina, 2000); list-of-figures entries on gas turbine fuel consumption vs. sea state and DDG 58 speed reduction curves for bow seas; acronym list including ECDIS-N (Electronic Chart Display and Information System - Navy) and ECMWF (European Center for Medium Range Weather Forecasts).]

  8. Agile parallel bioinformatics workflow management using Pwrake.

    OpenAIRE

    2011-01-01

    Background: In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environm...

  9. Content and Workflow Management for Library Websites: Case Studies

    Science.gov (United States)

    Yu, Holly, Ed.

    2005-01-01

    Using database-driven web pages or web content management (WCM) systems to manage increasingly diverse web content and to streamline workflows is a commonly practiced solution recognized in libraries today. However, limited library web content management models and funding constraints prevent many libraries from purchasing commercially available…

  10. Phxnlme: An R package that facilitates pharmacometric workflow of Phoenix NLME analyses.

    Science.gov (United States)

    Lim, Chay Ngee; Liang, Shuang; Feng, Kevin; Chittenden, Jason; Henry, Ana; Mouksassi, Samer; Birnbaum, Angela K

    2017-03-01

    Pharmacometric analyses are integral components of the drug development process, and Phoenix NLME is one of the popular software packages used to conduct such analyses. To address current limitations with model diagnostic graphics and the efficiency of the workflow for this software, we developed an R package, Phxnlme, to facilitate its workflow and provide improved graphical diagnostics. Phxnlme was designed to provide functionality for the major tasks that are usually performed in pharmacometric analyses (i.e., nonlinear mixed-effects modeling, basic model diagnostics, visual predictive checks, and bootstrap). Various estimation methods for modeling using the R package are made available through the Phoenix NLME engine. The Phxnlme R package utilizes other packages such as ggplot2 and lattice to produce the graphical output, and various features were included to allow customizability of the output. Interactive features for some plots were also added using the manipulate R package. Phxnlme provides enhanced capabilities for nonlinear mixed-effects modeling that can be accessed using the phxnlme() command. Output from the model can be graphed to assess the adequacy of model fits and further explore relationships in the data using various functions included in this R package, such as phxplot() and phxvpc.plot(). Bootstraps, stratified by up to three variables, can also be performed to obtain confidence intervals around the model estimates. With the use of an R interface, different R projects can be created to allow multi-tasking, which addresses the current limitation of the Phoenix NLME desktop software. In addition, there is a wide selection of diagnostic and exploratory plots in the Phxnlme package, with improvements in the customizability of plots compared to Phoenix NLME. The Phxnlme package is a flexible tool that allows implementation of the analytical workflow of Phoenix NLME with R, with features for greater overall efficiency and improved customizable graphics. Phxnlme is

  11. You’ve Got Email: a Workflow Management Extraction System

    NARCIS (Netherlands)

    P. chaipornkaew (Piyanuch); T. Prexawanprasut (Takorn); M.J. McAleer (Michael)

    2017-01-01

    textabstractEmail is one of the most powerful tools for communication. Many businesses use email as the main channel for communication, so it is possible that substantial data are included in email content. In order to help businesses grow faster, a workflow management system may be required. The

  12. Science Gateways, Scientific Workflows and Open Community Software

    Science.gov (United States)

    Pierce, M. E.; Marru, S.

    2014-12-01

    Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way to enable large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure that enables "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to datamine GPS time series data are workflows. The results and the ability to make downstream products may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain them. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing

  13. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows

    Science.gov (United States)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-05-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues' expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable "software appliance" to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish "talkoot" (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a "science story" in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be

  14. Improved Subseasonal Prediction with Advanced Coupled Models including the 30km FIM-HYCOM Coupled Model

    Science.gov (United States)

    Benjamin, Stan; Sun, Shan; Grell, Georg; Green, Benjamin; Bleck, Rainer; Li, Haiqin

    2017-04-01

    Extreme events of subseasonal duration have been linked to multi-week processes related to the onset, duration, and cessation of blocking events or, more generally, quasi-stationary waves. Results will be shown from different sets of 32-day prediction experiments (3200 runs each) over a 16-year period, examining earth-system processes key to subseasonal prediction under different resolutions, numerics, and physics in the FIM-HYCOM coupled model. The coupled atmosphere (FIM) and ocean (HYCOM) modeling system is a relatively new coupled atmosphere-ocean model developed for subseasonal to seasonal prediction (Green et al. 2017, Mon. Wea. Rev., accepted; Bleck et al. 2015, Mon. Wea. Rev.). Both component models operate on a common icosahedral horizontal grid and use an adaptive hybrid vertical coordinate (sigma-isentropic in FIM and sigma-isopycnic in HYCOM). FIM-HYCOM has been used to conduct 16 years of subseasonal retrospective forecasts following the NOAA Subseasonal (SubX) NMME protocol (32-day forward integrations), run with 4 ensemble members per week. Results from this multi-year FIM-HYCOM hindcast include successful forecasts out to 14-20 days for stratospheric warming events (from archived 10 hPa fields), improved MJO predictability (Green et al. 2017) using the Grell-Freitas (2014, ACP) scale-aware cumulus scheme instead of the Simplified Arakawa-Schubert scheme, and little sensitivity to resolution for blocking frequency. Forecast skill on metrics from FIM-HYCOM, including 500 hPa heights and the MJO index, is at least comparable to that of the operational Climate Forecast System (CFSv2) used by the National Centers for Environmental Prediction. Subseasonal skill is improved with a limited multi-model ensemble (FIM-HYCOM and CFSv2), consistent with previous seasonal multi-model ensemble results. Ongoing work on adding inline aerosol/chemistry treatment to the coupled FIM-HYCOM model and on advanced approaches to subgrid-scale clouds to address regional biases will also be reported.

  15. Linking Geobiology Fieldwork and Data Curation Through Workflow Documentation

    Science.gov (United States)

    Thomer, A.; Baker, K. S.; Jett, J. G.; Gordon, S.; Palmer, C. L.

    2014-12-01

    Describing the specific processes and artifacts that lead to the creation of data products provides a detailed picture of data provenance in the form of a high-level workflow. The resulting diagram identifies: (1) "points of intervention" at which curation processes can be moved upstream, and (2) data products that may be important for sharing and preservation. The Site-Based Data Curation project, an Institute of Museum and Library Services-funded project hosted by the Center for Informatics Research in Science and Scholarship at the University of Illinois, previously inferred a geobiologist's planning, field and laboratory workflows through close study of the data products produced during a single field trip to Yellowstone National Park (Wickett et al., 2013). We have since built on this work by documenting post hoc curation processes, and integrating them with the existing workflow. By holistically considering both data collection and curation, we are able to identify concrete steps that scientists can take to begin curating data in the field. This field-to-repository workflow represents a first step toward a more comprehensive and nuanced model of the research data lifecycle. Using our initial three-phase workflow, we identify key data products to prioritize for curation, and the points at which data curation best practices integrate with research processes with minimal interruption. We then document the processes that make key data products sharable and ready for preservation. We append the resulting curatorial phases to the field data collection workflow: Data Staging, Data Standardizing and Data Packaging. These refinements demonstrate: (1) the interdependence of research and curatorial phases; (2) the links between specific research products, research phases and curatorial processes; (3) the interdependence of laboratory-specific standards and community-wide best practices. We propose a poster that shows the six-phase workflow described above. We plan to discuss

  16. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing.

    Science.gov (United States)

    Cotes-Ruiz, Iván Tomás; Prado, Rocío P; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás

    2017-01-01

    Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center, and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workloads called workflows, whose successful management in terms of energy saving is still in its beginnings. WorkflowSim is currently one of the most advanced simulators for research on workflow processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new management strategies in energy saving, considering computing, reconfiguration and network costs as well as quality of service, and it incorporates the preeminent strategy for on-host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent across different real scenarios and to include a wide repertoire of DVFS governors. Results showing the validity of the simulator in terms of resource utilization, frequency and voltage scaling, power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as a mechanism overlapping the intra-host DVFS technique.
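
    The intuition behind DVFS-based saving can be sketched with the usual dynamic-power relation P ≈ C·V²·f: running slower at lower voltage can cut the energy of a task. The constants and the frequency-voltage table below are illustrative, not WorkflowSim's calibrated model.

```python
# Toy DVFS energy model: dynamic power scales with f * V^2, so a lower
# P-state trades runtime for energy. All constants are hypothetical.
C = 1e-9                      # effective switched capacitance (farads)

P_STATES = {2.0: 1.20, 1.5: 1.05, 1.0: 0.90}   # GHz -> volts (invented)

def task_energy(cycles, freq_ghz, volt):
    f = freq_ghz * 1e9
    power = C * volt**2 * f            # dynamic power, watts
    runtime = cycles / f               # seconds
    return power * runtime, runtime

CYCLES = 3e12                          # work contained in the task
for f, v in sorted(P_STATES.items(), reverse=True):
    e, t = task_energy(CYCLES, f, v)
    print(f"{f} GHz @ {v} V: {t:7.1f} s, {e:7.2f} J")
```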

  17. A Workflow Model Based on ECA Rules and Activity Decomposition

    Institute of Scientific and Technical Information of China (English)

    胡锦敏; 张申生; 余新颖

    2002-01-01

    Facing the challenge of e-commerce, enterprises pay more and more attention to business process re-engineering. Building a sound process model is the key to carrying out BPR (business process re-engineering) successfully; such a model should be able to integrate much of an enterprise's business-related information and be interpretable and executable by a system. Based on the WfMC (Workflow Management Coalition) meta-model, a workflow model based on ECA (event-condition-action) rules and activity decomposition is established. ECA rules capture the execution dependencies between activities; by rewriting, the ECA model is transformed into a trigger-style TA (trigger-action) model, which makes interpretation of the model more efficient, and events are rewritten as event occurrence times, giving event expressions stronger expressive power. The activity-decomposition model also supports hierarchical project management well.
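
    A minimal sketch of ECA rules gating workflow actions; the order-approval rules below are invented for illustration and are not the paper's model.

```python
# Toy ECA (event-condition-action) rule engine: on an event, each rule's
# condition is evaluated, and the action fires only if it holds.
state = {"order_total": 0, "approved": False}

RULES = [  # (event, condition, action)
    ("order_submitted",
     lambda s: s["order_total"] < 1000,
     lambda s: s.update(approved=True)),
    ("order_submitted",
     lambda s: s["order_total"] >= 1000,
     lambda s: print("route to manual review")),
]

def raise_event(event, s):
    for ev, cond, act in RULES:
        if ev == event and cond(s):   # condition gates the action
            act(s)

state["order_total"] = 250
raise_event("order_submitted", state)
print("approved:", state["approved"])   # True: auto-approved under limit
```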

  18. Putting Lipstick on Pig: Enabling Database-style Workflow Provenance

    CERN Document Server

    Amsterdamer, Yael; Deutch, Daniel; Milo, Tova; Stoyanovich, Julia; Tannen, Val

    2012-01-01

    Workflow provenance typically assumes that each module is a "black-box", so that each output depends on all inputs (coarse-grained dependencies). Furthermore, it does not model the internal state of a module, which can change between repeated executions. In practice, however, an output may depend on only a small subset of the inputs (fine-grained dependencies) as well as on the internal state of the module. We present a novel provenance framework that marries database-style and workflow-style provenance, by using Pig Latin to expose the functionality of modules, thus capturing internal state and fine-grained dependencies. A critical ingredient in our solution is the use of a novel form of provenance graph that models module invocations and yields a compact representation of fine-grained workflow provenance. It also enables a number of novel graph transformation operations, allowing users to choose the desired level of granularity in provenance querying (ZoomIn and ZoomOut), and supporting "what-if" workflow analyti...

  19. Managing Library IT Workflow with Bugzilla

    Directory of Open Access Journals (Sweden)

    Nina McHale

    2010-09-01

    Full Text Available Prior to September 2008, all technology issues at the University of Colorado Denver's Auraria Library were reported to a dedicated departmental phone line. A variety of staff changes necessitated a more formal means of tracking, delegating, and resolving reported issues, and the department turned to Bugzilla, an open source bug tracking application designed by Mozilla.org developers. While designed with software development bug tracking in mind, Bugzilla can be easily customized and modified to serve as an IT ticketing system. Twenty-three months and over 2300 trouble tickets later, Auraria's IT department workflow is much smoother and more efficient. This article includes two Perl Template Toolkit code samples for customized Bugzilla screens for its use in a library environment; readers will be able to easily replicate the project in their own environments.

  20. Implementing bioinformatic workflows within the bioextract server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  1. Contracts for Cross-Organizational Workflow Management

    NARCIS (Netherlands)

    Koetsier, M.J.; Grefen, P.W.P.J.; Vonk, J.

    1999-01-01

    Nowadays, many organizations form dynamic partnerships to deal effectively with market requirements. As companies use automated workflow systems to control their processes, a way of linking workflow processes in different organizations is useful in turning the co-operating companies into a seamless

  2. Contracts for Cross-Organizational Workflow Management

    NARCIS (Netherlands)

    Koetsier, M.; Grefen, P.; Vonk, J.

    1999-01-01

    Nowadays, many organizations form dynamic partnerships to deal effectively with market requirements. As companies use automated workflow systems to control their processes, a way of linking workflow processes in different organizations is useful in turning the co-operating companies into a seamless

  3. Metaworkflows and Workflow Interoperability for Heliophysics

    Science.gov (United States)

    Pierantoni, Gabriele; Carley, Eoin P.

    2014-06-01

    Heliophysics is a relatively new branch of physics that investigates the relationship between the Sun and the other bodies of the solar system. To investigate such relationships, heliophysicists can rely on various tools developed by the community. Some of these tools are on-line catalogues that list events (such as Coronal Mass Ejections, CMEs) and their characteristics as they were observed on the surface of the Sun or on the other bodies of the Solar System. Other tools offer on-line data analysis and access to images and data catalogues. During their research, heliophysicists often perform investigations that need to coordinate several of these services and to repeat these complex operations until the phenomena under investigation are fully analyzed. Heliophysicists combine the results of these services; this service orchestration is best suited to workflows. This approach has been investigated in the HELIO project. The HELIO project developed an infrastructure for a Virtual Observatory for Heliophysics and implemented service orchestration using TAVERNA workflows. HELIO developed a set of workflows that proved to be useful but lacked flexibility and re-usability. The TAVERNA workflows also needed to be executed directly in the TAVERNA workbench, and this forced all users to learn how to use the workbench. Within the SCI-BUS and ER-FLOW projects, we have started an effort to re-think and re-design the heliophysics workflows with the aim of fostering re-usability and ease of use. We base our approach on two key concepts: that of meta-workflows and that of workflow interoperability. We have divided the produced workflows into three different layers. The first layer is Basic Workflows, developed both in the TAVERNA and WS-PGRADE languages. They are building blocks that users compose to address their scientific challenges. They implement well-defined Use Cases that usually involve only one service. The second layer is Science Workflows, usually developed in TAVERNA. They

  4. Workflow Management in CLARIN-DK

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2013-01-01

    The CLARIN-DK infrastructure is not only a repository of resources, but also a place where users can analyse, annotate, reformat and potentially even translate resources, using tools that are integrated in the infrastructure as web services. In many cases a single tool does not produce the desired output, given the input resource at hand. Still, in such cases it may be possible to reach the set goal by chaining a number of tools. The approach presented here frees the user of having to meddle with tools and the construction of workflows. Instead, the user only needs to supply the workflow manager with the features that describe her goal, because the workflow manager not only executes chains of tools in a workflow, but also takes care of autonomously devising workflows that serve the user's intention, given the tools that currently are integrated in the infrastructure as web services.

  5. Automated data reduction workflows for astronomy

    CERN Document Server

    Freudling, W; Bramich, D M; Ballester, P; Forchi, V; Garcia-Dablo, C E; Moehler, S; Neeser, M J

    2013-01-01

    Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers who are not necessarily experts in any specific instrument. The efficiency of data reduction can be improved by using automatic workflows to organise data and execute the sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection and interaction with the data. The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowch...

  6. Workflow Tools for Digital Curation

    Directory of Open Access Journals (Sweden)

    Andrew James Weidner

    2013-04-01

    Full Text Available Maintaining usable and sustainable digital collections requires a complex set of actions that address the many challenges at various stages of the digital object lifecycle. Digital curation activities enhance access and retrieval, maintain quality, add value, and facilitate use and re-use over time. Digital resource lifecycle management is becoming an increasingly important topic as digital curators actively explore software tools that perform metadata curation and file management tasks. Accordingly, the University of North Texas (UNT) Libraries develop tools and workflows that streamline production and quality assurance activities. This article demonstrates two open source software tools, AutoHotkey and Selenium IDE, which the UNT Digital Libraries Division has adopted for use during the pre-ingest and post-ingest stages of the digital resource lifecycle.

  7. CSP for Executable Scientific Workflows

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard

    PyCSP is an implementation of Communicating Sequential Processes (CSP) for the Python programming language and takes advantage of CSP's formal and verifiable approach to controlling concurrency and of the readability of Python source code. Python is a popular programming language in the scientific community, with many scientific libraries (modules) and simple integration to external languages. Scientific applications can usually benefit performance-wise from both multiprocessing, cluster and grid environments. This thesis presents a PyCSP extended with many new features and a more robust implementation to allow scientific applications to scale well, which is demonstrated through examples. By providing a robust library for organising scientific workflows in a Python application I hope to inspire scientific users to adopt PyCSP. As a proof-of-concept this thesis demonstrates three scientific applications: kNN, stochastic minimum search and McStas.
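
    A generic CSP-style sketch using only Python's standard library (deliberately not the PyCSP API): two processes share no state and communicate solely over channels.

```python
# Generic CSP-style message passing with the standard library.
from multiprocessing import Process, Queue

def producer(chan: Queue, n: int):
    for i in range(n):
        chan.put(i * i)       # send on the channel
    chan.put(None)            # poison pill ends the consumer

def consumer(chan: Queue, out: Queue):
    total = 0
    while (msg := chan.get()) is not None:
        total += msg
    out.put(total)

if __name__ == "__main__":
    chan, out = Queue(), Queue()
    procs = [Process(target=producer, args=(chan, 10)),
             Process(target=consumer, args=(chan, out))]
    for p in procs: p.start()
    for p in procs: p.join()
    print("sum of squares:", out.get())   # 285
```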

  8. SURVEY OF WORKFLOW ANALYSIS IN PAST AND PRESENT ISSUES

    Directory of Open Access Journals (Sweden)

    SARAVANAN .M.S,

    2011-06-01

    Full Text Available This paper surveys workflow analysis from the business process point of view. A business can be defined as an organization that provides goods and services to others who want or need them. The concept of managing business processes is referred to as Business Process Management (BPM). A workflow is the automation of a business process, in whole or part, during which documents, information or tasks are passed from one participant to another for action, according to a set of procedural rules. Process mining aims at extracting useful and meaningful information from event logs, which are sets of real executions of business processes in organizations. This paper briefly reviews the state-of-the-art of business processes developed so far and the techniques adopted. It also presents a survey of workflow analysis from the business process point of view, broadly classified into four major categories: Business Process Modeling, Ontology-based Business Process Management, Workflow-based Business Process Controlling, and Business Process Mining.
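
    As a taste of process mining, the sketch below derives the directly-follows relation from a small invented event log; this relation is the starting point of many process discovery algorithms.

```python
# Minimal process-mining sketch: count directly-follows pairs in a log.
# The log below is invented for illustration.
from collections import Counter

LOG = [  # one trace of activities per case
    ["register", "check", "approve", "pay"],
    ["register", "check", "reject"],
    ["register", "check", "approve", "pay"],
]

follows = Counter()
for trace in LOG:
    follows.update(zip(trace, trace[1:]))   # (a, b): b directly follows a

for (a, b), n in follows.most_common():
    print(f"{a} -> {b}: {n}")
```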

  9. Images crossing borders: image and workflow sharing on multiple levels.

    Science.gov (United States)

    Ross, Peeter; Pohjonen, Hanna

    2011-04-01

    Digitalisation of medical data makes it possible to share images and workflows between related parties. In addition to linear data flow where healthcare professionals or patients are the information carriers, a new type of matrix of many-to-many connections is emerging. Implementation of shared workflow brings challenges of interoperability and legal clarity. Sharing images or workflows can be implemented on different levels with different challenges: inside the organisation, between organisations, across country borders, or between healthcare institutions and citizens. Interoperability issues vary according to the level of sharing and are either technical or semantic, including language. Legal uncertainty increases when crossing national borders. Teleradiology is regulated by multiple European Union (EU) directives and legal documents, which makes interpretation of the legal system complex. To achieve wider use of eHealth and teleradiology several strategic documents were published recently by the EU. Despite EU activities, responsibility for organising, providing and funding healthcare systems remains with the Member States. Therefore, the implementation of new solutions requires strong co-operation between radiologists, societies of radiology, healthcare administrators, politicians and relevant EU authorities. The aim of this article is to describe different dimensions of image and workflow sharing and to analyse legal acts concerning teleradiology in the EU.

  10. An Optimization Algorithm for Multipath Parallel Allocation for Service Resource in the Simulation Task Workflow

    OpenAIRE

    Zhiteng Wang; Hongjun Zhang; Rui Zhang; Yong Li; Xuliang Zhang

    2014-01-01

    Service-oriented modeling and simulation are hot issues in the field of modeling and simulation, and service resources need to be invoked when a simulation task workflow is running. How to optimize service resource allocation to ensure that the task completes effectively is an important issue in this area. In the military modeling and simulation field, it is important to improve the probability of success and the timeliness of simulation task workflows. Therefore, this paper proposes an optimiz...

  11. Secretome Analysis of Lipid-Induced Insulin Resistance in Skeletal Muscle Cells by a Combined Experimental and Bioinformatics Workflow

    DEFF Research Database (Denmark)

    Deshmukh, Atul S; Cox, Juergen; Jensen, Lars Juhl

    2015-01-01

    ... the secretome of lipid-induced insulin-resistant skeletal muscle cells. Our workflow identified 1073 putative secreted proteins, including 32 growth factors, 25 cytokines, and 29 metalloproteinases. In addition to previously reported proteins, we report hundreds of novel ones. Intriguingly, ∼40% of the secreted proteins were regulated under insulin-resistant conditions, including a protein family with signal peptide and EGF-like domain structure that had not yet been associated with insulin resistance. Finally, we report that secretion of IGF and IGF-binding proteins was down-regulated under insulin-resistant conditions. Our study demonstrates an efficient combined experimental and bioinformatics workflow to identify putative secreted proteins from insulin-resistant skeletal muscle cells, which could easily be adapted to other cellular models.

  12. A computational workflow for designing silicon donor qubits

    Science.gov (United States)

    Humble, Travis S.; Ericson, M. Nance; Jakowski, Jacek; Huang, Jingsong; Britton, Charles; Curtis, Franklin G.; Dumitrescu, Eugene F.; Mohiyaddin, Fahd A.; Sumpter, Bobby G.

    2016-10-01

    Developing devices that can reliably and accurately demonstrate the principles of superposition and entanglement is an on-going challenge for the quantum computing community. Modeling and simulation offer attractive means of testing early device designs and establishing expectations for operational performance. However, the complex integrated material systems required by quantum device designs are not captured by any single existing computational modeling method. We examine the development and analysis of a multi-staged computational workflow that can be used to design and characterize silicon donor qubit systems with modeling and simulation. Our approach integrates quantum chemistry calculations with electrostatic field solvers to perform detailed simulations of a phosphorus dopant in silicon. We show how atomistic details can be synthesized into an operational model for the logical gates that define quantum computation in this particular technology. The resulting computational workflow realizes a design tool for silicon donor qubits that can help verify and validate current and near-term experimental devices.

  13. Server-side workflow execution using data grid technology for reproducible analyses of data-intensive hydrologic systems

    Science.gov (United States)

    Essawy, Bakinam T.; Goodall, Jonathan L.; Xu, Hao; Rajasekar, Arcot; Myers, James D.; Kugler, Tracy A.; Billah, Mirza M.; Whitton, Mary C.; Moore, Reagan W.

    2016-04-01

    Many geoscience disciplines utilize complex computational models for advancing understanding and sustainable management of Earth systems. Executing such models and their associated data preprocessing and postprocessing routines can be challenging for a number of reasons including (1) accessing and preprocessing the large volume and variety of data required by the model, (2) postprocessing large data collections generated by the model, and (3) orchestrating data processing tools, each with unique software dependencies, into workflows that can be easily reproduced and reused. To address these challenges, the work reported in this paper leverages the Workflow Structured Object functionality of the Integrated Rule-Oriented Data System and demonstrates how it can be used to access distributed data, encapsulate hydrologic data processing as workflows, and federate with other community-driven cyberinfrastructure systems. The approach is demonstrated for a study investigating the impact of drought on populations in the Carolinas region of the United States. The analysis leverages computational modeling along with data from the Terra Populus project and data management and publication services provided by the Sustainable Environment-Actionable Data project. The work is part of a larger effort under the DataNet Federation Consortium project that aims to demonstrate data and computational interoperability across cyberinfrastructure developed independently by scientific communities.

  14. WARP (workflow for automated and rapid production): a framework for end-to-end automated digital print workflows

    Science.gov (United States)

    Joshi, Parag

    2006-02-01

    The publishing industry is experiencing a major paradigm shift with the advent of digital publishing technologies. A large number of components in the publishing and print production workflow are transformed in this shift. However, the process as a whole requires a great deal of human intervention for decision making and for resolving exceptions during job execution. Furthermore, a majority of the best-of-breed applications for publishing and print production are intrinsically designed and developed to be driven by humans. Thus, the human-intensive nature of the current prepress process accounts for a very significant amount of the overhead costs in fulfillment of jobs on press. It is a challenge to automate the functionality of applications built with the model of human-driven execution. Another challenge is to orchestrate various components in the publishing and print production pipeline so that they work in a seamless manner, enabling the system to detect potential failures automatically and take corrective actions proactively. Thus, there is a great need for a coherent and unifying workflow architecture that streamlines the process and automates it as a whole in order to create an end-to-end automated digital print production workflow that does not involve any human intervention. This paper describes an architecture and building blocks that lay the foundation for a plurality of automated print production workflows.

  15. Iterative Workflows for Numerical Simulations in Subsurface Sciences

    Energy Technology Data Exchange (ETDEWEB)

    Chase, Jared M.; Schuchardt, Karen L.; Chin, George; Daily, Jeffrey A.; Scheibe, Timothy D.

    2008-07-08

    Numerical simulators are frequently used to assess future risks, support remediation and monitoring program decisions, and assist in the design of specific remedial actions with respect to groundwater contaminants. Due to the complexity of the subsurface environment and uncertainty in the models, many alternative simulations must be performed, each producing data that is typically post-processed and analyzed before deciding on the next set of simulations. Though parts of the process are readily amenable to automation through scientific workflow tools, the larger "research workflow" is not supported by current tools. We present a detailed use case for subsurface modeling, describe the use case in terms of workflow structure, briefly summarize a prototype that seeks to facilitate the overall modeling process, and discuss the many challenges of building such a comprehensive environment.
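
    The "research workflow" loop can be caricatured in a few lines: run a batch of simulations, post-process, and let the analysis choose the next parameter set. The simulator here is a stand-in misfit function, not a real subsurface code.

```python
# Toy iterative simulation workflow: batch runs, analysis, refinement.
def run_simulation(k):           # placeholder for a subsurface simulator
    return (k - 3.2) ** 2        # pretend misfit against observations

params = [1.0, 3.0, 5.0]
for iteration in range(4):
    results = {k: run_simulation(k) for k in params}   # batch of runs
    best = min(results, key=results.get)               # post-processing
    print(f"iter {iteration}: k={best:.2f}, misfit={results[best]:.4f}")
    step = 1.0 / (iteration + 1)
    params = [best - step, best, best + step]          # next set of runs
```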

  16. The design of cloud workflow systems

    CERN Document Server

    Liu, Xiao; Zhang, Gaofeng

    2011-01-01

    Cloud computing is the latest market-oriented computing paradigm which brings software design and development into a new era characterized by "XaaS", i.e., everything as a service. Cloud workflows, as typical software applications in the cloud, are composed of a set of partially ordered cloud software services to achieve specific goals. However, due to the low QoS (quality of service) nature of the cloud environment, the design of workflow systems in the cloud becomes a challenging issue for the delivery of high-quality cloud workflow applications. To address such an issue, this book presents

  17. Implementing bioinformatic workflows within the bioextract server.

    Science.gov (United States)

    Lushbough, Carol M; Bergman, Michael K; Lawrence, Carolyn J; Jennewein, Doug; Brendel, Volker

    2008-01-01

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed service designed to provide researchers with the ability, via the web, to query multiple data sources, save results as searchable data sets, and execute analytic tools. As the researcher works with the system, their tasks are saved in the background. At any time these steps can be saved as a workflow that can then be executed again and/or modified later.

  18. Akuna: An Open Source User Environment for Managing Subsurface Simulation Workflows

    Science.gov (United States)

    Freedman, V. L.; Agarwal, D.; Bensema, K.; Finsterle, S.; Gable, C. W.; Keating, E. H.; Krishnan, H.; Lansing, C.; Moeglein, W.; Pau, G. S. H.; Porter, E.; Scheibe, T. D.

    2014-12-01

    The U.S. Department of Energy (DOE) is investing in development of a numerical modeling toolset called ASCEM (Advanced Simulation Capability for Environmental Management) to support modeling analyses at legacy waste sites. ASCEM is an open source and modular computing framework that incorporates new advances and tools for predicting contaminant fate and transport in natural and engineered systems. The ASCEM toolset includes both a Platform with Integrated Toolsets (called Akuna) and a High-Performance Computing multi-process simulator (called Amanzi). The focus of this presentation is on Akuna, an open-source user environment that manages subsurface simulation workflows and associated data and metadata. In this presentation, key elements of Akuna are demonstrated, which includes toolsets for model setup, database management, sensitivity analysis, parameter estimation, uncertainty quantification, and visualization of both model setup and simulation results. A key component of the workflow is in the automated job launching and monitoring capabilities, which allow a user to submit and monitor simulation runs on high-performance, parallel computers. Visualization of large outputs can also be performed without moving data back to local resources. These capabilities make high-performance computing accessible to the users who might not be familiar with batch queue systems and usage protocols on different supercomputers and clusters.

  19. Facilitating Stewardship of scientific data through standards based workflows

    Science.gov (United States)

    Bastrakova, I.; Kemp, C.; Potter, A. K.

    2013-12-01

    There are three main suites of standards that can be used to define the fundamental scientific methodology of data, methods and results: firstly, metadata standards to enable discovery of the data (ISO 19115); secondly, the Sensor Web Enablement (SWE) suite of standards, which includes the O&M and SensorML standards; and thirdly, ontologies that provide vocabularies to define scientific concepts and the relationships between them. All three types of standards have to be utilised by the practicing scientist so that those who ultimately have to steward the data can ensure it is preserved, curated, reused and repurposed. Additional benefits of this approach include transparency of scientific processes from data acquisition to the creation of scientific concepts and models, and provision of context to inform data use. Collecting and recording metadata is the first step in the scientific data flow. The primary role of metadata is to provide details of geographic extent, availability and a high-level description of data suitable for its initial discovery through common search engines. The SWE suite provides standardised patterns to describe observations and measurements taken for these data, capture detailed information about observation or analytical methods and the instruments used, and define quality determinations. This information standardises browsing capability over discrete data types. The standardised patterns of the SWE standards simplify aggregation of observation and measurement data, enabling scientists to move from disaggregated data to scientific concepts. The first two steps provide a necessary basis for reasoning about concepts of 'pure' science, building relationships between concepts of different domains (linked data), and identifying domain classifications and vocabularies. Geoscience Australia is re-examining its marine data flows, including metadata requirements and business processes, to achieve a clearer link between

  20. An open source approach to enable the reproducibility of scientific workflows in the ocean sciences

    Science.gov (United States)

    Di Stefano, M.; Fox, P. A.; West, P.; Hare, J. A.; Maffei, A. R.

    2013-12-01

    Every scientist should be able to rerun data analyses conducted by his or her team and regenerate the figures in a paper. However, all too often the correct version of a script goes missing, or the original raw data is filtered by hand and the filtering process is undocumented, or there is a lack of collaboration and communication among scientists working in a team. Here we present 3 different use cases in ocean sciences in which end-to-end workflows are tracked. The main tool deployed to address these use cases is based on a web application (IPython Notebook) that provides the ability to work on very diverse and heterogeneous data and information sources, providing an effective way to share, and track changes to, the source code used to generate data products and associated metadata, as well as to track overall workflow provenance, allowing versioned reproducibility of a data product. The use cases selected for this work are: 1) A partial reproduction of the Ecosystem Status Report (ESR) for the Northeast U.S. Continental Shelf Large Marine Ecosystem. Our goal with this use case is to enable not just the traceability but also the reproducibility of this biannual report, keeping track of all the processes behind the generation and validation of time-series and spatial data and information products. An end-to-end workflow with code versioning is developed so that indicators in the report may be traced back to the source datasets. 2) Real-time generation of web pages for visualizing one of the environmental indicators from the Ecosystem Advisory for the Northeast Shelf Large Marine Ecosystem web site. 3) Data and visualization integration for ocean climate forecasting. In this use case, we focus on a workflow to describe how to provide access to online data sources in the NetCDF format and other model data, and make use of multicore processing to generate video animation from time series of gridded data. For each use case we show how complete workflows
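
    The notebooks themselves are not part of this record; as a rough sketch of the third use case (multicore rendering of animation frames from a gridded time series), assuming a local NetCDF file "ocean_model.nc" with dimensions (time, lat, lon) and a variable named "sst" (both names are invented), one might write:

        from multiprocessing import Pool

        import matplotlib
        matplotlib.use("Agg")            # headless rendering for batch frame generation
        import matplotlib.pyplot as plt
        import netCDF4

        PATH, VAR = "ocean_model.nc", "sst"   # hypothetical file and variable names

        def render_frame(t):
            """Render time step t of the gridded variable to a PNG frame."""
            with netCDF4.Dataset(PATH) as ds:
                field = ds.variables[VAR][t, :, :]
            fig, ax = plt.subplots()
            ax.imshow(field, origin="lower")
            ax.set_title(VAR + ", time step " + str(t))
            fig.savefig("frame_%04d.png" % t, dpi=100)
            plt.close(fig)

        if __name__ == "__main__":
            with netCDF4.Dataset(PATH) as ds:
                steps = ds.dimensions["time"].size
            with Pool() as pool:                  # one worker per core
                pool.map(render_frame, range(steps))
            # the frames can then be stitched into a video, e.g. with ffmpeg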

  1. Workflow to numerically reproduce laboratory ultrasonic datasets

    Institute of Scientific and Technical Information of China (English)

    A. Biryukov; N. Tisato; G. Grasselli

    2014-01-01

    The risks and uncertainties related to the storage of high-level radioactive waste (HLRW) can be reduced thanks to focused studies and investigations. HLRW is to be placed in deep geological repositories, enveloped in an engineered bentonite barrier, whose physical conditions are subject to change throughout the lifespan of the infrastructure. Seismic tomography can be employed to monitor its physical state and integrity. The design of the seismic monitoring system can be optimized by conducting and analyzing numerical simulations of wave propagation in a representative repository geometry. However, the quality of the numerical results relies on their initial calibration. The main aim of this paper is to provide a workflow to calibrate numerical tools using laboratory ultrasonic datasets. The finite difference code SOFI2D was employed to model ultrasonic waves propagating through a laboratory sample. Specifically, the input velocity model was calibrated to achieve the best match between experimental and numerical ultrasonic traces. Likely due to the imperfections of the contact surfaces, the resultant velocities of P- and S-wave propagation tend to be noticeably lower than those a priori assigned. The calibrated model was then employed to estimate the attenuation in a montmorillonite sample. The obtained low quality factors (Q) suggest that the pronounced inelastic behavior of the clay has to be taken into account in geophysical modeling and analysis. Consequently, this contribution should be considered as a first step towards the creation of a numerical tool to evaluate wave propagation in nuclear waste repositories.
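
    The paper's calibration procedure wraps an external finite-difference solver; as a schematic of the calibration step only, the sketch below assumes a black-box forward model (here a stand-in toy function in place of a SOFI2D run) and adjusts a single P-wave velocity to minimize the L2 misfit against a measured trace:

        import numpy as np
        from scipy.optimize import minimize_scalar

        def simulate_trace(velocity, n=500):
            """Placeholder forward model; in practice this would write a SOFI2D
            parameter file with the trial velocity, run the solver, and read back
            the synthetic receiver trace."""
            t = np.arange(n)
            return np.sin(2 * np.pi * t * velocity / 1e6) * np.exp(-t / 200.0)

        # Stand-in for the laboratory waveform (synthetic data at 2500 m/s plus noise).
        measured = simulate_trace(2500.0) + np.random.default_rng(0).normal(0, 0.01, 500)

        def misfit(velocity):
            """L2 distance between simulated and measured ultrasonic traces."""
            return np.sum((simulate_trace(velocity) - measured) ** 2)

        # Bound the search to physically plausible P-wave velocities (m/s).
        result = minimize_scalar(misfit, bounds=(1500.0, 4000.0), method="bounded")
        print("calibrated velocity: %.1f m/s" % result.x)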

  2. Improved compliance by BPM-driven workflow automation.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (e.g., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can provide standardization of laboratory workflows. Model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs for quality assurance. As an automation language able to control every kind of activity or subprocess, BPMN 2.0 addresses complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (e.g., screening procedures or analytical processes). With the BPM standard, a communication method for sharing laboratory process knowledge is thus also available. © 2014 Society for Laboratory Automation and Screening.

  3. Security aspects in teleradiology workflow

    Science.gov (United States)

    Soegner, Peter I.; Helweg, Gernot; Holzer, Heimo; zur Nedden, Dieter

    2000-05-01

    The medicolegal necessity of privacy, security and confidentiality motivated the attempt to develop a secure teleradiology workflow between the telepartners: the radiologist and the referring physician. To avoid a lack of data protection and data security, we introduced biometric fingerprint scanners in combination with smart cards to identify the teleradiology partners and communicated over an encrypted TCP/IP satellite link between Innsbruck and Reutte. We used an asymmetric cryptography method to guarantee authentication, integrity of the data packages and confidentiality of the medical data. It was necessary to use a biometric feature to avoid cases of mistaken identity among persons who wanted access to the system. Only an invariable electronic identification allows legal liability for the final report, and only a secure data connection allows the exchange of sensitive medical data between different partners of health care networks. In our study we selected the user-friendly combination of a smart card and a biometric fingerprint technique, the Skymed(TM) Double Guard Secure Keyboard (Agfa-Gevaert), to confirm identities and log into the imaging workstations and the electronic patient record. We examined the interoperability of the software used with the existing platforms. Only the WIN-XX operating systems could be protected at the time of our study.

  4. ATEFlap aerodynamic model, a dynamic stall model including the effects of trailing edge flap deflection

    Energy Technology Data Exchange (ETDEWEB)

    Bergami, L.; Gaunaa, M.

    2012-02-15

    The report presents the ATEFlap aerodynamic model, which computes the unsteady lift, drag and moment on a 2D airfoil section equipped with an Adaptive Trailing Edge Flap. The model captures the unsteady response related to the effects of the vorticity shed into the wake and to the dynamics of flow separation: a thin-airfoil potential flow model is merged with a dynamic stall model of the Beddoes-Leishman type. The inputs required by the model are steady data for the lift, drag, and moment coefficients as functions of angle of attack and flap deflection. Further steady data used by the Beddoes-Leishman dynamic stall model are computed in an external preprocessor application, which gives the user the possibility to verify, and eventually correct, the steady data passed to the aerodynamic model. The ATEFlap aerodynamic model is integrated in the aeroelastic simulation tool HAWC2, thus allowing simulation of the response of a wind turbine with trailing edge flaps on the rotor. The algorithms used by the preprocessor and by the aerodynamic model are presented, and modifications to previous implementations of the aerodynamic model are briefly discussed. The performance and the validity of the model are verified by comparing the dynamic response computed by ATEFlap with solutions from CFD simulations. (Author)

  5. Workflow Dynamics and the Imaging Value Chain: Quantifying the Effect of Designating a Nonimage-Interpretive Task Workflow.

    Science.gov (United States)

    Lee, Matthew H; Schemmel, Andrew J; Pooler, B Dustin; Hanley, Taylor; Kennedy, Tabassum A; Field, Aaron S; Wiegmann, Douglas; Yu, John-Paul J

    To assess the impact of separate non-image-interpretive task (NIT) and image-interpretive task (IIT) workflows in an academic neuroradiology practice, a prospective, randomized, observational investigation of a centralized academic neuroradiology reading room was performed. The primary reading room fellow was observed over a one-month period using a time-and-motion methodology, recording the frequency and duration of tasks performed. Tasks were categorized into separate IIT and NIT workflows. Post-intervention observation of the primary fellow was repeated following the implementation of a consult assistant (CA) responsible for NITs. Pre- and post-intervention data were compared. Following separation of the IIT and NIT workflows, time spent by the primary fellow on IITs increased from 53.8% to 73.2%, while time on NITs decreased from 20.4% to 4.4%. The mean duration of image interpretation nearly doubled, from 05:44 to 11:01 (p = 0.002). Decreases in specific NITs, including phone calls/paging (2.86/hr versus 0.80/hr), in-room consultations (1.36/hr versus 0.80/hr), and protocoling (0.99/hr versus 0.10/hr), were observed. The CA experienced 29.4 task switching events (TSEs) per hour; rates of specific NITs for the CA were 6.41/hr for phone calls/paging, 3.60/hr for in-room consultations, and 3.83/hr for protocoling. Separating responsibilities into NIT and IIT workflows substantially increased image interpretation time and decreased TSEs for the primary fellow. Consolidation of NITs into a separate workflow may allow for more efficient task completion. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Workflow in the operating room: review of Arrowhead 2004 seminar on imaging and informatics (Invited Paper)

    Science.gov (United States)

    Lemke, Heinz U.; Ratib, Osman M.; Horii, Steven C.

    2005-04-01

    This review paper is based on the 2004 UCLA Seminar on Imaging and Informatics (http://www.radnet.ucla.edu/Arrowhead2004/), a joint endeavour between UCLA and the CARS organization focussing on workflow analysis tools and the digital operating room. Eleven presentations from the Arrowhead Seminar are summarized in this review, covering: redesigning perioperative care for a high-velocity OR; the intraoperative ultrasound process and model; surgical workflow and surgical PACS: an integrated view; interactions in the surgical OR; workflow automation strategies and target applications; visualisation solutions for the operating room; navigating the fifth dimension; and the design of digital operating rooms and interventional suites.

  7. Importance of global aerosol modeling including secondary organic aerosol formed from monoterpene

    OpenAIRE

    Goto, Daisuke; Takemura, Toshihiko; Nakajima, Teruyuki

    2008-01-01

    A global three-dimensional aerosol transport-radiation model, coupled to an atmospheric general circulation model (AGCM), has been extended to improve the treatment of organic aerosols, particularly secondary organic aerosols (SOA), and to estimate SOA contributions to direct and indirect radiative effects. Because the SOA formation process is complicated and incompletely understood, results differ considerably among model simulations. In this work, we simulate SOA production assuming v...

  8. MODEL ANALYSIS AND PARAMETER EXTRACTION FOR MOS CAPACITOR INCLUDING QUANTUM MECHANICAL EFFECTS

    Institute of Scientific and Technical Information of China (English)

    Hai-yan Jiang; Ping-wen Zhang

    2006-01-01

    The high-frequency C-V curves of the MOS capacitor have been studied. It is shown that the semiclassical model is a good approximation to the quantum model and approaches the classical model when the oxide layer is thick. This conclusion provides us with an efficient (semiclassical) model including quantum mechanical effects for parameter extraction of ultrathin-oxide devices. An effective extraction strategy is designed, and numerical experiments demonstrate the validity of the strategy.

  9. Workflow Based Software Development Environment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  10. Evolutionary optimization of production materials workflow processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    We present an evolutionary optimisation technique for stochastic production processes, which is able to find improved production materials workflow processes with respect to arbitrary combinations of numerical quantities associated with the production process. Working from a core fragment...

  11. Implementing Workflow Reconfiguration in WS-BPEL

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Dragoni, Nicola; Zhou, Mu

    2012-01-01

    This paper investigates the problem of dynamic reconfiguration by means of a workflow-based case study used for discussion. We state the requirements on a system implementing the workflow and its reconfiguration, and we describe the system's design in BPMN. WS-BPEL, a language that would not naturally support dynamic change, is used as a target for implementation. The WS-BPEL recovery framework is here exploited to implement the reconfiguration using principles derived from previous research in process algebra, and two mappings from BPMN to WS-BPEL are presented, one automatic and only mostly...

  12. Building Scientific Workflows for the Geosciences with Open Community Software

    Science.gov (United States)

    Pierce, M. E.; Marru, S.; Weerawarana, S. M.

    2012-12-01

    We describe the design and development of the Apache Airavata scientific workflow software and its application to problems in geosciences. Airavata is based on Service Oriented Architecture principles and is developed as general purpose software for managing large-scale science applications on supercomputing resources such as the NSF's XSEDE. Based on the NSF-funded EarthCube Workflow Working Group activities, we discuss the application of this software relative to specific requirements (such as data stream processing, event triggering, dealing with large data sets, and advanced distributed execution patterns involved in data mining). We also consider the role of governance in EarthCube software development and present the development of Airavata software through the Apache Software Foundation's community development model. We discuss the potential impacts on software accountability and sustainability using this model.

  13. Measuring Semantic and Structural Information for Data Oriented Workflow Retrieval with Cost Constraints

    Directory of Open Access Journals (Sweden)

    Yinglong Ma

    2014-01-01

    The reuse of data oriented workflows (DOWs) can reduce the cost of workflow system development and control the risk of project failure, and is therefore crucial for accelerating the automation of business processes. Reusing workflows can be achieved by measuring the similarity among candidate workflows and selecting from them the workflow that satisfies the user's requirements. However, because DOWs are often developed in open, distributed, and heterogeneous environments, different users often impose diverse cost constraints on them, which makes the reuse of DOWs challenging; there is no clear solution for retrieving DOWs with cost constraints. In this paper, we present a novel graph-based model of DOWs with cost constraints, called the constrained data oriented workflow (CDW), which can express the cost constraints that users are often concerned about. An approach is proposed for retrieving CDWs that seamlessly combines their semantic and structural information: a distance measure based on matrix theory is adopted to combine the semantic and structural similarities of CDWs for selecting and reusing them. Finally, experiments are reported that show the effectiveness and efficiency of our approach.
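
    The paper's exact matrix-based distance is not reproduced in this record; as a toy illustration of combining semantic and structural similarity with a mixing weight, assuming workflows are given as task-label sets plus same-sized adjacency matrices (all names and weights below are invented):

        import numpy as np

        def semantic_similarity(labels_a, labels_b):
            """Jaccard overlap of the task-label sets of two workflows."""
            a, b = set(labels_a), set(labels_b)
            return len(a & b) / len(a | b) if a | b else 1.0

        def structural_similarity(adj_a, adj_b):
            """1 minus the normalized Frobenius distance of the adjacency matrices."""
            diff = np.linalg.norm(adj_a - adj_b)
            norm = np.linalg.norm(adj_a) + np.linalg.norm(adj_b)
            return 1.0 - diff / norm if norm else 1.0

        def combined_similarity(wf_a, wf_b, alpha=0.5):
            """Convex combination of semantic and structural similarity."""
            return (alpha * semantic_similarity(wf_a["labels"], wf_b["labels"])
                    + (1 - alpha) * structural_similarity(wf_a["adj"], wf_b["adj"]))

        wf1 = {"labels": ["fetch", "clean", "join"],
               "adj": np.array([[0, 1, 0], [0, 0, 1], [0, 0, 0]])}
        wf2 = {"labels": ["fetch", "clean", "plot"],
               "adj": np.array([[0, 1, 1], [0, 0, 1], [0, 0, 0]])}
        print("similarity: %.2f" % combined_similarity(wf1, wf2))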

  14. myExperiment: a repository and social network for the sharing of bioinformatics workflows.

    Science.gov (United States)

    Goble, Carole A; Bhagat, Jiten; Aleksejevs, Sergejs; Cruickshank, Don; Michaelides, Danius; Newman, David; Borkum, Mark; Bechhofer, Sean; Roos, Marco; Li, Peter; De Roure, David

    2010-07-01

    myExperiment (http://www.myexperiment.org) is an online research environment that supports the social sharing of bioinformatics workflows. These workflows are procedures consisting of a series of computational tasks using web services, which may be performed on data from its retrieval, integration and analysis, to the visualization of the results. As a public repository of workflows, myExperiment allows anybody to discover those that are relevant to their research, which can then be reused and repurposed to their specific requirements. Conversely, developers can submit their workflows to myExperiment and enable them to be shared in a secure manner. Since its release in 2007, myExperiment currently has over 3500 registered users and contains more than 1000 workflows. The social aspect to the sharing of these workflows is facilitated by registered users forming virtual communities bound together by a common interest or research project. Contributors of workflows can build their reputation within these communities by receiving feedback and credit from individuals who reuse their work. Further documentation about myExperiment including its REST web service is available from http://wiki.myexperiment.org. Feedback and requests for support can be sent to bugs@myexperiment.org.

  15. A finite element model of the face including an orthotropic skin model under in vivo tension.

    Science.gov (United States)

    Flynn, Cormac; Stavness, Ian; Lloyd, John; Fels, Sidney

    2015-01-01

    Computer models of the human face have the potential to be used as powerful tools in surgery simulation and animation development applications. While existing models accurately represent various anatomical features of the face, the representation of the skin and soft tissues is very simplified. A computer model of the face is proposed in which the skin is represented by an orthotropic hyperelastic constitutive model. The in vivo tension inherent in skin is also represented in the model. The model was tested by simulating several facial expressions by activating appropriate orofacial and jaw muscles. Previous experiments calculated the change in orientation of the long axis of elliptical wounds on patients' faces for wide opening of the mouth and an open-mouth smile (both 30°). These results were compared with the average change of maximum principal stress direction in the skin calculated in the face model for wide opening of the mouth (18°) and an open-mouth smile (25°). The displacements of landmarks on the face for four facial expressions were compared with experimental measurements in the literature. The corner of the mouth in the model experienced the largest displacement for each facial expression (∼11-14 mm). The simulated landmark displacements were within a standard deviation of the measured displacements. Increasing the skin stiffness and skin tension generally resulted in a reduction in landmark displacements upon facial expression.

  16. Flexible End2End Workflow Automation of Hit-Discovery Research.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin

    2014-08-01

    The article considers a new approach to more complex laboratory automation at the workflow layer: the authors propose the automation of end2end workflows. The combination of all relevant subprocesses (whether automated or manually performed, and regardless of the organizational unit in which they run) results in end2end processes that include all result dependencies. The end2end approach focuses not only on the classical experiments in synthesis or screening, but also on auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, the connection of control flow and data flow in the same process model reduces the effort of data transfer between the involved systems, including the necessary data transformations. This end2end laboratory automation can be realized effectively with modern methods of business process management (BPM). The approach is based on a new standardization of the process-modeling notation, Business Process Model and Notation 2.0. In drug discovery, several scientific disciplines act together, with manifold modern methods, technologies, and a wide range of automated instruments, for the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of a high-throughput screening of previously synthesized compound libraries. © 2014 Society for Laboratory Automation and Screening.

  17. A novel workflow for seismic net pay estimation with uncertainty

    CERN Document Server

    Glinsky, Michael E; Unaldi, Muhlis; Nagassar, Vishal

    2016-01-01

    This paper presents a novel workflow for seismic net pay estimation with uncertainty, demonstrated on the Cassra/Iris Field. The theory for the stochastic wavelet derivation (which estimates the seismic noise level along with the wavelet, time-to-depth mapping, and their uncertainties), the stochastic sparse spike inversion, and the net pay estimation (using secant areas) along with its uncertainty will be outlined. This includes benchmarking of the methodology on a synthetic model. A critical part of this process is the calibration of the secant areas, which is done in a two-step process. First, a preliminary calibration is done with stochastic reflection response modeling using rock physics relationships derived from the well logs. Second, a refinement is made to the calibration to account for the net pay encountered at the wells. Finally, a variogram structure is estimated from the extracted secant area map, then used to build in the lateral correlation to the ensemble of net pay maps while matc...

  18. User-Driven Workflow for Modeling, Monitoring, Product Development, and Flood Map Delivery Using Satellites for Daily Coverage Over Texas May-June 2015

    Science.gov (United States)

    Green, D. S.; Frye, S. W.; Wells, G. L.; Adler, R. F.; Brakenridge, R.; Bolten, J. D.; Murray, J. J.; Slayback, D. A.; Kirschbaum, D.; Wu, H.; Cappelaere, P. G.; Schumann, G.; Howard, T.; Flamig, Z.; Clark, R. A.; Stough, T.; Chini, M.; Matgen, P.

    2015-12-01

    Intense rainfall during late April and early May 2015 in Texas and Oklahoma led to widespread flooding in several river basins in that region. Texas state agencies were activated for the May-June floods and the severe weather event that ensued for six weeks from May 8 until June 19 following Tropical Storm Bill. This poster depicts a case study where modeling flood potential informed decision-making authorities for user-driven high resolution satellite acquisitions over the most critical areas, and how experimental flood mapping techniques provided the capability for daily ongoing monitoring of these events through increased automation. Recent improvements in flood models, resulting from higher-frequency updates, better spatial resolution, and more accurate nowcast and forecast precipitation products, were coupled with advanced technology to improve situational awareness for decision makers. These advances enabled satellites to be tasked, data products to be developed and distributed, and feedback loops between the emergency authorities, satellite operators, and mapping researchers to deliver a daily stream of relevant products that informed deployment of emergency resources and improved management of the large-scale event across the local, state, and national levels. This collaboration was made possible through inter-agency cooperation on an international scale through the Committee on Earth Observation Satellites Flood Pilot activity, which is supported in the USA by NASA, NOAA, and USGS and includes numerous civilian space agency assets from the European Space Agency along with national agencies from Italy, France, Germany, Japan, and others. The poster describes the inter-linking technology infrastructure, the development and delivery of mapping products, and the lessons learned for product improvement in the future.

  19. Advanced Workflows for Fluid Transfer in Faulted Basins.

    OpenAIRE

    Thibaut Muriel; Jardin Anne; Faille Isabelle; Willien Françoise; Guichet Xavier

    2014-01-01

    Keywords: basin modeling; faults; software. The traditional 3D basin modeling workflow is made of the following steps: construction of present day basin architecture, reconstruction of the structural evolution through time, together with fluid flow simulation and heat transfers. In this case, the forward simulation is limited to basin architecture, mainly controlled by erosion, sedimentation and vertical compaction. The tectonic deformation is limited to vertical sli...

  20. Advanced Workflows for Fluid Transfer in Faulted Basins

    Directory of Open Access Journals (Sweden)

    Thibaut Muriel

    2014-07-01

    The traditional 3D basin modeling workflow consists of the following steps: construction of present day basin architecture, reconstruction of the structural evolution through time, together with fluid flow simulation and heat transfers. In this case, the forward simulation is limited to basin architecture, mainly controlled by erosion, sedimentation and vertical compaction. The tectonic deformation is limited to vertical slip along faults. Fault properties are modeled as vertical shear zones along which rock permeability is adjusted to enhance fluid flow or prevent flow from escaping. For basins that have experienced a more complex tectonic history, this approach is over-simplified. It fails to capture fluid flow paths arising from the structural evolution of the basin, which affects overpressure build-up and the location of petroleum resources. Over the past years, a new 3D basin forward code has been developed at IFP Energies nouvelles that is based on a cell centered finite volume discretization, which preserves mass on an unstructured grid and describes the various changes in geometry and topology of a basin through time. At the same time, 3D restoration tools based on geomechanical principles of strain minimization were made available that offer a structural scenario at a discrete number of deformation stages of the basin. In this paper, we present workflows integrating these different innovative tools on complex faulted basin architectures, where complex means moderate lateral as well as vertical deformation coupled with dynamic fault property modeling. Two synthetic case studies inspired by real basins have been used to illustrate how to apply the workflow, where the difficulties in the workflows are, and what the added value is compared with previous basin modeling approaches.

  1. Hot DA white dwarf model atmosphere calculations: Including improved Ni PI cross sections

    CERN Document Server

    Preval, S P; Badnell, N R; Hubeny, I; Holberg, J B

    2016-01-01

    To calculate realistic models of objects with Ni in their atmospheres, accurate atomic data for the relevant ionization stages need to be included in model atmosphere calculations. In the context of white dwarf stars, we investigate the effect that changing the Ni IV-VI bound-bound and bound-free atomic data has on model atmosphere calculations. Models including photoionization cross-sections (PICS) calculated with AUTOSTRUCTURE show significant flux attenuation of up to ~80% shortward of 180 Å in the EUV region compared to a model using hydrogenic PICS. Comparatively, models including a larger set of Ni transitions left the EUV, UV, and optical continua unaffected. We use models calculated with permutations of this atomic data to test for potential changes to the measured metal abundances of the hot DA white dwarf G191-B2B. Models including AUTOSTRUCTURE PICS were found to change the abundances of N and O by as much as ~22% compared to models using hydrogenic PICS, but heavier species were relatively unaf...

  2. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    Directory of Open Access Journals (Sweden)

    Miroslaw Luft

    2008-01-01

    The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit, developed with the aid of the state-variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.
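
    The article's Maple procedures are not included in this record; as a rough Python counterpart of the state-variable approach, the sketch below integrates a generic series-parallel resonant circuit (series L and C feeding a parallel R-C load) driven by a square-wave inverter voltage, with all component values illustrative rather than taken from the paper:

        import numpy as np
        from scipy.integrate import solve_ivp

        # Illustrative component values (not from the paper).
        L, CS, CP, R = 100e-6, 1e-6, 0.5e-6, 20.0
        F_SW = 15e3          # inverter switching frequency, Hz
        VDC = 300.0          # DC-link voltage, V

        def v_in(t):
            """Square-wave output of the thyristor bridge."""
            return VDC if (t * F_SW) % 1.0 < 0.5 else -VDC

        def rhs(t, x):
            """State equations for x = [i_L, v_Cs, v_Cp]."""
            i_l, v_cs, v_cp = x
            di = (v_in(t) - v_cs - v_cp) / L        # series inductor
            dvs = i_l / CS                          # series capacitor
            dvp = (i_l - v_cp / R) / CP             # parallel capacitor + load
            return [di, dvs, dvp]

        sol = solve_ivp(rhs, (0.0, 2e-3), [0.0, 0.0, 0.0], max_step=1e-7)
        print("peak load voltage: %.1f V" % np.abs(sol.y[2]).max())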

  3. An Integrated Model for Patient Care and Clinical Trials (IMPACT) to support clinical research visit scheduling workflow for future learning health systems.

    Science.gov (United States)

    Weng, Chunhua; Li, Yu; Berhe, Solomon; Boland, Mary Regina; Gao, Junfeng; Hruby, Gregory W; Steinman, Richard C; Lopez-Jimenez, Carlos; Busacca, Linda; Hripcsak, George; Bakken, Suzanne; Bigger, J Thomas

    2013-08-01

    We describe a clinical research visit scheduling system that can potentially coordinate clinical research visits with patient care visits and increase efficiency at clinical sites where clinical and research activities occur simultaneously. Participatory Design methods were applied to support requirements engineering and to create this software, called the Integrated Model for Patient Care and Clinical Trials (IMPACT). Using a multi-user constraint satisfaction and resource optimization algorithm, IMPACT automatically synthesizes the temporal availability of various research resources and recommends the optimal dates and times for pending research visits. We conducted scenario-based evaluations with 10 clinical research coordinators (CRCs) from diverse clinical research settings to assess the usefulness, feasibility, and user acceptance of IMPACT, and obtained qualitative feedback using semi-structured interviews with the CRCs. Most CRCs acknowledged the usefulness of IMPACT features. Support for collaboration within research teams and interoperability with electronic health records and clinical trial management systems were highly requested features. Overall, IMPACT received satisfactory user acceptance and appears potentially useful for a variety of clinical research settings. Our future work includes comparing the effectiveness of IMPACT with that of existing scheduling solutions on the market and conducting field tests to formally assess user adoption.
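
    The record does not detail the constraint-satisfaction algorithm; as a toy version of the core idea (finding the earliest visit slot available to every required resource), assuming availabilities are given as sets of discrete hourly slots (all names and numbers below are invented):

        def earliest_common_slot(availabilities, visit_window):
            """Return the earliest slot inside the visit window that every
            required resource (room, nurse, coordinator, ...) has free."""
            common = set.intersection(*availabilities)
            candidates = sorted(s for s in common if s in visit_window)
            return candidates[0] if candidates else None

        # Hypothetical hourly slots, numbered from the start of the week.
        room = {9, 10, 11, 14, 15}
        nurse = {10, 11, 13, 14}
        coordinator = {11, 14, 16}
        window = set(range(9, 15))      # protocol allows the visit in hours 9-14

        print(earliest_common_slot([room, nurse, coordinator], window))  # -> 11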

  4. A framework for service enterprise workflow simulation with multi-agents cooperation

    Science.gov (United States)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Process dynamic modelling for service businesses is a key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach for analyzing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses that uses multi-agent cooperation to address these issues. The social rationality of agents is introduced into the proposed framework: adopting rationality as a social factor in decision-making strategies, a flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.
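
    The framework itself is not shown in this record; as a loose toy illustration of agent social rationality in activity scheduling, the sketch below dispatches each activity instance to the agent whose rationality-weighted blend of self-interest (low own load) and group interest (clearing the shared queue) scores highest. All names, weights, and the utility form are invented for illustration:

        import random

        class Agent:
            def __init__(self, name, rationality):
                self.name, self.rationality, self.load = name, rationality, 0.0

            def utility(self, duration):
                """Social rationality: blend self-interest with group interest."""
                self_interest = -(self.load + duration)   # dislikes growing its load
                group_interest = duration                 # values clearing the queue
                return ((1 - self.rationality) * self_interest
                        + self.rationality * group_interest)

        def simulate(activities, agents):
            """Greedy dispatch: each activity goes to the agent with best utility."""
            for duration in activities:
                best = max(agents, key=lambda a: a.utility(duration))
                best.load += duration
                print("activity(%4.1f h) -> %s" % (duration, best.name))

        random.seed(1)
        agents = [Agent("selfish", 0.1), Agent("cooperative", 0.9)]
        simulate([random.uniform(1, 5) for _ in range(6)], agents)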

  5. A Workflow for UAVs' Integration Into a Geodesign Platform

    Science.gov (United States)

    Anca, P.; Calugaru, A.; Alixandroae, I.; Nazarie, R.

    2016-06-01

    This paper presents a workflow for the development of various Geodesign scenarios. The subject is important in the context of identifying patterns and designing solutions for a Smart City with optimized public transportation, efficient buildings, efficient utilities, recreational facilities and so on. The workflow describes the procedures starting with acquiring data in the field, data processing, orthophoto generation, DTM generation, and integration into a GIS platform and analysis, to better support Geodesign. Esri's CityEngine is used mostly for its 3D modeling capabilities, which enable the user to obtain realistic 3D models. The workflow uses as inputs information extracted from images acquired using UAV technologies, namely eBee, existing 2D GIS geodatabases, and a set of CGA rules. The method we then use, called procedural modeling, applies rules to extrude buildings, the street network, parcel zoning and side details, based on the initial attributes from the geodatabase. The resulting products are various scenarios for redesigning and for analyzing new exploitation sites. Finally, these scenarios can be published as interactive web scenes for internal, group, or public consultation. In this way, problems like the impact of new constructions being built, rearranging green spaces or changing routes for public transportation are revealed through impact, visibility, or shadowing analyses and are brought to citizens' attention. This leads to better decisions.

  6. A Verilog-A large signal model for InP DHBT including thermal effects

    Science.gov (United States)

    Yuxia, Shi; Zhi, Jin; Zhijian, Pan; Yongbo, Su; Yuxiong, Cao; Yan, Wang

    2013-06-01

    A large signal model for InP/InGaAs double heterojunction bipolar transistors (DHBTs) including thermal effects is reported, which demonstrates good agreement between simulations and measurements. Building on a previous model, in which the double heterojunction effect, the current blocking effect and the high current effect are considered in the current expression, the effect of bandgap narrowing with temperature has been incorporated into the transport current, and formulas for the model parameters as functions of temperature have been developed. The model is implemented in Verilog-A and embedded in ADS. The proposed model is verified against DC and large signal measurements.
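
    The Verilog-A source is not part of this record; as a minimal sketch of one ingredient (bandgap narrowing with temperature via the standard Varshni relation, feeding a temperature-scaled transport saturation current of the usual diode-like form), with all parameter values illustrative GaAs-like numbers rather than extracted ones:

        import numpy as np

        K_B = 8.617e-5                 # Boltzmann constant, eV/K

        def bandgap_varshni(T, eg0=1.519, alpha=5.405e-4, beta=204.0):
            """Varshni bandgap narrowing, illustrative GaAs parameters (eV)."""
            return eg0 - alpha * T**2 / (T + beta)

        def transport_saturation_current(T, is_300, xti=3.0):
            """Scale the transport saturation current from 300 K to temperature T,
            using the usual temperature law with the Varshni bandgap."""
            eg = bandgap_varshni(T)
            return is_300 * (T / 300.0) ** xti * np.exp(eg / K_B * (1 / 300.0 - 1 / T))

        for T in (300.0, 350.0, 400.0):
            print("T=%.0f K  Eg=%.3f eV  Is=%.2e A"
                  % (T, bandgap_varshni(T), transport_saturation_current(T, 1e-18)))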

  7. Modelling Mediterranean agro-ecosystems by including agricultural trees in the LPJmL model

    Science.gov (United States)

    Fader, M.; von Bloh, W.; Shi, S.; Bondeau, A.; Cramer, W.

    2015-11-01

    In the Mediterranean region, climate and land use change are expected to impact natural and agricultural ecosystems through warming, reduced rainfall, direct degradation of ecosystems, and biodiversity loss. Human population growth and socioeconomic changes, notably on the eastern and southern shores, will require increases in food production and put additional pressure on agro-ecosystems and water resources. Coping with these challenges requires informed decisions that, in turn, require assessments by means of a comprehensive agro-ecosystem and hydrological model. This study presents the inclusion of 10 Mediterranean agricultural plants, mainly perennial crops, in an agro-ecosystem model (Lund-Potsdam-Jena managed Land - LPJmL): nut trees, date palms, citrus trees, orchards, olive trees, grapes, cotton, potatoes, vegetables and fodder grasses. The model was successfully tested on three model outputs: agricultural yields, irrigation requirements and soil carbon density. With the development presented in this study, LPJmL is now able to simulate in good detail and mechanistically the functioning of Mediterranean agriculture, with a comprehensive representation of ecophysiological processes for all vegetation types (natural and agricultural) and in a consistent framework that produces estimates of carbon, agricultural and hydrological variables for the entire Mediterranean basin. This development paves the way for further model extensions aiming at the representation of alternative agro-ecosystems (e.g. agroforestry), and opens the door for a large number of applications in the Mediterranean region, for example assessments of the consequences of land use transitions, the influence of management practices and climate change impacts.

  8. Modelling Mediterranean agro-ecosystems by including agricultural trees in the LPJmL model

    Directory of Open Access Journals (Sweden)

    M. Fader

    2015-06-01

    Climate and land use change in the Mediterranean region is expected to affect natural and agricultural ecosystems through decreases in precipitation, increases in temperature, as well as biodiversity loss and anthropogenic degradation of natural resources. Demographic growth on the eastern and southern shores will require increases in food production and put additional pressure on agro-ecosystems and water resources. Coping with these challenges requires informed decisions that, in turn, require assessments by means of a comprehensive agro-ecosystem and hydrological model. This study presents the inclusion of 10 Mediterranean agricultural plants, mainly perennial crops, in an agro-ecosystem model (LPJmL): nut trees, date palms, citrus trees, orchards, olive trees, grapes, cotton, potatoes, vegetables and fodder grasses. The model was successfully tested on three model outputs: agricultural yields, irrigation requirements and soil carbon density. With the development presented in this study, LPJmL is now able to simulate in good detail and mechanistically the functioning of Mediterranean agriculture, with a comprehensive representation of ecophysiological processes for all vegetation types (natural and agricultural) and in a consistent framework that produces estimates of carbon, agricultural and hydrological variables for the entire Mediterranean basin. This development paves the way for further model extensions aiming at the representation of alternative agro-ecosystems (e.g. agroforestry), and opens the door for a large number of applications in the Mediterranean region, for example assessments of the consequences of land use transitions, the influence of management practices and climate change impacts.

  9. Numerical Acoustic Models Including Viscous and Thermal losses: Review of Existing and New Methods

    DEFF Research Database (Denmark)

    Andersen, Peter Risby; Cutanda Henriquez, Vicente; Aage, Niels

    2017-01-01

    This work presents an updated overview of numerical methods including acoustic viscous and thermal losses. Numerical modelling of viscothermal losses has gradually become more important due to the general trend of making acoustic devices smaller. Not including viscothermal acoustic losses in such...

  10. A Community-Driven Workflow Recommendations and Reuse Infrastructure

    Science.gov (United States)

    Zhang, J.; Votava, P.; Lee, T. J.; Lee, C.; Xiao, S.; Nemani, R. R.; Foster, I.

    2013-12-01

    Aiming to connect the Earth science community and accelerate the rate of discovery, the NASA Earth Exchange (NEX) has established an online repository and platform so that researchers can publish and share their tools and models with colleagues. In recent years, workflows have become a popular technique at NEX for Earth scientists to define executable multi-step procedures for data processing and analysis. The ability to discover and reuse knowledge (sharable workflows) is critical to the future advancement of science. However, as reported in our earlier study, the reusability of scientific artifacts is currently very low. Scientists often do not feel confident in using other researchers' tools and utilities. One major reason is that researchers are often unaware of the existence of others' data preprocessing processes; meanwhile, researchers often do not have time to fully document the processes and expose them to others in a standard way. These issues cannot be overcome by the existing workflow search technologies used in NEX and other data projects. This project therefore aims to develop a proactive recommendation technology based on collective NEX user behaviors; in this way, we aim to promote and encourage process and workflow reuse within NEX. In particular, we focus on leveraging peer scientists' best practices to support the recommendation of artifacts developed by others. Our underlying theoretical foundation is rooted in social cognitive theory, which holds that people learn by watching what others do. Our fundamental hypothesis is that sharable artifacts have network properties, much like humans in social networks. More generally, reusable artifacts form various types of social relationships (ties), and may be viewed as forming what organizational sociologists who use network analysis to study human interactions call a 'knowledge network.' In particular, we will tackle two research questions: R1: What hidden knowledge may be extracted from

  11. Data Processing Workflows to Support Reproducible Data-driven Research in Hydrology

    Science.gov (United States)

    Goodall, J. L.; Essawy, B.; Xu, H.; Rajasekar, A.; Moore, R. W.

    2015-12-01

    Geoscience analyses often require the use of existing data sets that are large, heterogeneous, and maintained by different organizations. A particular challenge in creating reproducible analyses using these data sets is automating the workflows required to transform raw datasets into model specific input files and finally into publication ready visualizations. Data grids, such as the Integrated Rule-Oriented Data System (iRODS), are architectures that allow scientists to access and share large data sets that are geographically distributed on the Internet, but appear to the scientist as a single file management system. The DataNet Federation Consortium (DFC) project is built on iRODS and aims to demonstrate data and computational interoperability across scientific communities. This paper leverages iRODS and the DFC to demonstrate how hydrological modeling workflows can be encapsulated as workflows using the iRODS concept of Workflow Structured Objects (WSO). An example use case is presented for automating hydrologic model post-processing routines that demonstrates how WSOs can be created and used within the DFC to automate the creation of data visualizations from large model output collections. By co-locating the workflow used to create the visualization with the data collection, the use case demonstrates how data grid technology aids in reuse, reproducibility, and sharing of workflows within scientific communities.
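
    The iRODS rule code behind a Workflow Structured Object is not included in this record; as a plain-Python stand-in for the post-processing workflow it describes (turning a collection of model output files into figures, so that the recipe lives next to the data), assuming hypothetical CSV outputs with "time" and "discharge" columns:

        from pathlib import Path

        import matplotlib
        matplotlib.use("Agg")
        import matplotlib.pyplot as plt
        import pandas as pd

        def visualize_collection(collection_dir, out_dir="figures"):
            """Render one hydrograph per model output file in the collection.
            In the DFC setup this logic would be registered as a Workflow
            Structured Object stored alongside the data collection itself."""
            Path(out_dir).mkdir(exist_ok=True)
            for csv in sorted(Path(collection_dir).glob("*.csv")):
                run = pd.read_csv(csv)                      # columns: time, discharge
                fig, ax = plt.subplots()
                ax.plot(run["time"], run["discharge"])
                ax.set_xlabel("time (h)")
                ax.set_ylabel("discharge (m^3/s)")
                ax.set_title(csv.stem)
                fig.savefig(Path(out_dir) / (csv.stem + ".png"), dpi=150)
                plt.close(fig)

        visualize_collection("model_outputs")   # hypothetical collection path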

  12. WorkflowNet2BPEL4WS: A Tool for Translating Unstructured Workflow Processes to Readable BPEL

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M. P.

    2007-01-01

    code and not easy to use by end-users. Therefore, we provide a mapping from WF-nets to BPEL. This mapping builds on the rich theory of Petri nets and can also be used to map other languages (e.g., UML, EPC, BPMN, etc.) onto BPEL. To evaluate WorkflowNet2BPEL4WS we used more than 100 processes modeled...

  13. Enhanced Phosphoproteomic Profiling Workflow For Growth Factor Signaling Analysis

    DEFF Research Database (Denmark)

    Sylvester, Marc; Burbridge, Mike; Leclerc, Gregory;

    2010-01-01

    Background: Our understanding of complex signaling networks is still fragmentary. Isolated processes have been studied extensively, but cross-talk is omnipresent and precludes intuitive predictions of signaling outcomes. The need for quantitative data on dynamic systems is apparent especially for our... A549 lung carcinoma cells were used as a model and stimulated with hepatocyte growth factor, epidermal growth factor or fibroblast growth factor. We employed a quick protein digestion workflow with spin filters without using urea. Phosphopeptides in general were enriched by sequential elution from... transfer dissociation adds confidence in modification site assignment. The workflow is relatively simple, but the integration of complementary techniques leads to a deeper insight into cellular signaling networks and the potential pharmacological intervention thereof.

  14. Networked Print Production: Does JDF Provide a Perfect Workflow?

    Directory of Open Access Journals (Sweden)

    Bernd Zipper

    2004-12-01

    The "networked printing works" is a well-worn slogan used by many providers in the graphics industry, and for the past number of years printing-works manufacturers have been working towards the goal of the "networked printing works". A turning point from concept to real implementation can now be expected at drupa 2004: JDF (Job Definition Format) and thus "networked production" will form the center of interest here. The first approaches towards a complete, networked workflow between prepress, print and postpress in production are already available; the products and solutions will now be presented publicly at drupa 2004. So, drupa 2004 will undoubtedly be the "JDF-drupa" - the drupa where machines learn to communicate with each other digitally, and where the dream of general system and job communication in the printing industry can first be realized. CIP3, which has since been renamed CIP4, is an international consortium of leading manufacturers from the printing and media industry who have taken on the task of integrating processes for prepress, print and postpress. The association, to which nearly all manufacturers in the graphics industry belong, succeeded with CIP3 in developing a first international standard for the transmission of control data in the print workflow. Further development of the CIP4 standard now includes a more extensive "system language" called JDF, which will guarantee workflow communication beyond manufacturer boundaries. However, not only data for actual print production will be communicated with JDF (Job Definition Format): planning and calculation data for MIS (Management Information Systems) and calculation systems will also be prepared. The German printing specialist Hans-Georg Wenke defines JDF as follows: "JDF takes over data from MIS for machines, aggregates and their control desks, data exchange within office applications, and finally ensures that data can be incorporated in the technical workflow

  15. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert [UConn Health, Department of Molecular Biology and Biophysics (United States); Martyn, Timothy O. [Rensselaer at Hartford, Department of Engineering and Science (United States); Ellis, Heidi J. C. [Western New England College, Department of Computer Science and Information Technology (United States); Gryk, Michael R., E-mail: gryk@uchc.edu [UConn Health, Department of Molecular Biology and Biophysics (United States)

    2015-07-15

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB give it novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features that promote collaboration, education, and parameterization, supports non-uniformly sampled data sets, and integrates processing with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  16. How Workflow Systems Facilitate Business Process Reengineering and Improvement

    Directory of Open Access Journals (Sweden)

    Mohamed El Khadiri

    2012-03-01

    This paper investigates the relationship between workflow systems and business process reengineering and improvement. The study is based on a real case study at the Centre Régional d'Investissement (CRI) of Marrakech, Morocco. The CRI is entrusted with coordinating various investment projects at the regional level. Our previous work has shown that a workflow system can be a basis for business process reengineering. However, for continuous process improvement, the system has proven insufficient, as it fails to deal with the exceptions and problem resolutions that informal communications provide. When this system is augmented with an expanded corporate memory system that includes social tools to capture informal communication and data, we are closer to a more complete system that facilitates business process reengineering and improvement.

  17. Staffing and Workflow of a Maturing Institutional Repository

    Directory of Open Access Journals (Sweden)

    Debora L. Madsen

    2013-02-01

    Institutional repositories (IRs) have become established components of many academic libraries. As an IR matures it will face the challenge of how to scale up its operations to increase the amount and types of content archived. These challenges involve staffing, systems, workflows, and promotion. In the past eight years, Kansas State University's IR (K-REx) has grown from a platform for student theses, dissertations, and reports to also include faculty works. The initial workforce of a single faculty member was expanded as a part of a library-wide reorganization, resulting in a cross-departmental team that is better able to accommodate the expansion of the IR. The resultant need to define staff responsibilities and develop resources to manage the workflows has led to the innovations described here, which may prove useful to the greater library community as other IRs mature.

  18. Catalytic conversion of lignin pyrolysis model compound- guaiacol and its kinetic model including coke formation

    Science.gov (United States)

    Zhang, Huiyan; Wang, Yun; Shao, Shanshan; Xiao, Rui

    2016-11-01

    Lignin is the component of biomass that is the most difficult to convert and the most prone to coking during catalytic pyrolysis to high-value liquid fuels and chemicals. Catalytic conversion of guaiacol as a lignin model compound was conducted in a fixed-bed reactor over ZSM-5 to investigate its conversion and coking behaviors. The effects of temperature, weight hourly space velocity (WHSV) and partial pressure on product distribution were studied. The results show that the maximum aromatic carbon yield of 28.55% was obtained at a temperature of 650 °C, WHSV of 8 h⁻¹ and partial pressure of 2.38 kPa, while the coke carbon yield was 19.55%. The reaction pathway was inferred to be removal of the methoxy group to form phenols, followed by aromatization to form aromatics. The amount of coke increased with increasing reaction time. The surface area and acidity of the catalysts declined as coke formed on the acid sites and blocked the pore channels, which led to the decrease in aromatic yields. Finally, a kinetic model of guaiacol catalytic conversion considering coke deposition was built based on the above reaction pathway to predict product distribution. The experimental data and model predictions agreed well; the correlation coefficients of all equations were higher than 0.90.
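
    The paper's rate equations are not given in this record; as a generic sketch of the kind of model described (first-order conversion of guaiacol over a catalyst whose activity decays as coke accumulates), with invented rate constants:

        import numpy as np
        from scipy.integrate import solve_ivp

        # Illustrative rate constants (not fitted values from the paper).
        K_REACT = 2.0     # 1/min, intrinsic conversion rate of guaiacol
        K_COKE = 0.15     # 1/min, rate at which coking consumes activity

        def rhs(t, y):
            """y = [guaiacol fraction c, catalyst activity a].
            Conversion is first order in c and proportional to activity a;
            coke formation degrades the activity itself."""
            c, a = y
            dc = -K_REACT * a * c
            da = -K_COKE * a * c        # coking scales with reactant contact
            return [dc, da]

        sol = solve_ivp(rhs, (0.0, 60.0), [1.0, 1.0], dense_output=True)
        for t in (0, 15, 30, 60):
            c, a = sol.sol(t)
            print("t=%2d min  guaiacol left=%.3f  activity=%.3f" % (t, c, a))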

  19. Including operational data in QMRA model: development and impact of model inputs.

    Science.gov (United States)

    Jaidi, Kenza; Barbeau, Benoit; Carrière, Annie; Desjardins, Raymond; Prévost, Michèle

    2009-03-01

    A Monte Carlo model, based on the Quantitative Microbial Risk Analysis (QMRA) approach, has been developed to assess the relative risks of infection associated with the presence of Cryptosporidium and Giardia in drinking water. The impact of various approaches for modelling the initial parameters of the model on the final risk assessments is evaluated. The Monte Carlo simulations that we performed showed that the occurrence of parasites in raw water was best described by a mixed distribution: log-normal for concentrations above the detection limit (DL), and a uniform distribution for concentrations below the DL. The choice among approaches for modelling the remaining parameters changed the estimated risks significantly. The mean annual risks for conventional treatment are: 1.97E-03 (removal credit adjusted by log parasite = log spores), 1.58E-05 (log parasite = 1.7 x log spores) or 9.33E-03 (regulatory credits based on the turbidity measurement in filtered water). Using full-scale validated SCADA data, the simplified calculation of CT performed at the plant was shown to largely underestimate the risk relative to a more detailed CT calculation that takes into consideration the downtime and system failure events identified at the plant (1.46E-03 vs. 3.93E-02 for the mean risk).
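
    The model's actual distributions and dose-response parameters are not reproduced in this record; as a bare-bones illustration of the QMRA Monte Carlo structure (mixed source-water distribution, log-removal treatment credit, exponential dose-response, annualized risk), with all numbers below invented:

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000

        # Source water concentration (oocysts/L): mixed distribution, with a
        # uniform draw below the detection limit DL and log-normal above it.
        DL, P_BELOW_DL = 0.1, 0.4
        below = rng.uniform(0.0, DL, N)
        above = rng.lognormal(mean=np.log(0.5), sigma=1.0, size=N)
        conc = np.where(rng.random(N) < P_BELOW_DL, below, above)

        log_removal = rng.normal(3.0, 0.5, N)      # treatment credit (log10 units)
        conc_treated = conc / 10.0 ** log_removal

        VOLUME = 1.0                 # litres of unboiled tap water per day
        R_DOSE = 0.004               # exponential dose-response parameter
        daily_dose = conc_treated * VOLUME
        p_inf_daily = 1.0 - np.exp(-R_DOSE * daily_dose)
        p_inf_annual = 1.0 - (1.0 - p_inf_daily) ** 365

        print("mean annual infection risk: %.2e" % p_inf_annual.mean())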

  20. SHIWA Services for Workflow Creation and Sharing in Hydrometeorology

    Science.gov (United States)

    Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely

    2014-05-01

    Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCIs) to access large pools of resources and services. Running these experiments requires specific expertise that they may not have. Workflows can hide resources and services behind a virtualisation layer, providing a user interface that researchers can use. There are many scientific workflow systems, but they are not interoperable. Learning a workflow system and creating workflows may require significant effort; given this effort, it is not reasonable to expect that researchers will learn new workflow systems in order to run workflows developed in other workflow systems. Overcoming this requires workflow interoperability solutions that allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability (CGI) concept, which enables recycling and sharing workflows of different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept, integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept, and the DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution and monitoring through a Graphical User Interface, using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal descriptions of workflows and workflow engines, plus the executables and data needed to execute them, and offers a wide range of browse and search operations. To support non-native workflow execution, the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository. This service either invokes locally or remotely

  1. Nonlinear Modeling of a High Precision Servo Injection Molding Machine Including Novel Molding Approach

    Institute of Scientific and Technical Information of China (English)

    何雪松; 王旭永; 冯正进; 章志新; 杨钦廉

    2003-01-01

    A nonlinear mathematical model of the injection molding process for an electrohydraulic servo injection molding machine (IMM) is developed. It was found necessary to consider the characteristics of the asymmetric cylinder for the electrohydraulic servo IMM. The model is based on the dynamics of the machine, including the servo valve, asymmetric cylinder and screw, and the non-Newtonian flow behavior of the polymer melt in injection molding is also considered. The performance of the model was evaluated using a novel molding approach (injection and compression molding), and the results of simulation and experimental data demonstrate the effectiveness of the model.

  2. A Better Description of Liquid Jet Breakup Using a Spatial Model Including Viscous Effects.

    Science.gov (United States)

    Hammerschlag, William Brian

    Theoretical models describing the operation and disintegration of a liquid jet are often based on an approximate solution of an inviscid jet in the temporal frame of reference. These models provide only a fair first-order prediction of growth rate and breakoff length, and are based solely on a surface-tension-induced instability. A spatial model yielding jet growth rate and including both jet and surrounding-atmosphere viscosity and density is now developed. This model is seen to reproduce all the features and limitations of the Weber viscous jet theory. When tested against experiments with water, water-glycerol mixtures and binary eutectic tin/lead solder, only fair agreement is observed.

  3. Workflow management in large distributed systems

    Science.gov (United States)

    Legrand, I.; Newman, H.; Voicu, R.; Dobre, C.; Grigoras, C.

    2011-12-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near real time. All the monitoring information gathered for these subsystems is essential for developing the required higher-level services (the components that provide decision support and some degree of automated decisions) and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services, including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resources to running jobs, and automated management of remote services among a large set of grid facilities.

  4. Formal Verification of Temporal Properties for Reduced Overhead in Grid Scientific Workflows

    Institute of Scientific and Technical Information of China (English)

    Jun-Wei Cao; Fan Zhang; Ke Xu; Lian-Chen Liu; Cheng Wu

    2011-01-01

    With the quick development of grid techniques and the growing complexity of grid applications, it is becoming critical to reason about temporal properties of grid workflows in order to probe potential pitfalls and errors, and to ensure reliability and trustworthiness at the initial design phase. A state Pi calculus is proposed and implemented in this work, which not only enables flexible abstraction and management of historical grid system events, but also facilitates modeling and temporal verification of grid workflows. Furthermore, a relaxed region analysis (RRA) approach is proposed to decompose large-scale grid workflows into sequentially composed regions with relaxation of parallel workflow branches, and corresponding verification strategies are also decomposed following modular verification principles. Performance evaluation results show that the RRA approach can dramatically reduce the CPU time and memory usage of formal verification.

  5. Including hydrological self-regulating processes in peatland models: Effects on peatmoss drought projections.

    Science.gov (United States)

    Nijp, Jelmer J; Metselaar, Klaas; Limpens, Juul; Teutschbein, Claudia; Peichl, Matthias; Nilsson, Mats B; Berendse, Frank; van der Zee, Sjoerd E A T M

    2017-02-15

    The water content of the topsoil is one of the key factors controlling biogeochemical processes, greenhouse gas emissions and biosphere-atmosphere interactions in many ecosystems, particularly in northern peatlands. In these wetland ecosystems, the water content of the photosynthetically active peatmoss layer is crucial for ecosystem functioning and carbon sequestration, and is sensitive to future shifts in rainfall and drought characteristics. Current peatland models differ in the degree to which hydrological feedbacks are included, but how this affects peatmoss drought projections is unknown. The aim of this paper was to systematically test whether the level of hydrological detail in models could bias projections of water content and drought stress for peatmoss in northern peatlands, using downscaled projections for rainfall and potential evapotranspiration in the current (1991-2020) and future climate (2061-2090). We considered four model variants that either include or exclude moss (rain)water storage and peat volume change, as these are two central processes in the hydrological self-regulation of peatmoss carpets. Model performance was validated using field data from a peatland in northern Sweden. Including moss water storage as well as peat volume change resulted in a significant improvement of model performance, despite the extra parameters added. The best performance was achieved if both processes were included. Including moss water storage and peat volume change consistently reduced projected peatmoss drought frequency by more than 50%, relative to the model excluding both processes. Projected peatmoss drought frequency in the growing season was 17% smaller under the future climate than the current climate, but was unaffected by including the hydrological self-regulating processes. Our results suggest that ignoring these two fine-scale processes important in the hydrological self-regulation of northern peatlands will have large consequences for projected climate change impact on

  6. Modeling an elastic beam with piezoelectric patches by including magnetic effects

    CERN Document Server

    Ozer, A O

    2014-01-01

    Models for piezoelectric beams using Euler-Bernoulli small displacement theory predict the dynamics of slender beams at low frequencies accurately but are insufficient for beams vibrating at high frequencies or beams with low length-to-width aspect ratios. A more thorough model that includes the effects of rotational inertia and shear strain, Mindlin-Timoshenko small displacement theory, is needed to predict the dynamics more accurately in these cases. Moreover, existing models ignore magnetic effects, since the magnetic effects are relatively small. However, it was shown recently [O-M1] that these effects can substantially change the controllability and stabilizability properties of even a single piezoelectric beam. In this paper, we use a variational approach to derive models that include magnetic effects for an elastic beam with two piezoelectric patches actuated by different voltage sources. Both Euler-Bernoulli and Mindlin-Timoshenko small displacement theories are considered. Due to the magne...

  7. Stability analysis of the extended ADI-FDTD technique including lumped models

    Institute of Scientific and Technical Information of China (English)

    CHEN ZhiHui; CHU QingXin

    2008-01-01

    The numerical stability of the extended alternating-direction-implicit finite-difference time-domain (ADI-FDTD) method including lumped models is analyzed. Three common lumped models are investigated: resistor, capacitor, and inductor, and three different formulations for each model are analyzed: the explicit, semi-implicit and implicit schemes. Analysis results show that the extended ADI-FDTD algorithm is not unconditionally stable in the explicit scheme case, and the stability criterion depends on the values of the lumped models, but in the semi-implicit and implicit cases, the algorithm is stable. Finally, two simple microstrip circuits including lumped elements are simulated to demonstrate the validity of the theoretical results.

  8. The Dynamic Modeling of Multiple Pairs of Spur Gears in Mesh, Including Friction and Geometrical Errors

    Directory of Open Access Journals (Sweden)

    Shengxiang Jia

    2003-01-01

    Full Text Available This article presents a dynamic model of three shafts and two pairs of gears in mesh, with 26 degrees of freedom, including the effects of variable tooth stiffness, pitch and profile errors, friction, and a localized tooth crack on one of the gears. The article also details how geometrical errors in teeth can be included in a model. The model incorporates the effects of variations in torsional mesh stiffness in gear teeth by using a common formula to describe the stiffness that occurs as the gears mesh together. The comparison between the presence and absence of geometrical errors in teeth was made by using Matlab and Simulink models, which were developed from the equations of motion. The effects of pitch and profile errors on the coherent signal average of the input pinion's angular velocity are discussed by investigating some of the common diagnostic functions and changes to the frequency spectra results.
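
The 26-DOF model itself is not given in the record; as a flavor of the approach, here is a heavily reduced single-mesh sketch with time-varying mesh stiffness and a tooth-error excitation, with every parameter value illustrative rather than taken from the article:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Reduced single-mesh torsional model (1 DOF along the line of action),
# not the article's 26-DOF model; all parameters are illustrative.
m_e = 0.05                 # equivalent mass (kg)
c = 80.0                   # mesh damping (N s/m)
k_mean, k_amp = 2e8, 4e7   # mean and alternating mesh stiffness (N/m)
Z, rpm = 25, 1500          # teeth on pinion, shaft speed
f_mesh = Z * rpm / 60.0    # gear mesh frequency (Hz)
F = 500.0                  # static transmitted force (N)

def k(t):
    # time-varying mesh stiffness as teeth come in and out of contact
    return k_mean + k_amp * np.sign(np.sin(2 * np.pi * f_mesh * t))

def e(t):
    # combined pitch/profile error along the line of action (m), illustrative
    return 5e-6 * np.sin(2 * np.pi * (rpm / 60.0) * t)

def rhs(t, y):
    x, v = y
    return [v, (F - c * v - k(t) * (x - e(t))) / m_e]

sol = solve_ivp(rhs, (0.0, 0.1), [F / k_mean, 0.0], max_step=1e-5, rtol=1e-8)
print("peak-to-peak dynamic transmission error: %.3e m" % np.ptp(sol.y[0]))
```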

  9. SAMI2-PE: A model of the ionosphere including multistream interhemispheric photoelectron transport

    Science.gov (United States)

    Varney, R. H.; Swartz, W. E.; Hysell, D. L.; Huba, J. D.

    2012-06-01

    In order to improve model comparisons with recently improved incoherent scatter radar measurements at the Jicamarca Radio Observatory, we have added photoelectron transport and energy redistribution to the two-dimensional SAMI2 ionospheric model. The photoelectron model uses multiple pitch angle bins, includes effects associated with curved magnetic field lines, and uses an energy degradation procedure which conserves energy on coarse, non-uniformly spaced energy grids. The photoelectron model generates secondary electron production rates and thermal electron heating rates which are then passed to the fluid equations in SAMI2. We then compare electron and ion temperatures and electron densities of this modified SAMI2 model with measurements of these parameters over a range of altitudes from 90 km to 1650 km (L = 1.26) over a 24 hour period. The new electron heating model is a significant improvement over the semi-empirical model used in SAMI2. The electron temperatures above the F-peak from the modified model qualitatively reproduce the shape of the measurements as functions of time and altitude and quantitatively agree with the measurements to within ~30% or better during the entire day, including during the rapid temperature increase at dawn.

  10. Workflow in clinical trial sites & its association with near miss events for data quality: ethnographic, workflow & systems simulation.

    Directory of Open Access Journals (Sweden)

    Elias Cesar Araujo de Carvalho

    Full Text Available BACKGROUND: With the exponential expansion of clinical trials conducted in BRIC (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and the European Union. However, questions still remain regarding the quality of data being collected in these countries. We used ethnographic, mapping and computer simulation studies to identify and address areas of threat to near miss events for data quality in two cancer trial sites in Brazil. METHODOLOGY/PRINCIPAL FINDINGS: Two sites in São Paulo and Rio de Janeiro were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats to near miss events for data quality were derived from observations. They were then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow: evaluation of potential research subjects prior to signature of informed consent, visits to obtain the subject's informed consent, regular data collection sessions following the study protocol, and closure of the study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of a standardized process for data registration at the source document, (b) multiplicity of data repositories and (c) scarcity of decision support systems at the point of research intervention. Simulation with the policy model demonstrates a reduction of the rework problem. CONCLUSIONS/SIGNIFICANCE: Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. Clinical trial site managers need to reorganize staff workflow by using information technology more efficiently, establish new standard procedures and

  11. Workflow-driven clinical decision support for personalized oncology.

    Science.gov (United States)

    Bucur, Anca; van Leeuwen, Jasper; Christodoulou, Nikolaos; Sigdel, Kamana; Argyri, Katerina; Koumakis, Lefteris; Graf, Norbert; Stamatakos, Georgios

    2016-07-21

    The adoption in oncology of Clinical Decision Support (CDS) may help clinical users to efficiently deal with the high complexity of the domain, lead to improved patient outcomes, and reduce the current knowledge gap between clinical research and practice. While significant effort has been invested in the implementation of CDS, the uptake in the clinic has been limited. The barriers to adoption have been extensively discussed in the literature. In oncology, current CDS solutions are not able to support the complex decisions required for stratification and personalized treatment of patients and to keep up with the high rate of change in therapeutic options and knowledge. To address these challenges, we propose a framework enabling efficient implementation of meaningful CDS that incorporates a large variety of clinical knowledge models to bring to the clinic comprehensive solutions leveraging the latest domain knowledge. We use both literature-based models and models built within the p-medicine project using the rich datasets from clinical trials and care provided by the clinical partners. The framework is open to the biomedical community, enabling reuse of deployed models by third-party CDS implementations and supporting collaboration among modelers, CDS implementers, biomedical researchers and clinicians. To increase adoption and cope with the complexity of patient management in oncology, we also support and leverage the clinical processes adhered to by healthcare organizations. We design an architecture that extends the CDS framework with workflow functionality. The clinical models are embedded in the workflow models and executed at the right time, when and where the recommendations are needed in the clinical process. In this paper we present our CDS framework developed in p-medicine and the CDS implementation leveraging the framework. To support complex decisions, the framework relies on clinical models that encapsulate relevant clinical knowledge. Next to

  12. General spherical anisotropic Jeans models of stellar kinematics: including proper motions and radial velocities

    CERN Document Server

    Cappellari, Michele

    2015-01-01

    Cappellari (2008) presented a flexible and efficient method to model the stellar kinematics of anisotropic axisymmetric and spherical stellar systems. The spherical formalism could be used to model the line-of-sight velocity second moments allowing for essentially arbitrary radial variation in the anisotropy and general luminous and total density profiles. Here we generalize the spherical formalism by providing the expressions for all three components of the projected second moments, including the two proper motion components. A reference implementation is now included in the public JAM package available at http://purl.org/cappellari/software

  13. Modeling Within-Host Dynamics of Influenza Virus Infection Including Immune Responses

    OpenAIRE

    Pawelek, Kasia A.; Huynh, Giao T; Michelle Quinlivan; Ann Cullinane; Libin Rong; Perelson, Alan S.

    2012-01-01

    Influenza virus infection remains a public health problem worldwide. The mechanisms underlying viral control during an uncomplicated influenza virus infection are not fully understood. Here, we developed a mathematical model including both innate and adaptive immune responses to study the within-host dynamics of equine influenza virus infection in horses. By comparing modeling predictions with both interferon and viral kinetic data, we examined the relative roles of target cell availability, ...

  14. A lumped element transformer model including core losses and winding impedances

    OpenAIRE

    Ribbenfjärd, David

    2007-01-01

    In order to design a power transformer it is important to understand its internal electromagnetic behaviour. That can be obtained by measurements on physical transformers, analytical expressions and computer simulations. One benefit with simulations is that the transformer can be studied before it is built physically and that the consequences of changing dimensions and parameters easily can be tested. In this thesis a time-domain transformer model is presented. The model includes core losses ...

  15. Target echo strength modelling at FOI, including results from the BeTSSi II workshop

    CERN Document Server

    Östberg, Martin

    2016-01-01

    An overview of the target echo strength (TS) modelling capacity at the Swedish Defence Research Agency (FOI) is presented. The modelling methods described range from approximate ones, such as ray-tracing and Kirchhoff approximation codes, to high-accuracy full-field codes including boundary integral equation methods and finite element methods. Illustrations of the applicability of the codes are given for a few simple cases tackled during the BeTSSi II (Benchmark Target Echo Strength Simulation) workshop held in Kiel in 2014.

  16. Including leakage in network models: an application to calibrate leak valves in EPANET

    OpenAIRE

    Cobacho Jordán, Ricardo; Arregui de la Cruz, Francisco; Soriano Olivares, Javier; Cabrera Rochera, Enrique

    2015-01-01

    EPANET is one of the most widely used software packages for water network hydraulic modelling, and is especially interesting for educational and research purposes because it is in the public domain. However, EPANET simulations are demand-driven, and the program does not include a specific functionality to model water leakage, which is pressure-driven. Consequently, users are required to deal with this drawback by themselves. As a general solution for this problem, this paper presents a method...
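
The record stops before describing the paper's method, but the usual EPANET workaround it alludes to is to represent leaks as pressure-driven emitters, q = C * p^gamma. A minimal sketch of calibrating such a coefficient from one observation; the function names and all numbers here are mine, not the paper's:

```python
# EPANET-style emitter law: q = C * p**gamma (pressure-driven leak flow).
# gamma ~ 0.5 for a fixed-area orifice; C is the emitter coefficient.
GAMMA = 0.5

def emitter_coefficient(q_obs: float, p_obs: float, gamma: float = GAMMA) -> float:
    """Calibrate C from one observed leak flow q_obs (L/s) at pressure p_obs (m)."""
    return q_obs / p_obs**gamma

def leak_flow(c: float, p: float, gamma: float = GAMMA) -> float:
    """Leak flow (L/s) at pressure p (m) for emitter coefficient c."""
    return c * p**gamma if p > 0 else 0.0

# Example: a district loses 2.4 L/s at an average pressure of 30 m.
c = emitter_coefficient(2.4, 30.0)
for p in (20.0, 30.0, 40.0):
    print(f"p = {p:4.1f} m  ->  leak = {leak_flow(c, p):.2f} L/s")
```

The pressure dependence is the point: halving the pressure cuts the leak flow by roughly 30% under gamma = 0.5, which is why leakage cannot be represented as a fixed nodal demand.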

  17. Key Characteristics of Combined Accident including TLOFW accident for PSA Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Bo Gyung; Kang, Hyun Gook [KAIST, Daejeon (Korea, Republic of); Yoon, Ho Joon [Khalifa University of Science, Technology and Research, Abu Dhabi (United Arab Emirates)

    2015-05-15

    The conventional PSA techniques cannot adequately evaluate all events. Conventional PSA models usually focus on single internal events such as DBAs and on external hazards such as fire and seismic events. However, the Fukushima accident in Japan in 2011 revealed that very rare events need to be considered in the PSA model, both to prevent radioactive release to the environment caused by poor accident treatment due to lack of information and to improve the emergency operation procedures. In particular, the results from PSA can be used for regulatory decision making. Moreover, designers can consider weaknesses in plant safety based on the quantified results and understand accident sequences based on human actions and system availability. This study concerns PSA modeling of combined accidents including the total loss of feedwater (TLOFW) accident. The TLOFW accident is a representative accident involving the failure of cooling through the secondary side. If the amount of heat transfer is not enough due to the failure of the secondary side, heat will be accumulated in the primary side by continuous core decay heat. Transients with loss of feedwater include the total loss of feedwater accident, the loss of condenser vacuum accident, and closure of all MSIVs. When residual heat removal by the secondary side is terminated, safety injection into the RCS with direct primary depressurization would provide alternative heat removal. This operation is called feed and bleed (F and B) operation. Combined accidents including the TLOFW accident are very rare events and are only partially considered in conventional PSA models. Since the necessity of F and B operation is related to plant conditions, PSA modeling for combined accidents including the TLOFW accident is necessary to identify design and operational vulnerabilities. The PSA is significant to assess the risk of NPPs and to identify design and operational vulnerabilities. Even though the combined accident is a very rare event, the consequence of combined

  18. An analytical method for well-formed workflow/Petri net verification of classical soundness

    Directory of Open Access Journals (Sweden)

    Clempner Julio

    2014-12-01

    Full Text Available In this paper we consider workflow nets as dynamical systems governed by ordinary difference equations described by a particular class of Petri nets. Workflow nets are a formal model of business processes. Well-formed business processes correspond to sound workflow nets. Even if it seems necessary to require the soundness of workflow nets, there exist business processes with conditional behavior that will not necessarily satisfy the soundness property. In this sense, we propose an analytical method for showing that a workflow net satisfies the classical soundness property using a Petri net. To present our statement, we use Lyapunov stability theory to tackle the classical soundness verification problem for a class of dynamical systems described by Petri nets. This class of Petri nets allows a dynamical model representation that can be expressed in terms of difference equations. As a result, by applying Lyapunov theory, the classical soundness property for workflow nets is solved by proving that the Petri net representation is stable. We show that a finite and non-blocking workflow net satisfies the soundness property if and only if its corresponding Petri net is stable, i.e., given the incidence matrix A of the corresponding Petri net, there exists a strictly positive m-vector Φ such that AΦ ≤ 0. The key contribution of the paper is the analytical method itself, which satisfies part of the definition of the classical soundness requirements. The method is designed for practical applications, guarantees that anomalies can be detected without domain knowledge, and can be easily implemented into existing commercial systems that do not support the verification of workflows. The validity of the proposed method is successfully demonstrated by application examples.
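
The stated criterion (existence of a strictly positive vector Φ with AΦ ≤ 0 for the incidence matrix A) can be checked as a linear-programming feasibility problem. A minimal sketch with a toy incidence matrix, assuming that scale invariance lets strict positivity be normalized to Φ ≥ 1; the helper name is mine:

```python
import numpy as np
from scipy.optimize import linprog

def is_stable(A: np.ndarray) -> bool:
    """Check the stability condition from the record: does a strictly
    positive vector phi exist with A @ phi <= 0?  Because the condition
    is scale-invariant, strict positivity is enforced as phi >= 1."""
    n_t, m = A.shape
    res = linprog(
        c=np.zeros(m),                 # pure feasibility: any objective works
        A_ub=A, b_ub=np.zeros(n_t),    # A @ phi <= 0
        bounds=[(1, None)] * m,        # phi_i >= 1 (strict positivity)
        method="highs",
    )
    return res.status == 0             # 0 = feasible optimum found

# Toy incidence matrix of a two-transition, three-place net (illustrative).
A = np.array([[-1, 1, 0],
              [0, -1, 1]])
print("stable (sound per the criterion):", is_stable(A))
```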

  19. Analysis of a generalized model for influenza including differential susceptibility due to immunosuppression

    Science.gov (United States)

    Hincapié, Doracelly; Ospina, Juan

    2014-06-01

    Recently, a mathematical model of pandemic influenza was proposed including typical control strategies such as antivirals, vaccination and school closure, and considering explicitly the effects of immunity acquired from the early outbreaks on the later outbreaks of the disease. In that model, the algebraic expressions for the basic reproduction number (without control strategies) and the effective reproduction number (with control strategies) were derived and numerically estimated. A drawback of this model of pandemic influenza is that it ignores the effects of differential susceptibility due to immunosuppression and the effects of the complexity of the actual contact networks between individuals. We have developed a generalized model which includes such effects of heterogeneity. Specifically, we consider the influence of air network connectivity on the spread of pandemic influenza and the influence of immunosuppression when the population is divided into two immune classes. We use an algebraic expression, namely the Tutte polynomial, to characterize the complexity of the contact network. Until now, the influence of air network connectivity on the spread of pandemic influenza has been studied numerically, but no algebraic expressions have been used to summarize the level of network complexity. The generalized model proposed here includes the typical control strategies previously mentioned (antivirals, vaccination and school closure) combined with restrictions on travel. For the generalized model, the corresponding reproduction numbers will be algebraically computed and the effect of the contact network will be established in terms of the Tutte polynomial of the network.

  20. The No-Core Gamow Shell Model: Including the continuum in the NCSM

    CERN Document Server

    Barrett, B R; Michel, N; Płoszajczak, M

    2015-01-01

    We are witnessing an era of intense experimental efforts that will provide information about the properties of nuclei far from the line of stability, regarding resonant and scattering states as well as (weakly) bound states. This talk describes our formalism for including these necessary ingredients into the No-Core Shell Model by using the Gamow Shell Model approach. Applications of this new approach, known as the No-Core Gamow Shell Model, both to benchmark cases as well as to unstable nuclei will be given.

  1. Comparison of detergent-based sample preparation workflows for LTQ-Orbitrap analysis of the Escherichia coli proteome.

    Science.gov (United States)

    Tanca, Alessandro; Biosa, Grazia; Pagnozzi, Daniela; Addis, Maria Filippa; Uzzau, Sergio

    2013-09-01

    This work presents a comparative evaluation of several detergent-based sample preparation workflows for the MS-based analysis of bacterial proteomes, performed using the model organism Escherichia coli. Initially, RapiGest- and SDS-based buffers were compared for their protein extraction efficiency and quality of the MS data generated. As a result, SDS performed best in terms of total protein yields and overall number of MS identifications, mainly due to a higher efficiency in extracting high molecular weight (MW) and membrane proteins, while RapiGest led to an enrichment in periplasmic and fimbrial proteins. Then, SDS extracts underwent five different MS sample preparation workflows, including: detergent removal by spin columns followed by in-solution digestion (SC), protein precipitation followed by in-solution digestion in ammonium bicarbonate or urea buffer, filter-aided sample preparation (FASP), and 1DE separation followed by in-gel digestion. On the whole, about 1000 proteins were identified upon LC-MS/MS analysis of all preparations (>1100 with the SC workflow), with FASP producing more identified peptides and a higher mean sequence coverage. Each protocol exhibited specific behaviors in terms of MW, hydrophobicity, and subcellular localization distribution of the identified proteins; a comparative assessment of the different outputs is presented.

  2. The Watts-Strogatz network model developed by including degree distribution: theory and computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y W [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Zhang, L F [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Huang, J P [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China)

    2007-07-20

    By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of the real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit the real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters in a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property.
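
The paper's degree-distribution extension is not spelled out in the record, but the baseline Watts-Strogatz model it develops is easy to reproduce. A sketch using networkx that shows the small-world signature the paper compares against real networks: the characteristic path length L drops quickly with rewiring probability p while the clustering coefficient C stays high:

```python
import networkx as nx

# Classic Watts-Strogatz small-world model: n nodes, each joined to its k
# nearest neighbours on a ring, with each edge rewired with probability p.
n, k = 1000, 10
for p in (0.0, 0.01, 0.1, 1.0):
    # connected variant retries until the rewired graph is connected,
    # so the average shortest path length is well defined
    G = nx.connected_watts_strogatz_graph(n, k, p, tries=100, seed=1)
    L = nx.average_shortest_path_length(G)
    C = nx.average_clustering(G)
    print(f"p={p:<5} L={L:6.2f}  C={C:.3f}")
```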

  3. Dynamics Analysis of an HIV Infection Model including Infected Cells in an Eclipse Stage

    Directory of Open Access Journals (Sweden)

    Shengyu Zhou

    2013-01-01

    Full Text Available In this paper, an HIV infection model including an eclipse stage of infected cells is considered. Some cells in this stage quickly become productively infected cells, a portion of these cells revert to the uninfected class, and others remain latent in the body. We consider a CTL-response delay in this model and analyze the effect of the time delay on the stability of the equilibria. It is shown that the uninfected equilibrium and the CTL-absent infection equilibrium are globally asymptotically stable for both the ODE and DDE models. We also obtain the global stability of the CTL-present equilibrium for the ODE model. For the DDE model, we have proved that the CTL-present equilibrium is locally asymptotically stable within a range of delays, and we have also studied the existence of Hopf bifurcations at the CTL-present equilibrium. Numerical simulations are carried out to support our main results.
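
The record names the state variables but not the equations. A minimal ODE sketch of an eclipse-stage model of this general shape, where all rate names and values are illustrative assumptions, and the paper's CTL response and delay are omitted:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative eclipse-stage HIV model: T uninfected cells, E eclipse-stage
# cells, I productively infected cells, V free virus.  Parameters assumed.
s, d = 10.0, 0.01        # supply and death of uninfected cells
k = 2.4e-5               # infection rate
r, m = 0.01, 0.4         # reversion to uninfected / progression to productive
delta, p, c = 0.7, 200.0, 3.0   # death of I, virion production, clearance

def rhs(t, y):
    T, E, I, V = y
    return [s - d*T - k*T*V + r*E,     # uninfected: supply, death, reversion
            k*T*V - (r + m)*E,         # eclipse: gains infections, loses both ways
            m*E - delta*I,             # productive: fed from eclipse stage
            p*I - c*V]                 # virus: production and clearance

sol = solve_ivp(rhs, (0, 200), [1000.0, 0.0, 0.0, 1e-3], max_step=0.1)
print(f"viral load at t=200: {sol.y[3][-1]:.3g}")
```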

  4. Modeling of the dynamics of wind to power conversion including high wind speed behavior

    DEFF Research Database (Denmark)

    Litong-Palima, Marisciel; Bjerge, Martin Huus; Cutululis, Nicolaos Antonio

    2016-01-01

    This paper proposes and validates an efficient, generic and computationally simple dynamic model for the conversion of the wind speed at hub height into the electrical power by a wind turbine. This proposed wind turbine model was developed as a first step to simulate wind power time series for power system studies. This paper focuses on describing and validating the single wind turbine model, and is therefore neither describing wind speed modeling nor aggregation of contributions from a whole wind farm or a power system area. The state-of-the-art is to use static power curves for the purpose of power system studies, but the idea of the proposed wind turbine model is to include the main dynamic effects in order to have a better representation of the fluctuations in the output power and of the fast power ramping, especially because of high wind speed shutdowns of the wind turbine. The high wind ...

  5. Visual Workflows for Oil and Gas Exploration

    KAUST Repository

    Hollt, Thomas

    2013-04-14

    The most important resources to fulfill today's energy demands are fossil fuels, such as oil and natural gas. When exploiting hydrocarbon reservoirs, a detailed and credible model of the subsurface structures is crucial for planning the path of the borehole, in order to minimize economic and ecological risks. Before that, the placement as well as the operations of oil rigs need to be planned carefully, as off-shore oil exploration is vulnerable to hazards caused by strong currents. The oil and gas industry therefore relies on accurate ocean forecasting systems for planning their operations. This thesis presents visual workflows for creating subsurface models as well as for planning the placement and operations of off-shore structures. Creating a credible subsurface model poses two major challenges: first, the structures in highly ambiguous seismic data are interpreted in the time domain; second, a velocity model has to be built from this interpretation to match the model to depth measurements from wells. If it is not possible to obtain a match at all positions, the interpretation has to be updated, going back to the first step. This results in a lengthy back and forth between the different steps, or, in many cases, in an unphysical velocity model. We present a novel, integrated approach to interactively creating subsurface models from reflection seismics, which integrates the interpretation of the seismic data, using an interactive horizon extraction technique based on piecewise global optimization, with velocity modeling. Computing and visualizing, on the fly, the effects of changes to the interpretation and velocity model on the depth-converted model enables an integrated feedback loop and a completely new connection between the seismic data in the time domain and well data in the depth domain. For planning the operations of off-shore structures we present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations used in ocean

  6. Logical provenance in data-oriented workflows

    KAUST Repository

    Ikeda, R.

    2013-04-01

    We consider the problem of defining, generating, and tracing provenance in data-oriented workflows, in which input data sets are processed by a graph of transformations to produce output results. We first give a new general definition of provenance for general transformations, introducing the notions of correctness, precision, and minimality. We then determine when properties such as correctness and minimality carry over from the individual transformations' provenance to the workflow provenance. We describe a simple logical-provenance specification language consisting of attribute mappings and filters. We provide an algorithm for provenance tracing in workflows where logical provenance for each transformation is specified using our language. We consider logical provenance in the relational setting, observing that for a class of Select-Project-Join (SPJ) transformations, logical provenance specifications encode minimal provenance. We have built a prototype system supporting the features and algorithms presented in the paper, and we report a few preliminary experimental results. © 2013 IEEE.

  7. A statistical model including age to predict passenger postures in the rear seats of automobiles.

    Science.gov (United States)

    Park, Jangwoon; Ebert, Sheila M; Reed, Matthew P; Hallman, Jason J

    2016-06-01

    Few statistical models of rear seat passenger posture have been published, and none has taken into account the effects of occupant age. This study developed new statistical models for predicting passenger postures in the rear seats of automobiles. Postures of 89 adults with a wide range of age and body size were measured in a laboratory mock-up in seven seat configurations. Posture-prediction models for female and male passengers were separately developed by stepwise regression using age, body dimensions, seat configurations and two-way interactions as potential predictors. Passenger posture was significantly associated with age and the effects of other two-way interaction variables depended on age. A set of posture-prediction models are presented for women and men, and the prediction results are compared with previously published models. This study is the first study of passenger posture to include a large cohort of older passengers and the first to report a significant effect of age for adults. The presented models can be used to position computational and physical human models for vehicle design and assessment. Practitioner Summary: The significant effects of age, body dimensions and seat configuration on rear seat passenger posture were identified. The models can be used to accurately position computational human models or crash test dummies for older passengers in known rear seat configurations.
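
The record says the models were built by stepwise regression over age, body dimensions and seat configuration. As an illustration of one common variant of that procedure (forward selection by AIC), and emphatically not the paper's data or predictors, here is a toy sketch on synthetic data:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Toy forward stepwise regression by AIC; predictors and response synthetic.
rng = np.random.default_rng(0)
n = 89
df = pd.DataFrame({
    "age": rng.uniform(20, 80, n),
    "stature": rng.normal(1700, 90, n),          # mm
    "seat_back_angle": rng.uniform(19, 27, n),   # deg
})
# synthetic response: hip location depends on stature, age and seat angle
df["hip_x"] = (0.1*df.stature + 0.8*df.age + 5*df.seat_back_angle
               + rng.normal(0, 10, n))

selected, remaining = [], list(df.columns[:-1])
current_aic = sm.OLS(df.hip_x, np.ones(n)).fit().aic   # intercept-only model
while remaining:
    # fit one candidate model per remaining predictor, keep the best by AIC
    aics = {v: sm.OLS(df.hip_x, sm.add_constant(df[selected + [v]])).fit().aic
            for v in remaining}
    best = min(aics, key=aics.get)
    if aics[best] >= current_aic:
        break                                    # no candidate improves AIC
    selected.append(best); remaining.remove(best); current_aic = aics[best]
print("selected predictors:", selected)
```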

  8. Innovative Liner Concepts: Experiments and Impedance Modeling of Liners Including the Effect of Bias Flow

    Science.gov (United States)

    Kelly, Jeff; Betts, Juan Fernando; Fuller, Chris

    2000-01-01

    The normal impedance of perforated-plate acoustic liners, including the effect of bias flow, was studied. Two impedance models were developed by modeling the internal flows of perforate orifices as infinite tubes with the inclusion of end corrections to handle finite-length effects. These models assumed incompressible and compressible flows, respectively, between the far field and the perforate orifice. The incompressible model was used to predict impedance results for perforated plates with percent open areas ranging from 5% to 15%. The predicted resistance results showed better agreement with experiments for the higher percent open area samples. The agreement also tended to deteriorate as bias flow was increased. For perforated plates with percent open areas ranging from 1% to 5%, the compressible model was used to predict impedance results. The model predictions were closer to the experimental resistance results for the 2% to 3% open area samples. The predictions tended to deteriorate as bias flow was increased. The reactance results were well predicted by the models for the higher percent open areas, but the predictions deteriorated as the percent open area was lowered (5%) and bias flow was increased. The incompressible model was then fitted to the experimental database using an optimization routine that found the optimal set of multiplicative coefficients for the non-dimensional groups, minimizing the least-squares slope error between predictions and experiments. The result of the fit indicated that terms not associated with bias flow required a greater degree of correction than the terms associated with bias flow. The fitted model improved agreement with experiments by nearly 15% for the low percent open area (5%) samples when compared to the unfitted model. The fitted model and the unfitted model performed equally well for the higher percent open area (10% and 15%) samples.

  9. Fusion rules for the logarithmic $N=1$ superconformal minimal models II: including the Ramond sector

    CERN Document Server

    Canagasabey, Michael

    2015-01-01

    The Virasoro logarithmic minimal models were intensively studied by several groups over the last ten years with much attention paid to the fusion rules and the structures of the indecomposable representations that fusion generates. The analogous study of the fusion rules of the $N=1$ superconformal logarithmic minimal models was initiated in arXiv:1504.03155 as a continuum counterpart to the lattice explorations of arXiv:1312.6763. These works restricted fusion considerations to Neveu-Schwarz representations. Here, this is extended to include the Ramond sector. Technical advances that make this possible include a fermionic Verlinde formula applicable to logarithmic conformal field theories and a twisted version of the fusion algorithm of Nahm and Gaberdiel-Kausch. The results include the first construction and detailed analysis of logarithmic structures in the Ramond sector.

  10. Digital workflow management for quality assessment in pathology.

    Science.gov (United States)

    Kalinski, Thomas; Sel, Saadettin; Hofmann, Harald; Zwönitzer, Ralf; Bernarding, Johannes; Roessner, Albert

    2008-01-01

    Information systems (IS) are well established in the multitude of departments and practices of pathology. Apart from being a collection of doctors' reports, IS can be used to organize and evaluate workflow processes. We report on such digital workflow management using IS at the Department of Pathology, University Hospital Magdeburg, Germany, and present an evaluation of workflow data collected over a whole year. This allows us to measure workflow processes and to distinguish the effects of alterations in the workflow for quality assessment. Moreover, digital workflow management provides the basis for the integration of diagnostic virtual microscopy.

  11. MEMLS3&a: Microwave Emission Model of Layered Snowpacks adapted to include backscattering

    Directory of Open Access Journals (Sweden)

    M. Proksch

    2015-08-01

    Full Text Available The Microwave Emission Model of Layered Snowpacks (MEMLS) was originally developed for microwave emissions of snowpacks in the frequency range 5–100 GHz. It is based on six-flux theory to describe radiative transfer in snow including absorption, multiple volume scattering, radiation trapping due to internal reflection and a combination of coherent and incoherent superposition of reflections between horizontal layer interfaces. Here we introduce MEMLS3&a, an extension of MEMLS, which includes a backscatter model for active microwave remote sensing of snow. The reflectivity is decomposed into diffuse and specular components. Slight undulations of the snow surface are taken into account. The treatment of like- and cross-polarization is accomplished by an empirical splitting parameter q. MEMLS3&a (as well as MEMLS) is set up in a way that snow input parameters can be derived by objective measurement methods, which avoid fitting procedures of the scattering efficiency of snow required by several other models. For the validation of the model we have used a combination of active and passive measurements from the NoSREx (Nordic Snow Radar Experiment) campaign in Sodankylä, Finland. We find a reasonable agreement between the measurements and simulations, subject to uncertainties in hitherto unmeasured input parameters of the backscatter model. The model is written in Matlab and the code is publicly available for download through the following website: http://www.iapmw.unibe.ch/research/projects/snowtools/memls.html.

  12. MEMLS3&a: Microwave Emission Model of Layered Snowpacks adapted to include backscattering

    Directory of Open Access Journals (Sweden)

    M. Proksch

    2015-03-01

    Full Text Available The Microwave Emission Model of Layered Snowpacks (MEMLS) was originally developed for microwave emissions of snowpacks in the frequency range 5–100 GHz. It is based on six-flux theory to describe radiative transfer in snow including absorption, multiple volume scattering, radiation trapping due to internal reflection and a combination of coherent and incoherent superposition of reflections between horizontal layer interfaces. Here we introduce MEMLS3&a, an extension of MEMLS, which includes a backscatter model for active microwave remote sensing of snow. The reflectivity is decomposed into diffuse and specular components. Slight undulations of the snow surface are taken into account. The treatment of like and cross polarization is accomplished by an empirical splitting parameter q. MEMLS3&a (as well as MEMLS) is set up in a way that snow input parameters can be derived by objective measurement methods, which avoid fitting procedures of the scattering efficiency of snow required by several other models. For the validation of the model we have used a combination of active and passive measurements from the NoSREx campaign in Sodankylä, Finland. We find a reasonable agreement between the measurements and simulations, subject to uncertainties in hitherto unmeasured input parameters of the backscatter model. The model is written in MATLAB and the code is publicly available for download through the following website: http://www.iapmw.unibe.ch/research/projects/snowtools/memls.html.

  13. MEMLS3&a: Microwave Emission Model of Layered Snowpacks adapted to include backscattering

    Science.gov (United States)

    Proksch, M.; Mätzler, C.; Wiesmann, A.; Lemmetyinen, J.; Schwank, M.; Löwe, H.; Schneebeli, M.

    2015-08-01

    The Microwave Emission Model of Layered Snowpacks (MEMLS) was originally developed for microwave emissions of snowpacks in the frequency range 5-100 GHz. It is based on six-flux theory to describe radiative transfer in snow including absorption, multiple volume scattering, radiation trapping due to internal reflection and a combination of coherent and incoherent superposition of reflections between horizontal layer interfaces. Here we introduce MEMLS3&a, an extension of MEMLS, which includes a backscatter model for active microwave remote sensing of snow. The reflectivity is decomposed into diffuse and specular components. Slight undulations of the snow surface are taken into account. The treatment of like- and cross-polarization is accomplished by an empirical splitting parameter q. MEMLS3&a (as well as MEMLS) is set up in a way that snow input parameters can be derived by objective measurement methods which avoid fitting procedures of the scattering efficiency of snow, required by several other models. For the validation of the model we have used a combination of active and passive measurements from the NoSREx (Nordic Snow Radar Experiment) campaign in Sodankylä, Finland. We find a reasonable agreement between the measurements and simulations, subject to uncertainties in hitherto unmeasured input parameters of the backscatter model. The model is written in Matlab and the code is publicly available for download through the following website: http://www.iapmw.unibe.ch/research/projects/snowtools/memls.html.

  14. Diagnosing Lee Wave Rotor Onset Using a Linear Model Including a Boundary Layer

    Directory of Open Access Journals (Sweden)

    Miguel A. C. Teixeira

    2017-01-01

    Full Text Available A linear model is used to diagnose the onset of rotors in flow over 2D hills, for atmospheres that are neutrally stratified near the surface and stably stratified aloft, with a sharp temperature inversion in between, where trapped lee waves may propagate. This is achieved by coupling an inviscid two-layer mountain-wave model and a bulk boundary-layer model. The full model shows some ability to diagnose flow stagnation associated with rotors as a function of key input parameters, such as the Froude number and the height of the inversion, in numerical simulations and laboratory experiments carried out by previous authors. While calculations including only the effects of mean flow attenuation and velocity perturbation amplification within the surface layer represent flow stagnation fairly well in the more non-hydrostatic cases, only the full model, taking into account the feedback of the surface layer on the inviscid flow, satisfactorily predicts flow stagnation in the most hydrostatic case, although the corresponding condition is unable to discriminate between rotors and hydraulic jumps. Versions of the model not including this feedback severely underestimate the amplitude of trapped lee waves in that case, where the Fourier transform of the hill has zeros, showing that those waves are not forced directly by the orography.

  15. Steady-state analysis of activated sludge processes with a settler model including sludge compression.

    Science.gov (United States)

    Diehl, S; Zambrano, J; Carlsson, B

    2016-01-01

    A reduced model of a completely stirred-tank bioreactor coupled to a settling tank with recycle is analyzed in its steady states. In the reactor, the concentrations of one dominant particulate biomass and one soluble substrate component are modelled. While the biomass decay rate is assumed to be constant, growth kinetics can depend on both substrate and biomass concentrations, and optionally model substrate inhibition. Compressive and hindered settling phenomena are included using the Bürger-Diehl settler model, which consists of a partial differential equation. Steady-state solutions of this partial differential equation are obtained from an ordinary differential equation, making steady-state analysis of the entire plant difficult. A key result showing that the ordinary differential equation can be replaced with an approximate algebraic equation simplifies model analysis. This algebraic equation takes the location of the sludge-blanket during normal operation into account, allowing for the limiting flux capacity caused by compressive settling to easily be included in the steady-state mass balance equations for the entire plant system. This novel approach grants the possibility of more realistic solutions than other previously published reduced models, comprised of yet simpler settler assumptions. The steady-state concentrations, solids residence time, and the wastage flow ratio are functions of the recycle ratio. Solutions are shown for various growth kinetics; with different values of biomass decay rate, influent volumetric flow, and substrate concentration.

  16. Latest cosmological constraints on Cardassian expansion models including the updated gamma-ray bursts

    Institute of Scientific and Technical Information of China (English)

    Nan Liang; Pu-Xun Wu; Zong-Hong Zhu

    2011-01-01

    We constrain the Cardassian expansion models from the latest observations, including the updated gamma-ray bursts (GRBs), which are calibrated using a cosmology-independent method from the Union2 compilation of type Ia supernovae (SNe Ia). By combining the GRB data with the joint observations from the Union2 SNe Ia set, along with the results from the cosmic microwave background radiation observation of the seven-year Wilkinson Microwave Anisotropy Probe and the baryonic acoustic oscillation observation galaxy sample from the spectroscopic Sloan Digital Sky Survey Data Release, we find significant constraints on the model parameters: Ω_M0 = 0.282 (+0.015, -0.014) and n = 0.03 (+0.05, -0.05) for the original Cardassian model; and n = -0.16 (+0.25, -3.26) and β = -0.76 (+0.34, -0.58) for the modified polytropic Cardassian model, which are consistent with the ΛCDM model within the 1-σ confidence region. From the reconstruction of the deceleration parameter q(z) in Cardassian models, we obtain the transition redshift z_T = 0.73 ± 0.04 for the original Cardassian model and z_T = 0.68 ± 0.04 for the modified polytropic Cardassian model.

  17. Safe distance car-following model including backward-looking and its stability analysis

    Science.gov (United States)

    Yang, Da; Jin, Peter Jing; Pu, Yun; Ran, Bin

    2013-03-01

    The focus of this paper is car-following behavior that includes backward-looking, called bi-directional looking car-following behavior for short. This study is motivated by potential changes in the physical properties of traffic flow caused by the fast-developing intelligent transportation systems (ITS), especially the new connected vehicle technology. Existing studies on this topic have focused on general motors (GM) models and optimal velocity (OV) models. The safe-distance car-following model, Gipps' model, which is more widely used in practice, has not drawn much attention in the bi-directional looking context. This paper explores the properties of the bi-directional looking extension of Gipps' safe-distance model. The stability condition of the proposed model is derived using linear stability theory and is verified using numerical simulations. The impacts of the driver and vehicle characteristics appearing in the proposed model on traffic flow stability are also investigated. It is found that taking the backward-looking effect into account in car-following has three types of effects on traffic flow: stabilizing, destabilizing and producing non-physical phenomena. This conclusion is more sophisticated than the results based on the OV bi-directional looking car-following models. Moreover, drivers who have smaller reaction times or larger additional delays, and who assume that other vehicles have larger maximum decelerations, can stabilize traffic flow.
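
The record describes extending Gipps' safe-distance model with a backward-looking term; the extension itself is not given, but the classical forward-looking Gipps update it builds on is standard. A sketch with illustrative parameter values (the bi-directional variant would add an analogous constraint from the vehicle behind):

```python
import math

# Classical (forward-looking only) Gipps update for the following vehicle;
# the paper's bi-directional extension is not reproduced here.
A, V = 1.7, 30.0      # max acceleration (m/s^2), desired speed (m/s)
B = -3.0              # own most severe braking (m/s^2, negative)
B_HAT = -3.0          # estimate of the leader's braking (m/s^2, negative)
TAU = 0.7             # reaction time (s)
S = 6.5               # effective leader size: length + standstill gap (m)

def gipps_speed(v, x, v_lead, x_lead):
    """Speed after one reaction time TAU: min of the free-acceleration
    branch and the safe-speed branch (Gipps 1981)."""
    v_acc = v + 2.5 * A * TAU * (1 - v / V) * math.sqrt(0.025 + v / V)
    disc = (B * TAU) ** 2 - B * (2 * (x_lead - S - x) - v * TAU
                                 - v_lead ** 2 / B_HAT)
    v_safe = B * TAU + math.sqrt(max(disc, 0.0))
    return max(0.0, min(v_acc, v_safe))

# Follower at rest 50 m behind a leader cruising at 25 m/s:
v, x, v_lead, x_lead = 0.0, 0.0, 25.0, 50.0
for step in range(5):
    v = gipps_speed(v, x, v_lead, x_lead)
    x += v * TAU
    x_lead += v_lead * TAU
    print(f"t={(step + 1) * TAU:4.1f}s  v={v:5.2f} m/s  gap={x_lead - x:6.1f} m")
```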

  18. Quality Metadata Management for Geospatial Scientific Workflows: from Retrieving to Assessing with Online Tools

    Science.gov (United States)

    Leibovici, D. G.; Pourabdollah, A.; Jackson, M.

    2011-12-01

    Experts and decision-makers use or develop models to monitor global and local changes of the environment. Their activities require the combination of data and processing services in a flow of operations and spatial data computations: a geospatial scientific workflow. The seamless ability to generate, re-use and modify a geospatial scientific workflow is an important requirement, but the quality of the outcomes is equally important [1]. Metadata information attached to the data and processes, and particularly their quality, is essential to assess the reliability of the scientific model that a workflow represents [2]. Management tools dealing with qualitative and quantitative metadata measures of the quality associated with a workflow are therefore required by modellers. To ensure interoperability, ISO and OGC standards [3] are to be adopted, allowing one, for example, to define metadata profiles and to retrieve them via web service interfaces. However, these standards need a few extensions when looking at workflows, particularly in the context of geoprocess metadata. We propose to fill this gap (i) through the provision of a metadata profile for the quality of processes, and (ii) through a framework, based on XPDL [4], to manage the quality information. Web Processing Services are used to implement a range of metadata analyses on the workflow in order to evaluate and present quality information at different levels of the workflow. This generates the metadata quality, stored in the XPDL file. The focus is (a) on the visual representations of the quality, summarizing the retrieved quality information either from the standardized metadata profiles of the components or from non-standard quality information, e.g. Web 2.0 information, and (b) on the estimated qualities of the outputs derived from meta-propagation of uncertainties (a principle that we have introduced [5]). An a priori validation of the future decision-making supported by the

  19. A Lumped Thermal Model Including Thermal Coupling and Thermal Boundary Conditions for High Power IGBT Modules

    DEFF Research Database (Denmark)

    Bahman, Amir Sajjad; Ma, Ke; Blaabjerg, Frede

    2017-01-01

    Detailed thermal dynamics of high power IGBT modules are important information for the reliability analysis and thermal design of power electronic systems. However, the existing thermal models have their limits to correctly predict these complicated thermal behaviors in the IGBTs: the typically used thermal model based on one-dimensional RC lumps has limits in providing temperature distributions inside the device; moreover, some variable factors in real-field applications, like the cooling and heating conditions of the converter, cannot be adapted. On the other hand, the more advanced three... ... thermal distribution under long-term studies. Meanwhile, the boundary conditions for the thermal analysis are modeled and included, which can be adapted to different real-field applications of power electronic converters. Finally, the accuracy of the proposed thermal model is verified by FEM simulations...

  20. A Lumped Thermal Model Including Thermal Coupling and Thermal Boundary Conditions for High Power IGBT Modules

    DEFF Research Database (Denmark)

    Bahman, Amir Sajjad; Ma, Ke; Blaabjerg, Frede

    2017-01-01

    Detailed thermal dynamics of high power IGBT modules are important information for the reliability analysis and thermal design of power electronic systems. However, the existing thermal models have their limits to correctly predict these complicated thermal behaviors in the IGBTs: the typically used thermal model based on one-dimensional RC lumps has limits in providing temperature distributions inside the device; moreover, some variable factors in real-field applications, like the cooling and heating conditions of the converter, cannot be adapted. On the other hand, the more advanced three... ... thermal distribution under long-term studies. Meanwhile, the boundary conditions for the thermal analysis are modeled and included, which can be adapted to different real-field applications of power electronic converters. Finally, the accuracy of the proposed thermal model is verified by FEM simulations...

  1. An iterative expanding and shrinking process for processor allocation in mixed-parallel workflow scheduling.

    Science.gov (United States)

    Huang, Kuo-Chan; Wu, Wei-Ya; Wang, Feng-Jian; Liu, Hsiao-Ching; Hung, Chun-Hao

    2016-01-01

    Parallel computation has been widely applied in a variety of large-scale scientific and engineering applications. Many studies indicate that exploiting both task and data parallelism, i.e. mixed-parallel workflows, to solve large computational problems can achieve better efficiency than either pure task parallelism or pure data parallelism. Scheduling traditional workflows of pure task parallelism on parallel systems has long been known to be an NP-complete problem. Mixed-parallel workflow scheduling has to deal with an additional challenging issue: processor allocation. In this paper, we explore the processor allocation issue in scheduling mixed-parallel workflows of moldable tasks (M-tasks) and propose an Iterative Allocation Expanding and Shrinking (IAES) approach. Compared to previous approaches, our IAES has two distinguishing features. The first is allocating more processors to the tasks on allocated critical paths to effectively reduce the makespan of workflow execution. The second is allowing the processor allocation of an M-task to shrink during the iterative procedure, resulting in a more flexible and effective process for finding better allocations. The proposed IAES approach has been evaluated with a series of simulation experiments and compared to several well-known previous methods, including CPR, CPA, MCPA, and MCPA2. The experimental results indicate that our IAES approach outperforms those previous methods significantly in most situations, especially when nodes of the same layer in a workflow might have unequal workloads.
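    The abstract gives no pseudocode, so the following is only a toy sketch of the expand/shrink idea under strong simplifying assumptions: the workflow is reduced to independent layers, task runtimes follow an invented Amdahl-style speedup, and the shrink test is greatly simplified.

      def runtime(work, procs):
          # Invented moldable-task runtime with a 10% serial fraction (Amdahl-style)
          return work * (0.1 + 0.9 / procs)

      def iaes_sketch(layers, total_procs, rounds=50):
          # layers: list of lists of task workloads; returns processor counts per task
          alloc = [[1] * len(layer) for layer in layers]
          for _ in range(rounds):
              for li, layer in enumerate(layers):
                  times = [runtime(w, p) for w, p in zip(layer, alloc[li])]
                  bottleneck = times.index(max(times))
                  if sum(alloc[li]) < total_procs:
                      alloc[li][bottleneck] += 1        # expand the layer-critical task
                  for ti in range(len(layer)):          # shrink tasks that can give procs back
                      if ti != bottleneck and alloc[li][ti] > 1:
                          if runtime(layer[ti], alloc[li][ti] - 1) <= max(times):
                              alloc[li][ti] -= 1
          return alloc

      print(iaes_sketch(layers=[[100, 40], [60, 60, 20]], total_procs=8))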

  2. DiscopFlow: A new Tool for Discovering Organizational Structures and Interaction Protocols in WorkFlow

    CERN Document Server

    Abdelkafi, Mahdi; Gargouri, Faiez

    2012-01-01

    This work deals with Workflow Mining (WM), a very active and promising research area. First, in this paper we give a critical and comparative study of three representative WM systems of this area: the ProM, InWolve and WorkflowMiner systems. The comparison is made according to quality criteria that we have defined, such as the capacity to filter and convert a workflow log, the capacity to discover workflow perspectives, and the capacity to support multi-analysis of processes. The major drawback of these systems is that they cannot deal with the organizational perspective discovery issue. By organizational perspective, we mean the organizational structures (federation, coalition, market or hierarchy) and interaction protocols (contract net, auction or vote). This paper defends the idea that the organizational dimension in Multi-Agent Systems is an appropriate approach to support the discovery of this organizational perspective. Second, the paper proposes a Workflow log meta-model which extends the classical one ...

  3. Transmission line model for strained quantum well lasers including carrier transport and carrier heating effects.

    Science.gov (United States)

    Xia, Mingjun; Ghafouri-Shiraz, H

    2016-03-01

    This paper reports a new model for strained quantum well lasers, based on the quantum well transmission line modeling method, in which the effects of both carrier transport and carrier heating have been included. We have applied this new model to study the effect of carrier transport on the output waveform of a strained quantum well laser in both the time and frequency domains. It has been found that carrier transport increases the turn-on delay time, the turn-off delay time, and the damping of the quantum well laser transient response. Analysis in the frequency domain also indicates that carrier transport causes the steady-state output spectrum of the quantum well laser to exhibit a redshift with a narrower bandwidth and lower magnitude. The simulation results of turn-on transients obtained by the proposed model are compared with those obtained by the rate equation laser model. The new model has also been used to study the effects of pump current spikes on the properties of the laser output waveforms, and it was found that the presence of current spikes causes (i) a wavelength blueshift and (ii) a larger bandwidth, and (iii) reduces the magnitude and decreases the side-lobe suppression ratio of the laser output spectrum. Analysis in both the frequency and time domains confirms that the new proposed model can accurately predict the temporal and spectral behaviors of strained quantum well lasers.

  4. A numerical model including PID control of a multizone crystal growth furnace

    Science.gov (United States)

    Panzarella, Charles H.; Kassemi, Mohammad

    This paper presents a 2D axisymmetric combined conduction and radiation model of a multizone crystal growth furnace. The model is based on a programmable multizone furnace (PMZF) designed and built at NASA Lewis Research Center for growing high quality semiconductor crystals. A novel feature of this model is a control algorithm which automatically adjusts the power in any number of independently controlled heaters to establish the desired crystal temperatures in the furnace model. The control algorithm eliminates the need for numerous trial and error runs previously required to obtain the same results. The finite element code, FIDAP, used to develop the furnace model, was modified to directly incorporate the control algorithm. This algorithm, which presently uses PID control, and the associated heat transfer model are briefly discussed. Together, they have been used to predict the heater power distributions for a variety of furnace configurations and desired temperature profiles. Examples are included to demonstrate the effectiveness of the PID controlled model in establishing isothermal, Bridgman, and other complicated temperature profiles in the sample. Finally, an example is given to show how the algorithm can be used to change the desired profile with time according to a prescribed temperature-time evolution.
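    The PID part of the algorithm is standard even though its coupling to the FIDAP heat transfer model is not reproduced here. A minimal discrete PID update of the kind each independently controlled heater would receive might look as follows; all gains and temperatures are invented.

      class PID:
          # Textbook discrete PID: u = Kp*e + Ki*integral(e dt) + Kd*de/dt
          def __init__(self, kp, ki, kd):
              self.kp, self.ki, self.kd = kp, ki, kd
              self.integral = 0.0
              self.prev_err = None

          def update(self, setpoint, measured, dt):
              err = setpoint - measured
              self.integral += err * dt
              deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
              self.prev_err = err
              return self.kp * err + self.ki * self.integral + self.kd * deriv

      heater = PID(kp=50.0, ki=5.0, kd=1.0)                            # invented gains
      print(heater.update(setpoint=1200.0, measured=1150.0, dt=0.1))   # heater power correction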

  5. A continuum model of solvation energies including electrostatic, dispersion, and cavity contributions.

    Science.gov (United States)

    Duignan, Timothy T; Parsons, Drew F; Ninham, Barry W

    2013-08-15

    Physically accurate continuum solvent models that can calculate solvation energies are crucial to explain and predict the behavior of solute particles in water. Here, we present such a model applied to small spherical ions and neutral atoms. It improves upon a basic Born electrostatic model by including a standard cavity energy and adding a dispersion component, consistent with the Born electrostatic energy and using the same cavity size parameter. We show that the well-known, puzzling differences between the solvation energies of ions of the same size are attributable to the neglected dispersion contribution. This depends on dynamic polarizability as well as size. Generally, a large cancellation exists between the cavity and dispersion contributions. This explains the surprising success of the Born model. The model accurately reproduces the solvation energies of the alkali halide ions, as well as the silver(I) and copper(I) ions, with an error of 12 kJ mol(-1) (±3%). The solvation energy of the noble gases is also reproduced with an error of 2.6 kJ mol(-1) (±30%). No arbitrary fitting parameters are needed to achieve this. This model significantly improves our understanding of ionic solvation and forms a solid basis for the investigation of other ion-specific effects using a continuum solvent model.
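    For orientation, the Born electrostatic term that the paper augments with cavity and dispersion contributions can be evaluated directly; the ionic radius below is an illustrative number, not a value taken from the paper.

      import math

      def born_energy_kj_per_mol(charge_e, radius_nm, eps_r=78.4):
          # Born solvation energy: dG = -(q^2 / (8*pi*eps0*R)) * (1 - 1/eps_r)
          e = 1.602176634e-19      # elementary charge, C
          eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
          NA = 6.02214076e23       # Avogadro constant, 1/mol
          q = charge_e * e
          r = radius_nm * 1e-9
          dg_joule = -(q ** 2 / (8 * math.pi * eps0 * r)) * (1.0 - 1.0 / eps_r)
          return dg_joule * NA / 1000.0

      print(born_energy_kj_per_mol(charge_e=1, radius_nm=0.14))  # illustrative cation radius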

  6. Model for resistance evolution in shape memory alloys including R-phase

    Science.gov (United States)

    Brammajyosula, Ravindra; Buravalla, Vidyashankar; Khandelwal, Ashish

    2011-03-01

    The electrical resistance behavior of a shape memory alloy (SMA) wire can be used for sensing the state of an SMA device. Hence, this study investigates the resistance evolution in SMAs. A lumped parameter model with cosine kinetics to capture the resistance variation during the phase transformation is developed. Several SMA materials show the presence of trigonal or rhombohedral (R) phase as an intermediate phase, apart from the commonly recognized austenite and martensite phases. Most of the SMA models ignore the R-phase effect in their prediction of thermomechanical response. This may be acceptable since the changes in thermomechanical response associated with the R-phase are relatively small. However, the resistivity-related effects are pronounced in the presence of the R-phase, and its appearance introduces non-monotonicity in the resistivity evolution. This leads to additional complexities in the use of the resistance signal for sensing and control. Hence, a lumped model is developed here for resistance evolution including the R-phase effects. A phase-diagram-based model is proposed for predicting the electro-thermomechanical response. Both the steady state hysteretic response and the transient response are modeled. The model predictions are compared with the available test data. Numerical studies have shown that the model is able to capture all the essential features of the resistance evolution in SMAs in the presence of the R-phase.

  7. A New Circuit Model for Spin-Torque Oscillator Including Perpendicular Torque of Magnetic Tunnel Junction

    Directory of Open Access Journals (Sweden)

    Hyein Lim

    2013-01-01

    Spin-torque oscillator (STO) is a promising new technology for future RF oscillators, based on the spin-transfer torque (STT) effect in magnetic multilayered nanostructures. It is expected to provide larger tunability, smaller size, lower power consumption, and a higher level of integration than semiconductor-based oscillators. In our previous work, a circuit-level model of the giant magnetoresistance (GMR) STO was proposed. In this paper, we present a physics-based circuit-level model of the magnetic tunnel junction (MTJ)-based STO. The MTJ-STO model includes the effect of perpendicular torque, which had been ignored in the GMR-STO model. The variations of three major characteristics, generation frequency, mean oscillation power, and generation linewidth of an MTJ-STO with respect to the amount of perpendicular torque, are investigated, and the results are applied to our model. The operation of the model was verified by HSPICE simulation, and the results show an excellent agreement with the experimental data. The results also prove that a full circuit-level simulation with MTJ-STO devices can be made with our proposed model.

  8. Does including physiology improve species distribution model predictions of responses to recent climate change?

    Science.gov (United States)

    Buckley, Lauren B; Waaser, Stephanie A; MacLean, Heidi J; Fox, Richard

    2011-12-01

    Thermal constraints on development are often invoked to predict insect distributions. These constraints tend to be characterized in species distribution models (SDMs) by calculating development time based on a constant lower development temperature (LDT). Here, we assessed whether species-specific estimates of LDT based on laboratory experiments can improve the ability of SDMs to predict the distribution shifts of six U.K. butterflies in response to recent climate warming. We find that species-specific and constant (5 degrees C) LDT degree-day models perform similarly at predicting distributions during the period of 1970-1982. However, when the models for the 1970-1982 period are projected to predict distributions in 1995-1999 and 2000-2004, species-specific LDT degree-day models modestly outperform constant LDT degree-day models. Our results suggest that, while including species-specific physiology in correlative models may enhance predictions of species' distribution responses to climate change, more detailed models may be needed to adequately account for interspecific physiological differences.
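    The degree-day bookkeeping behind both model variants is simple to reproduce; the daily temperatures and the species-specific threshold below are made up for illustration only.

      def degree_days(daily_mean_temps_c, ldt_c):
          # Accumulate degree-days above a lower development temperature (LDT)
          return sum(max(0.0, t - ldt_c) for t in daily_mean_temps_c)

      temps = [4.0, 7.5, 11.0, 14.2, 9.8]        # invented daily means, degrees C
      print(degree_days(temps, ldt_c=5.0))       # constant-LDT variant
      print(degree_days(temps, ldt_c=7.9))       # hypothetical species-specific LDT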

  9. Modeling of single char combustion, including CO oxidation in its boundary layer

    Energy Technology Data Exchange (ETDEWEB)

    Lee, C.H.; Longwell, J.P.; Sarofim, A.F.

    1994-10-25

    The combustion of a char particle can be divided into a transient phase, in which its temperature increases as it is heated by oxidation and by heat transfer from the surrounding gas; an approximately constant-temperature stage, in which gas phase reaction is important and most of the carbon is consumed; and an extinction stage caused by carbon burnout. In this work, separate models were developed for the transient heating, where gas phase reactions were unimportant, and for the steady temperature stage, where gas phase reactions were treated in detail. The transient char combustion model incorporates intrinsic char surface production of CO and CO2, internal pore diffusion, and external mass and heat transfer. The model provides useful information for particle ignition, burning temperature profile, combustion time, and carbon consumption rate. A gas phase reaction model incorporating the full set of 28 elementary C/H/O reactions was developed. This model calculated the gas phase CO oxidation reaction in the boundary layer at particle temperatures of 1250 K and 2500 K by using the carbon consumption rate and the burning temperature at the pseudo-steady state calculated from the temperature profile model, but the transient heating was not included. This gas phase model can predict the gas species and temperature distributions in the boundary layer, the CO2/CO ratio, and the location of CO oxidation. A mechanistic heat and mass transfer model was added to the temperature profile model to predict combustion behavior in a fluidized bed. These models were applied to data from the fluidized combustion of Newlands coal char particles. 52 refs., 60 figs.

  10. Including source uncertainty and prior information in the analysis of stable isotope mixing models.

    Science.gov (United States)

    Ward, Eric J; Semmens, Brice X; Schindler, Daniel E

    2010-06-15

    Stable isotope mixing models offer a statistical framework for estimating the contribution of multiple sources (such as prey) to a mixture distribution. Recent advances in these models have estimated the source proportions using Bayesian methods, but have not explicitly accounted for uncertainty in the mean and variance of sources. We demonstrate that treating these quantities as unknown parameters can reduce bias in the estimated source contributions, although model complexity is increased (thereby increasing the variance of estimates). The advantages of this fully Bayesian approach are particularly apparent when the source geometry is poor or sample sizes are small. A second benefit of treating source quantities as parameters is that prior source information can be included. We present findings from 9 lake food-webs, where the consumer of interest (fish) has a diet composed of 5 sources: aquatic insects, snails, zooplankton, amphipods, and terrestrial insects. We compared the traditional Bayesian stable isotope mixing model with fixed source parameters to our fully Bayesian model, with and without an informative prior. The informative prior has much less impact than the choice of model: the traditional mixing model with fixed source parameters estimates the diet to be dominated by aquatic insects, while the fully Bayesian model estimates the diet to be more balanced but with greater importance of zooplankton. The findings from this example demonstrate that there can be stark differences in inference between the two model approaches, particularly when the source geometry of the mixing model is poor. These analyses also emphasize the importance of investing substantial effort toward characterizing the variation in the isotopic characteristics of source pools to appropriately quantify uncertainties in their contributions to consumers in food webs.

  11. Kinetic modeling of rhamnolipid production by Pseudomonas aeruginosa PAO1 including cell density-dependent regulation.

    Science.gov (United States)

    Henkel, Marius; Schmidberger, Anke; Vogelbacher, Markus; Kühnert, Christian; Beuker, Janina; Bernard, Thomas; Schwartz, Thomas; Syldatk, Christoph; Hausmann, Rudolf

    2014-08-01

    The production of rhamnolipid biosurfactants by Pseudomonas aeruginosa is under complex control of a quorum sensing-dependent regulatory network. Due to a lack of understanding of the kinetics applicable to the process and relevant interrelations of variables, current processes for rhamnolipid production are based on heuristic approaches. To systematically establish a knowledge-based process for rhamnolipid production, a deeper understanding of the time-course and coupling of process variables is required. By combining reaction kinetics, stoichiometry, and experimental data, a process model for rhamnolipid production with P. aeruginosa PAO1 on sunflower oil was developed as a system of coupled ordinary differential equations (ODEs). In addition, cell density-based quorum sensing dynamics were included in the model. The model comprises a total of 36 parameters, 14 of which are yield coefficients and 7 of which are substrate affinity and inhibition constants. Of all 36 parameters, 30 were derived from dedicated experimental results, literature, and databases and 6 of them were used as fitting parameters. The model is able to describe data on biomass growth, substrates, and products obtained from a reference batch process and other validation scenarios. The model presented describes the time-course and interrelation of biomass, relevant substrates, and products on a process level while including a kinetic representation of cell density-dependent regulatory mechanisms.
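    The 36-parameter model itself is not reproduced here, but its skeleton, coupled ODEs for biomass, substrate and product with a cell-density switch standing in for the quorum-sensing terms, can be sketched; every rate constant below is invented.

      from scipy.integrate import solve_ivp

      def rhl_sketch(t, y, mu_max=0.3, ks=2.0, yxs=0.5, q_rl=0.05, x_qs=3.0):
          # Toy Monod growth; rhamnolipid production switches on above biomass x_qs
          x, s, p = y                          # biomass, substrate, product
          mu = mu_max * s / (ks + s)
          qs_on = 1.0 if x >= x_qs else 0.0    # crude stand-in for quorum sensing
          return [mu * x, -mu * x / yxs, q_rl * x * qs_on]

      sol = solve_ivp(rhl_sketch, (0.0, 72.0), [0.1, 40.0, 0.0], max_step=0.5)
      print(sol.y[:, -1])                      # final biomass, substrate, product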

  12. Codigestion of solid wastes: a review of its uses and perspectives including modeling.

    Science.gov (United States)

    Mata-Alvarez, Joan; Dosta, Joan; Macé, Sandra; Astals, Sergi

    2011-06-01

    The last two years have witnessed a dramatic increase in the number of papers published on the subject of codigestion, highlighting the relevance of this topic within anaerobic digestion research. Consequently, it seems appropriate to undertake a review of codigestion practices starting from the late 1970s, when the first papers related to this concept were published, and continuing to the present day, demonstrating the exponential growth in the interest shown in this approach in recent years. Following a general analysis of the situation, state-of-the-art codigestion is described, focusing on the two most important areas as regards publication: codigestion involving sewage sludge and the organic fraction of municipal solid waste (including a review of the secondary advantages for wastewater treatment plants related to biological nutrient removal), and codigestion in the agricultural sector, that is, including agricultural farm wastes and energy crops. Within these areas, a large number of oversized digesters exist which can be used to codigest other substrates, resulting in economic and environmental advantages. Although the situation may be changing, there is still a need for good examples on an industrial scale, particularly with regard to wastewater treatment plants, in order to extend this beneficial practice. In the last section, a detailed analysis of papers addressing the important aspect of modeling is included. This analysis covers the first codigestion models to be developed as well as recent applications of the standardized anaerobic digestion model ADM1 to codigestion. (This review includes studies ranging from laboratory to industrial scale.)

  13. Enhanced UWB Radio Channel Model for Short-Range Communication Scenarios Including User Dynamics

    DEFF Research Database (Denmark)

    Kovacs, Istvan Zsolt; Nguyen, Tuan Hung; Eggers, Patrick Claus F.

    2005-01-01

    In this paper we propose a SISO UWB radio channel model for short-range radio link scenarios between a fixed device and a dynamic user hand-held device. The channel model is derived based on novel experimental UWB radio propagation investigations carried out in typical indoor PAN scenarios ... including realistic device and user terminal antenna configurations. The radio channel measurements have been performed in the lower UWB frequency band of 3 GHz to 5 GHz with a 2x4 MIMO antenna configuration. Several environments, user scenarios and two types of user terminals have been used. The developed...

  14. Fuzzy Control of Yaw and Roll Angles of a Simulated Helicopter Model Includes Articulated Manipulators

    Directory of Open Access Journals (Sweden)

    Hossein Sadegh Lafmejani

    2015-09-01

    Fuzzy logic control (FLC) is a heuristic method using If-Then rules which resembles human intelligence, and it is a good method for designing non-linear control systems. In this paper, an arbitrary helicopter model including articulated manipulators has been simulated with the Matlab SimMechanics toolbox. Due to the difficulty of modeling this complex system, a fuzzy controller with simple fuzzy rules has been designed for its yaw and roll angles in order to stabilize the helicopter while it is in the presence of disturbances or its manipulators are moving for a task. Results reveal that a simple FLC can appropriately control this system.

  15. Including Finite Surface Span Effects in Empirical Jet-Surface Interaction Noise Models

    Science.gov (United States)

    Brown, Clifford A.

    2016-01-01

    The effect of finite span on the jet-surface interaction noise source and on the jet mixing noise shielding and reflection effects is considered using recently acquired experimental data. First, the experimental setup and resulting data are presented, with particular attention to the role of surface span on far-field noise. These effects are then included in existing empirical models that have previously assumed that all surfaces are semi-infinite. This extended abstract briefly describes the experimental setup and data, leaving the empirical modeling aspects for the final paper.

  16. Analytical model for investigation of interior noise characteristics in aircraft with multiple propellers including synchrophasing

    Science.gov (United States)

    Fuller, C. R.

    1986-01-01

    A simplified analytical model of transmission of noise into the interior of propeller-driven aircraft has been developed. The analysis includes directivity and relative phase effects of the propeller noise sources, and leads to a closed form solution for the coupled motion between the interior and exterior fields via the shell (fuselage) vibrational response. Various situations commonly encountered in considering sound transmission into aircraft fuselages are investigated analytically and the results obtained are compared to measurements in real aircraft. In general the model has proved successful in identifying basic mechanisms behind noise transmission phenomena.

  17. Introducing students to digital geological mapping: A workflow based on cheap hardware and free software

    Science.gov (United States)

    Vrabec, Marko; Dolžan, Erazem

    2016-04-01

    The undergraduate field course in Geological Mapping at the University of Ljubljana involves 20-40 students per year, which precludes the use of specialized rugged digital field equipment, as the costs would be far beyond the capabilities of the Department. A different mapping area is selected each year with the aim of providing the typical conditions that a professional geologist might encounter when doing fieldwork in Slovenia, which include rugged relief, dense tree cover, and moderately well to poorly exposed bedrock due to vegetation and urbanization. It is therefore mandatory that the digital tools and workflows are combined with classical methods of fieldwork, since, for example, full-time precise GNSS positioning is not viable under such circumstances. Additionally, due to the prevailing combination of complex geological structure with generally poor exposure, students cannot be expected to produce line (vector) maps of geological contacts on the go, so there is no need for such functionality in the hardware and software that we use in the field. Our workflow therefore still relies on paper base maps, but is strongly complemented with digital tools to provide robust positioning, track recording, and acquisition of various point-based data. The primary field hardware consists of students' Android-based smartphones and, optionally, tablets. For our purposes, the built-in GNSS chips provide adequate positioning precision most of the time, particularly if they are GLONASS-capable. We use Oruxmaps, a powerful free offline map viewer for the Android platform, which facilitates the use of custom-made geopositioned maps. For digital base maps, which we prepare in the free QGIS software on Windows, we use scanned topographic maps provided by the National Geodetic Authority, but also other maps such as aerial imagery, processed Digital Elevation Models, scans of existing geological maps, etc. Point data, like important outcrop locations or structural measurements, are entered into Oruxmaps as

  18. A workflow for the 3D visualization of meteorological data

    Science.gov (United States)

    Helbig, Carolin; Rink, Karsten

    2014-05-01

    In the future, climate change will strongly influence our environment and living conditions. To predict possible changes, climate models that include basic and process conditions have been developed, and big data sets are produced as a result of simulations. The combination of various variables of climate models with spatial data from different sources helps to identify correlations and to study key processes. For our case study we use results of the weather research and forecasting (WRF) model for two regions at different scales that include various landscapes in Northern Central Europe and Baden-Württemberg. We visualize these simulation results in combination with observation data and geographic data, such as river networks, to evaluate processes and analyze whether the model represents the atmospheric system sufficiently. For this purpose, a continuous workflow that leads from the integration of heterogeneous raw data to visualization using open source software (e.g. OpenGeoSys Data Explorer, ParaView) is developed. These visualizations can be displayed on a desktop computer or in an interactive virtual reality environment. We established a concept that includes recommended 3D representations and a color scheme for the variables of the data, based on existing guidelines and established traditions in the specific domain. To examine changes over time in observation and simulation data, we added the temporal dimension to the visualization. In a first step of the analysis, the visualizations are used to get an overview of the data and detect areas of interest, such as regions of convection or wind turbulence. Then, subsets of the data sets are extracted and the included variables can be examined in detail. An evaluation by experts from the domains of visualization and atmospheric sciences establishes whether the visualizations are self-explanatory and clearly arranged. These easy-to-understand visualizations of complex data sets are the basis for scientific communication. In addition, they have

  19. An air/sea flux model including the effects of capillary waves

    Science.gov (United States)

    Bourassa, Mark A.

    1993-01-01

    An improved model of the air/sea interface is developed. The improvements consist of including the effect of capillary (surface tension) waves on the tropical surface fluxes and of considering the sea state, both of which increase the magnitude of the tropical surface fluxes. Changes in surface stress are most significant in the low wind-speed regions, which include the areas where westerly bursts occur. It is shown that the changes in fluxes, from regular wind conditions to those of a westerly burst or El Niño, can double when the effects of capillary waves are considered. This implies a much stronger coupling between the ocean and the atmosphere than is predicted by other boundary layer models.

  20. A complete model of CH+ rotational excitation including radiative and chemical pumping processes

    CERN Document Server

    Godard, Benjamin

    2012-01-01

    Aims. Excitation of far-infrared and submillimetric molecular lines may originate from nonreactive collisions, chemical formation, or far infrared, near-infrared, and optical fluorescences. As a template, we investigate the impact of each of these processes on the excitation of the methylidyne cation CH+ and on the intensities of its rotational transitions recently detected in emission in dense photodissociation regions (PDRs) and in planetary nebulae. Methods. We have developed a nonlocal thermodynamic equilibrium (non-LTE) excitation model that includes the entire energy structure of CH+, i.e. taking into account the pumping of its vibrational and bound and unbound electronic states by near-infrared and optical photons. The model includes the theoretical cross-sections of nonreactive collisions with H, H2, He, and e-, and a Boltzmann distribution is used to describe the probability of populating the excited levels of CH+ during its chemical formation by hydrogenation of C+. To confirm our results we also pe...

  1. Including Flocculation in a Numerical Sediment Transport Model for a Partially-Mixed Estuary

    Science.gov (United States)

    Tarpley, D.; Harris, C. K.; Friedrichs, C. T.

    2016-12-01

    Particle settling velocity impacts the transport of suspended sediment to the first order, but fine-grained material like mud tends to form loosely bound aggregates (flocs) whose settling velocity can vary widely. Properties of flocculated sediment, such as settling velocity and particle density, are difficult to predict because they change in response to several factors, including salinity, suspended sediment concentration, turbulent mixing, and organic content. Knowledge of the mechanisms governing flocculation of cohesive sediment is rapidly expanding, especially in response to recent technical advances. As the understanding of particle dynamics progresses, numerical models describing flocculation and break-up are being developed with varying degrees of complexity. While complex models capture the dynamics of the system, their computational costs may prohibit their incorporation into larger model domains. It is important to determine whether the computational costs of intricate floc models are justifiable compared to simpler formulations. For this study, we implement an idealized two-dimensional model designed to represent a longitudinal section of a partially mixed estuary that neglects across-channel variation but exhibits salinity-driven estuarine circulation. The idealized domain is designed to mimic the primary features of the York River, VA. Suspended load, erosion and deposition are calculated within the sediment transport routines of the COAWST modeling system. We compare different methods for prescribing the settling velocity of fine-grained material. The simplest, standard model neglects flocculation dynamics, while the complex treatment is a size-class-based flocculation model (FLOCMOD). Differences in tidal and daily averages of suspended load, bulk settling velocity and bed deposition are compared between the standard and FLOCMOD runs, to examine the relative impact of flocculation on sediment transport patterns. We expect FLOCMOD to have greater variability and
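    A reference point for why floc properties matter so much: under Stokes' law, settling velocity scales with the square of particle diameter and with excess density, and flocs trade size against density. The values below are illustrative only.

      def stokes_settling_velocity(d_m, rho_p, rho_w=1000.0, mu=1.0e-3, g=9.81):
          # Stokes settling velocity: ws = (rho_p - rho_w) * g * d^2 / (18 * mu)
          return (rho_p - rho_w) * g * d_m ** 2 / (18.0 * mu)

      # A loose 100-micron floc barely denser than water vs. a compact 10-micron grain
      print(stokes_settling_velocity(100e-6, rho_p=1030.0))  # m/s
      print(stokes_settling_velocity(10e-6, rho_p=2650.0))   # m/s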

  2. Multistate Statistical Modeling: A Tool to Build a Lung Cancer Microsimulation Model That Includes Parameter Uncertainty and Patient Heterogeneity.

    Science.gov (United States)

    Bongers, Mathilda L; de Ruysscher, Dirk; Oberije, Cary; Lambin, Philippe; Uyl-de Groot, Carin A; Coupé, V M H

    2016-01-01

    With the shift toward individualized treatment, cost-effectiveness models need to incorporate patient and tumor characteristics that may be relevant to treatment planning. In this study, we used multistate statistical modeling to inform a microsimulation model for cost-effectiveness analysis of individualized radiotherapy in lung cancer. The model tracks clinical events over time and takes patient and tumor features into account. Four clinical states were included in the model: alive without progression, local recurrence, metastasis, and death. Individual patients were simulated by repeatedly sampling a patient profile, consisting of patient and tumor characteristics. The transitioning of patients between the health states is governed by personalized time-dependent hazard rates, which were obtained from multistate statistical modeling (MSSM). The model simulations for both the individualized and conventional radiotherapy strategies demonstrated internal and external validity. Therefore, MSSM is a useful technique for obtaining the correlated individualized transition rates that are required for the quantification of a microsimulation model. Moreover, we have used the hazard ratios, their 95% confidence intervals, and their covariance to quantify the parameter uncertainty of the model in a correlated way. The obtained model will be used to evaluate the cost-effectiveness of individualized radiotherapy treatment planning, including the uncertainty of input parameters. We discuss the model-building process and the strengths and weaknesses of using MSSM in a microsimulation model for individualized radiotherapy in lung cancer.
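    One way to see how personalized hazard rates drive such a microsimulation: with constant competing hazards, the next event and its timing can be sampled as below. The state names match the abstract, but the rates are invented and the constant-hazard assumption is a simplification of the time-dependent rates the authors use.

      import numpy as np

      rng = np.random.default_rng(0)

      def next_event(rates_per_month):
          # Competing exponential risks: sample when the next event occurs and which one
          total = sum(rates_per_month.values())
          t = rng.exponential(1.0 / total)
          probs = [r / total for r in rates_per_month.values()]
          state = rng.choice(list(rates_per_month), p=probs)
          return t, state

      print(next_event({'local_recurrence': 0.02, 'metastasis': 0.05, 'death': 0.01}))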

  3. Model for safety reports including descriptive examples; Mall foer saekerhetsrapporter med beskrivande exempel

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-01

    Several safety reports will be produced in the process of planning and constructing the system for disposal of high-level radioactive waste in Sweden. The present report gives a model, with detailed examples, of how these reports should be organized and what steps they should include. In the near future safety reports will deal with the encapsulation plant and the repository. Later reports will treat operation of the handling systems and the repository.

  4. Extending the formal model of a spatial data infrastructure to include volunteered geographical information

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2011-07-01

    ... an aggregator of VGI, such as Ushahidi, and the provider of the infrastructure for collecting VGI, such as OpenStreetMap. 3) Broker: A stakeholder who brings End Users and Providers together and assists in the negotiation of contracts between them ...

  5. QCD Equation of State From a Chiral Hadronic Model Including Quark Degrees of Freedom

    CERN Document Server

    Rau, Philip; Schramm, Stefan; Stöcker, Horst

    2013-01-01

    This work presents an effective model for strongly interacting matter and the QCD equation of state (EoS). The model includes both hadron and quark degrees of freedom and takes into account the transition of chiral symmetry restoration as well as the deconfinement phase transition. At low temperatures $T$ and baryonic densities $\rho_B$ a hadron resonance gas is described using a SU(3)-flavor sigma-omega model, and a quark phase is introduced in analogy to PNJL models for higher $T$ and $\rho_B$. In this way, the correct asymptotic degrees of freedom are used in a wide range of $T$ and $\rho_B$. Here, results of this model concerning the chiral and deconfinement phase transitions and thermodynamic model properties are presented. Large hadron resonance multiplicities in the transition region emphasize the importance of heavy-mass resonance states in this region and their impact on the chiral transition behavior. The resulting phase diagram of QCD matter at small chemical potentials is in line with latest lattic...

  6. A full model for simulation of electrochemical cells including complex behavior

    Science.gov (United States)

    Esperilla, J. J.; Félez, J.; Romero, G.; Carretero, A.

    This communication presents a model of electrochemical cells developed in order to simulate their electrical, chemical and thermal behavior, showing the differences when thermal effects are or are not considered in the charge-discharge process. The work presented here has been applied to the particular case of the Pb,PbSO4|H2SO4(aq)|PbO2,Pb cell, which forms the basis of the lead-acid batteries so widely used in the automotive industry and as traction batteries in electric or hybrid vehicles. Each half-cell is considered independently in the model. For each half-cell, in addition to the main electrode reaction, a secondary reaction is considered: the hydrogen evolution reaction in the negative electrode and the oxygen evolution reaction in the positive. The equilibrium potential is calculated with the Nernst equation, in which the activity coefficients are fitted to an exponential function using experimental data. On the other hand, the two main mechanisms that produce the overpotential are considered, that is, the activation (charge transfer) and diffusion mechanisms. First, an isothermal model was studied in order to show the behavior of the main phenomena. A more complex model including thermal behavior was also studied. This model is very useful in the case of traction batteries in electric and hybrid vehicles, where high current intensities appear. Some simulation results are also presented in order to show the accuracy of the proposed models.
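    The equilibrium-potential step mentioned above is the standard Nernst relation; the sketch below uses a plain reaction quotient in place of the paper's fitted activity coefficients, and the numbers are illustrative.

      import math

      def nernst(e0_volts, n_electrons, reaction_quotient, temp_k=298.15):
          # Nernst equation: E = E0 - (R*T / (n*F)) * ln(Q)
          R, F = 8.314462618, 96485.33212
          return e0_volts - (R * temp_k / (n_electrons * F)) * math.log(reaction_quotient)

      print(nernst(e0_volts=1.685, n_electrons=2, reaction_quotient=0.2))  # illustrative values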

  7. A High-Rate, Single-Crystal Model including Phase Transformations, Plastic Slip, and Twinning

    Energy Technology Data Exchange (ETDEWEB)

    Addessio, Francis L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Theoretical Division; Bronkhorst, Curt Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Theoretical Division; Bolme, Cynthia Anne [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Explosive Science and Shock Physics Division; Brown, Donald William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Materials Science and Technology Division; Cerreta, Ellen Kathleen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Materials Science and Technology Division; Lebensohn, Ricardo A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Materials Science and Technology Division; Lookman, Turab [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Theoretical Division; Luscher, Darby Jon [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Theoretical Division; Mayeur, Jason Rhea [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Theoretical Division; Morrow, Benjamin M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Materials Science and Technology Division; Rigg, Paulo A. [Washington State Univ., Pullman, WA (United States). Dept. of Physics. Inst. for Shock Physics

    2016-08-09

    An anisotropic, rate-dependent, single-crystal approach for modeling materials under the conditions of high strain rates and pressures is provided. The model includes the effects of large deformations, nonlinear elasticity, phase transformations, and plastic slip and twinning. It is envisioned that the model may be used to examine these coupled effects on the local deformation of materials that are subjected to ballistic impact or explosive loading. The model is formulated using a multiplicative decomposition of the deformation gradient. A plate impact experiment on a multi-crystal sample of titanium was conducted. The particle velocities at the back surface of three crystal orientations relative to the direction of impact were measured. Molecular dynamics simulations were conducted to investigate the details of the high-rate deformation and pursue issues related to the phase transformation for titanium. Simulations using the single crystal model were conducted and compared to the high-rate experimental data for the impact loaded single crystals. The model was found to capture the features of the experiments.

  8. Adaptive workflow simulation of emergency response

    NARCIS (Netherlands)

    Bruinsma, Guido Wybe Jan

    2010-01-01

    Recent incidents and major training exercises in and outside the Netherlands have persistently shown that not having or not sharing information during emergency response are major sources of emergency response inefficiency and error, and affect incident mitigation outcomes through workflow planning

  9. Workflow Automation: A Collective Case Study

    Science.gov (United States)

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  10. Beyond Scientific Workflows: Networked Open Processes

    NARCIS (Netherlands)

    Cushing, R.; Bubak, M.; Belloum, A.; de Laat, C.

    2013-01-01

    The multitude of scientific services and processes being developed brings about challenges for future in silico distributed experiments. Choosing the correct service from an expanding body of processes means that the task of manually building workflows is becoming untenable. In this paper we pro

  12. Using workflow for projects in higher education

    NARCIS (Netherlands)

    van der Veen, Johan (CTIT); Jones, Valerie M.; Collis, Betty

    2000-01-01

    The WWW is increasingly used as a medium to support education and training. A course at the University of Twente in which groups of students collaborate in the design and production of multimedia instructional materials has now been supported by a website since 1995. Workflow was integrated with

  13. Building Digital Audio Preservation Infrastructure and Workflows

    Science.gov (United States)

    Young, Anjanette; Olivieri, Blynne; Eckler, Karl; Gerontakos, Theodore

    2010-01-01

    In 2009 the University of Washington (UW) Libraries special collections received funding for the digital preservation of its audio indigenous language holdings. The university libraries, where the authors work in various capacities, had begun digitizing image and text collections in 1997. Because of this, at the onset of the project, workflows (a…

  14. On Secure Workflow Decentralisation on the Internet

    Directory of Open Access Journals (Sweden)

    Petteri Kaskenpalo

    2010-06-01

    Decentralised workflow management systems are a new research area, where most work to date has focused on the system's overall architecture. As little attention has been given to the security aspects of such systems, we follow a security-driven approach and consider, from the perspective of available security building blocks, how security can be implemented and what new opportunities are presented when empowering the decentralised environment with modern distributed security protocols. Our research is motivated by the more general question of how to combine the positive enablers that email exchange enjoys with the general benefits of workflow systems, and more specifically with the benefits that can be introduced in a decentralised environment. This aims to equip email users with a set of tools to manage the semantics of a message exchange, its contents, participants and their roles in the exchange, in an environment that provides inherent assurances of security and privacy. This work is based on a survey of contemporary distributed security protocols and considers how these protocols could be used in implementing a distributed workflow management system with decentralised control. We review a set of these protocols, focusing on the required message sequences, and discuss how these security protocols provide the foundations for implementing core control-flow, data, and resource patterns in a distributed workflow environment.

  16. A 3D model of the oculomotor plant including the pulley system

    Energy Technology Data Exchange (ETDEWEB)

    Viegener, A; Armentano, R L [Fundacion Universitaria Dr. Rene G. Favaloro, SolIs 453 (1078) Buenos Aires (Argentina)

    2007-11-15

    Early models of the oculomotor plant only considered the eye globes and the muscles that move them. Recently, connective tissue structures have been found enveloping the extraocular muscles (EOMs) and firmly anchored to the orbital wall. These structures act as pulleys; they determine the functional origin of the EOMs and, in consequence, their effective pulling direction. A three-dimensional model of the oculomotor plant, including pulleys, has been developed, and simulations of saccadic eye movements were performed in Simulink. Listing's law was implemented based on the supposition that there exists an eye-orientation-related signal. The inclusion of the pulleys in the model makes this assumption plausible and simplifies the problem of the plant's noncommutativity.

  17. A flexible and qualitatively stable model for cell cycle dynamics including DNA damage effects.

    Science.gov (United States)

    Jeffries, Clark D; Johnson, Charles R; Zhou, Tong; Simpson, Dennis A; Kaufmann, William K

    2012-01-01

    This paper includes a conceptual framework for cell cycle modeling into which the experimenter can map observed data and evaluate mechanisms of cell cycle control. The basic model exhibits qualitative stability, meaning that regardless of magnitudes of system parameters its instances are guaranteed to be stable in the sense that all feasible trajectories converge to a certain trajectory. Qualitative stability can also be described by the signs of real parts of eigenvalues of the system matrix. On the biological side, the resulting model can be tuned to approximate experimental data pertaining to human fibroblast cell lines treated with ionizing radiation, with or without disabled DNA damage checkpoints. Together these properties validate a fundamental, first order systems view of cell dynamics. Classification Codes: 15A68.
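    The eigenvalue criterion the abstract refers to is straightforward to check numerically for any candidate system matrix; the matrix below is arbitrary and is not the cell cycle model.

      import numpy as np

      def is_linearly_stable(A):
          # Stable in the linear sense if every eigenvalue has a negative real part
          return bool(np.all(np.linalg.eigvals(A).real < 0))

      A = np.array([[-1.0, 0.5],
                    [-0.2, -0.8]])   # arbitrary example matrix
      print(is_linearly_stable(A))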

  18. RELAP5-3D Code Includes Athena Features and Models

    Energy Technology Data Exchange (ETDEWEB)

    Richard A. Riemke; Cliff B. Davis; Richard R. Schultz

    2006-07-01

    Version 2.3 of the RELAP5-3D computer program includes all features and models previously available only in the ATHENA version of the code. These include the addition of new working fluids (i.e., ammonia, blood, carbon dioxide, glycerol, helium, hydrogen, lead-bismuth, lithium, lithium-lead, nitrogen, potassium, sodium, and sodium-potassium) and a magnetohydrodynamic model that expands the capability of the code to model many more thermal-hydraulic systems. In addition to the new working fluids, along with the standard working fluid water, one or more noncondensable gases (e.g., air, argon, carbon dioxide, carbon monoxide, helium, hydrogen, krypton, nitrogen, oxygen, SF6, xenon) can be specified as part of the vapor/gas phase of the working fluid. These noncondensable gases were in previous versions of RELAP5-3D. Recently four molten salts have been added as working fluids to RELAP5-3D Version 2.4, which has had limited release. These molten salts will be in RELAP5-3D Version 2.5, which will have a general release like RELAP5-3D Version 2.3. Applications that use these new features and models are discussed in this paper.

  19. A Web application for the management of clinical workflow in image-guided and adaptive proton therapy for prostate cancer treatments.

    Science.gov (United States)

    Yeung, Daniel; Boes, Peter; Ho, Meng Wei; Li, Zuofeng

    2015-05-08

    Image-guided radiotherapy (IGRT), based on radiopaque markers placed in the prostate gland, was used for proton therapy of prostate patients. Orthogonal X-rays and the IBA Digital Image Positioning System (DIPS) were used for setup correction prior to treatment and were repeated after treatment delivery. Following a rationale for margin estimates similar to that of van Herk,(1) the daily post-treatment DIPS data were analyzed to determine if an adaptive radiotherapy plan was necessary. A Web application using ASP.NET MVC5, Entity Framework, and an SQL database was designed to automate this process. The designed features included state-of-the-art Web technologies, a domain model closely matching the workflow, a database-supporting concurrency and data mining, access to the DIPS database, secured user access and roles management, and graphing and analysis tools. The Model-View-Controller (MVC) paradigm allowed clean domain logic, unit testing, and extensibility. Client-side technologies, such as jQuery, jQuery Plug-ins, and Ajax, were adopted to achieve a rich user environment and fast response. Data models included patients, staff, treatment fields and records, correction vectors, DIPS images, and association logics. Data entry, analysis, workflow logics, and notifications were implemented. The system effectively modeled the clinical workflow and IGRT process.
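    The margin rationale attributed to van Herk(1) is commonly summarized as M = 2.5*Sigma + 0.7*sigma, with Sigma the systematic and sigma the random setup error. The sketch below is a crude single-patient adaptation of that recipe, not the clinic's actual trigger logic; the shift data and threshold are invented.

      import statistics

      def van_herk_margin_mm(sigma_systematic_mm, sigma_random_mm):
          # Population margin recipe: M = 2.5*Sigma + 0.7*sigma
          return 2.5 * sigma_systematic_mm + 0.7 * sigma_random_mm

      def needs_adaptive_plan(daily_shifts_mm, planned_margin_mm):
          # Hypothetical trigger: replan when the implied margin exceeds the planned one
          sigma_sys = abs(statistics.mean(daily_shifts_mm))
          sigma_rand = statistics.stdev(daily_shifts_mm)
          return van_herk_margin_mm(sigma_sys, sigma_rand) > planned_margin_mm

      print(needs_adaptive_plan([1.2, -0.5, 2.1, 1.8, 0.9], planned_margin_mm=5.0))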

  20. RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service

    Science.gov (United States)

    Yang, Chao; Chen, Nengcheng; Di, Liping

    2012-10-01

    Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate the distributed sensor Web services to meet the requirement of a complex Earth observation scenario. A RESTFul based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTFul based workflows interoperation system can describe, publish, discover, access and coordinate heterogeneous Geoprocessing workflows.

  1. Including policy and management in socio-hydrology models: initial conceptualizations

    Science.gov (United States)

    Hermans, Leon; Korbee, Dorien

    2017-04-01

    Socio-hydrology studies the interactions in coupled human-water systems. So far, the use of dynamic models that capture the direct feedback between societal and hydrological systems has been dominant. What has not yet been included with any particular emphasis is the policy or management layer, which is a central element in, for instance, integrated water resources management (IWRM) or adaptive delta management (ADM). Studying the direct interactions between human and water systems generates knowledge that eventually helps influence these interactions in ways that may ensure better outcomes, for society and for the health and sustainability of water systems. This influence sometimes occurs through spontaneous emergence, uncoordinated by societal agents: private sector, citizens, consumers, water users. However, the term 'management' in IWRM and ADM also implies an additional coordinated attempt through various public actors. This contribution is a call to include the policy and management dimension more prominently in the research focus of the socio-hydrology field, and offers first conceptual variables that should be considered in attempts to include this policy or management layer in socio-hydrology models. This is done by drawing on existing frameworks for studying policy processes throughout both planning and implementation phases. These include frameworks such as the advocacy coalition framework, collective learning, and policy arrangements, which all emphasize longer-term dynamics and feedbacks between actor coalitions in strategic planning and implementation processes. A case about longer-term dynamics in the management of the Haringvliet in the Netherlands is used to illustrate the paper.

  2. EXACT SOLUTIONS FOR NONLINEAR TRANSIENT FLOW MODEL INCLUDING A QUADRATIC GRADIENT TERM

    Institute of Scientific and Technical Information of China (English)

    曹绪龙; 同登科; 王瑞和

    2004-01-01

    The models of nonlinear radial flow for infinite and finite reservoirs including a quadratic gradient term were presented. The exact solution was given in real space for the flow equation including the quadratic gradient term, for both constant-rate and constant-pressure production cases in an infinite system, by using the generalized Weber transform. Analytical solutions for the flow equation including the quadratic gradient term were also obtained by using the Hankel transform for a finite circular reservoir case. Both closed and constant-pressure outer boundary conditions are considered, as are both constant-rate and constant-pressure inner boundary conditions. The difference between the nonlinear pressure solution and the linear pressure solution is analyzed; it may reach about 8% at long times. The effect of the quadratic gradient term in long-time well tests is considered.
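    In standard notation, a radial diffusivity equation that retains the quadratic gradient term has the general form below; this is written from the general literature on the problem, not copied from the paper, and dropping the beta term recovers the usual linear equation:

      \frac{\partial^2 p}{\partial r^2} + \frac{1}{r}\frac{\partial p}{\partial r}
        + \beta \left( \frac{\partial p}{\partial r} \right)^2
        = \frac{1}{\eta}\,\frac{\partial p}{\partial t}

    where p is pressure, r the radial coordinate, \beta the quadratic-gradient coefficient, and \eta the hydraulic diffusivity.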

  3. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronny S.; van der Aalst, Willibrordus Martinus Pancratius; Molemann, A.J.;

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and a Colored Workflow Net (CWN) are used to close the gap between the given requirements specification and the realization of these requirements with the help of a workflow system. This paper describes a large case study where the diagnostic trajectory ...

  4. Integrated exploration workflow in the south Middle Magdalena Valley (Colombia)

    Science.gov (United States)

    Moretti, Isabelle; Charry, German Rodriguez; Morales, Marcela Mayorga; Mondragon, Juan Carlos

    2010-03-01

    HC exploration is presently active in the southern part of the Middle Magdalena Valley, but only moderate-size discoveries have been made to date. The majority of these discoveries are at shallow depth in the Tertiary section. The structures located in the Valley are faulted anticlines charged by lateral migration from the Cretaceous source rocks that are assumed to be present and mature eastward below the main thrusts and the Guaduas Syncline. Upper Cretaceous reservoirs have also been positively tested. To reduce the risks linked to the exploration of deeper structures below the western thrusts of the Eastern Cordillera, an integrated study was carried out. It includes the acquisition of new seismic data, the integration of all surface and subsurface data within a 3D geomodel, a quality control of the structural model by restoration, and a modeling of the petroleum system (presence and maturity of the Cretaceous source rocks, potential migration pathways). The various steps of this workflow are presented, as well as the main conclusions in terms of source rock, deformation phases, and timing of the thrust emplacement versus oil maturation and migration. Our data suggest (or confirm): the good potential of the Umir Fm as a source rock; the early (Paleogene) deformation of the Bituima Trigo fault area; the maturity gap within the Cretaceous source rock between the hangingwall and footwall of the Bituima fault, which proves an initial offset of Cretaceous burial in the range of 4.5 km between the Upper Cretaceous series westward and the Lower Cretaceous ones eastward of this fault zone; and the post-Miocene weak reactivation, as dextral strike-slip, of Cretaceous faults such as the San Juan de Rio Seco fault, which corresponds to changes in the Cretaceous thickness and therefore in the depth of the thrust decollement.

  5. SPheno 3.1: extensions including flavour, CP-phases and models beyond the MSSM

    Science.gov (United States)

    Porod, W.; Staub, F.

    2012-11-01

    We describe recent extensions of the program SPheno, including flavour aspects, CP-phases, R-parity violation and low energy observables. In the case of flavour mixing, all masses of supersymmetric particles are calculated including the complete flavour structure and all possible CP-phases at the 1-loop level. We give details on implemented seesaw models, low energy observables and the corresponding extension of the SUSY Les Houches Accord. Moreover, we comment on the possibilities to include MSSM extensions in SPheno. Catalogue identifier: ADRV_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADRV_v2_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 154062 No. of bytes in distributed program, including test data, etc.: 1336037 Distribution format: tar.gz Programming language: Fortran95. Computer: PC running under Linux, should run in every Unix environment. Operating system: Linux, Unix. Classification: 11.6. Catalogue identifier of previous version: ADRV_v1_0 Journal reference of previous version: Comput. Phys. Comm. 153(2003)275 Does the new version supersede the previous version?: Yes Nature of problem: The first issue is the determination of the masses and couplings of supersymmetric particles in various supersymmetric models, the R-parity conserved MSSM with generation mixing and including CP-violating phases, various seesaw extensions of the MSSM and the MSSM with bilinear R-parity breaking. Low energy data on Standard Model fermion masses, gauge couplings and electroweak gauge boson masses serve as constraints. Radiative corrections from supersymmetric particles to these inputs must be calculated. Theoretical constraints on the soft SUSY breaking parameters from a high scale theory are imposed and the parameters at the electroweak scale are obtained from the

  6. DISPLAY-2: a two-dimensional shallow layer model for dense gas dispersion including complex features.

    Science.gov (United States)

    Venetsanos, A G; Bartzis, J G; Würtz, J; Papailiou, D D

    2003-04-25

    A two-dimensional shallow layer model has been developed to predict dense gas dispersion under realistic conditions, including complex features such as two-phase releases, obstacles and inclined ground. The model attempts to predict the time and space evolution of the cloud formed after a release of a two-phase pollutant into the atmosphere. The air-pollutant mixture is assumed ideal. The cloud evolution is described mathematically through the Cartesian, two-dimensional, shallow layer conservation equations for mixture mass, mixture momentum in two horizontal directions, total pollutant mass fraction (vapor and liquid) and mixture internal energy. The liquid mass fraction is obtained assuming phase equilibrium. The conservation equations account for liquid slip and eventual liquid rainout through the ground. Entrainment of ambient air is modeled via an entrainment velocity model, which takes into account the effects of ground friction, ground heat transfer and relative motion between cloud and surrounding atmosphere. The model additionally accounts for thin obstacle effects in three ways. First, a stepwise description of the obstacle is generated, following the grid cell faces and taking into account the corresponding area blockage. Then, obstacle drag on the passing cloud is modeled by adding flow resistance terms to the momentum equations. Finally, the effect of extra vorticity generation and entrainment enhancement behind obstacles is modeled by locally adding, to the obstacle-free entrainment formula, a characteristic velocity scale defined from the obstacle pressure drop and the local cloud height. The present model predictions have been compared against theoretical results for constant volume and constant flux gravity currents. It was found that deviations of the predicted change of cloud footprint area with time from the theoretical results were acceptably small, if one models the frictional forces between cloud and ambient air, neglecting the Richardson
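
    The exact equation set is given in the paper, not the abstract; as an orientation only, a generic shallow-layer mass balance with air entrainment has the form

    \[
    \frac{\partial(\rho h)}{\partial t} + \nabla\cdot(\rho h\,\mathbf{u}) = \rho_a u_e,
    \]

    where \(h\) is the local cloud height, \(\mathbf{u}\) the depth-averaged horizontal velocity, \(\rho_a\) the ambient air density, and \(u_e\) the entrainment velocity; analogous balances hold for momentum, pollutant mass fraction, and internal energy.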

  7. Sensitivity of an atmospheric photochemistry model to chlorine perturbations including consideration of uncertainty propagation

    Science.gov (United States)

    Stolarski, R. S.; Douglass, A. R.

    1986-01-01

    Models of stratospheric photochemistry are generally tested by comparing their predictions for the composition of the present atmosphere with measurements of species concentrations. These models are then used to make predictions of the atmospheric sensitivity to perturbations. Here the problem of the sensitivity of such a model to chlorine perturbations, ranging from the present influx of chlorine-containing compounds to several times that influx, is addressed. The effects of uncertainties in input parameters, including reaction rate coefficients, cross sections, solar fluxes, and boundary conditions, are evaluated using a Monte Carlo method in which the values of the input parameters are randomly selected. The results are probability distributions for present atmospheric concentrations and for calculated perturbations due to chlorine from fluorocarbons. For more than 300 Monte Carlo runs, the calculated ozone perturbation for continued emission of fluorocarbons at today's rates had a mean value of -6.2 percent, with a 1-sigma width of 5.5 percent. Using the same runs, but keeping only the cases in which the calculated present-atmosphere values of NO, NO2, and ClO at 25 km altitude fell within the range of measurements, yielded a mean ozone depletion of -3 percent, with a 1-sigma deviation of 2.2 percent. The model showed a nonlinear behavior as a function of added fluorocarbons. The mean of the Monte Carlo runs was less nonlinear than the model run using the mean values of the input parameters.
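
    The uncertainty propagation described here is a plain Monte Carlo over the input parameters; a minimal sketch of the idea, with a stand-in model function and hypothetical sampling choices (the real model is a full photochemistry code):

```python
# Monte Carlo propagation of input-parameter uncertainty (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)

def ozone_perturbation(rates):
    """Stand-in for the photochemistry model: maps a vector of sampled
    rate coefficients to an ozone change in percent (hypothetical)."""
    return -6.0 * rates.mean()

n_runs = 300
samples = rng.lognormal(mean=0.0, sigma=0.3, size=(n_runs, 10))
results = np.array([ozone_perturbation(s) for s in samples])
print(f"mean = {results.mean():.1f}%, 1-sigma = {results.std():.1f}%")
# Filtering runs by agreement with measured NO, NO2 and ClO at 25 km would
# simply restrict `results` to the accepted subset before the statistics.
```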

  8. Health Promotion Behavior of Chinese International Students in Korea Including Acculturation Factors: A Structural Equation Model.

    Science.gov (United States)

    Kim, Sun Jung; Yoo, Il Young

    2016-03-01

    The purpose of this study was to explain the health promotion behavior of Chinese international students in Korea using a structural equation model including acculturation factors. A survey using self-administered questionnaires was employed. Data were collected from 272 Chinese students who had resided in Korea for longer than 6 months. The data were analyzed using structural equation modeling. The p value of the final model is .31. The fit indices of the final model, such as the goodness of fit index, adjusted goodness of fit index, normed fit index, non-normed fit index, and comparative fit index, were all above .95. The root mean square residual and root mean square error of approximation also met their criteria. Self-esteem, perceived health status, acculturative stress and acculturation level had direct effects on the health promotion behavior of the participants, and the model explained 30.0% of the variance. Chinese students in Korea with higher self-esteem, perceived health status, and acculturation level, and with lower acculturative stress, reported more health promotion behavior. The findings can be applied to develop health promotion strategies for this population.
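
    Among the fit indices reported here, RMSEA has a simple closed form; a small sketch using the standard formula (the chi-square and degrees of freedom below are hypothetical; N = 272 matches the sample size):

```python
# RMSEA from a model chi-square (standard formula; chi2 and df below are
# hypothetical values, not the study's).
import math

def rmsea(chi2, df, n):
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

print(f"RMSEA = {rmsea(chi2=45.0, df=40, n=272):.3f}")
```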

  9. S5-4: Formal Modeling of Affordance in Human-Included Systems

    Directory of Open Access Journals (Sweden)

    Namhun Kim

    2012-10-01

    Although modeling, analysis, and control of human-included systems are necessary, they have been considered challenging problems because of the critical role of humans in complex systems and because of humans' capability of executing unanticipated actions, both beneficial and detrimental. Thus, to provide systematic approaches to modeling human actions as a part of system behaviors, a formal modeling framework is presented for human-involved systems in which humans play a controlling role based on their perceptual information. The theory of affordance provides definitions of human actions and their associated properties; Finite State Automata (FSA) based modeling is capable of mapping nondeterministic humans into computable components in the system representation. In this talk, we investigate the role of perception in human actions in the system operation and examine the representation of perceptual elements in the affordance-based modeling formalism. The proposed framework is expected to capture the natural ways in which humans participate in the system as part of its operation. A human-machine cooperative manufacturing system control example and a human agent simulation example are introduced for illustrative purposes at the end of the presentation.
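
    An FSA maps human actions into computable components by restricting transitions to those the current state affords; a minimal sketch in that spirit (states and affordance-style labels are hypothetical, not the talk's model):

```python
# Minimal finite state automaton (illustrative sketch). Transitions are
# labeled by perceived affordances, e.g. "pick_up_able".
transitions = {
    ("idle", "part_arrived"): "inspecting",
    ("inspecting", "pick_up_able"): "handling",
    ("handling", "placed"): "idle",
}

def run(start, actions):
    state = start
    for a in actions:
        # actions the current state does not afford leave the state unchanged
        state = transitions.get((state, a), state)
    return state

print(run("idle", ["part_arrived", "pick_up_able", "placed"]))  # -> idle
```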

  10. An extended gene protein/products Boolean network model including post-transcriptional regulation.

    Science.gov (United States)

    Benso, Alfredo; Di Carlo, Stefano; Politano, Gianfranco; Savino, Alessandro; Vasciaveo, Alessandro

    2014-05-07

    Network biology allows the study of complex interactions between biological systems using formal, well structured, and computationally friendly models. Several different network models can be created, depending on the type of interactions that need to be investigated. Gene Regulatory Networks (GRN) are an effective model commonly used to study the complex regulatory mechanisms of a cell. Unfortunately, given their intrinsic complexity and non-discrete nature, the computational study of realistic-sized complex GRNs requires some abstraction. Boolean Networks (BNs), for example, are a reliable model that can be used to represent networks where the possible state of a node is a boolean value (0 or 1). Despite this strong simplification, BNs have been used to study both structural and dynamic properties of real as well as randomly generated GRNs. In this paper we show how it is possible to include the post-transcriptional regulation mechanism (a key process mediated by small non-coding RNA molecules like the miRNAs) in the BN model of a GRN. The enhanced BN model is implemented in a software toolkit (EBNT) that allows the analysis of boolean GRNs from both a structural and a dynamic point of view. The open-source toolkit is compatible with available visualization tools like Cytoscape and allows detailed analysis of the network topology as well as of its attractors, trajectories, and state-space. In the paper, a small GRN built around the mTOR gene is used to demonstrate the main capabilities of the toolkit. The extended model proposed in this paper opens new opportunities in the study of gene regulation. Several of the successful studies that used BNs to understand high-level characteristics of regulatory networks can now be improved to better understand the role of post-transcriptional regulation, for example as a network-wide noise-reduction or stabilization mechanism.
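
    A BN assigns each node a 0/1 state updated by a logic rule over its regulators; a minimal synchronous-update sketch, where a miRNA-like node represses its target to mimic post-transcriptional regulation (the toy rules are hypothetical, not the paper's mTOR network):

```python
# Synchronous Boolean network update (illustrative sketch).
rules = {
    "gene_a": lambda s: s["gene_b"] or s["gene_c"],
    "gene_b": lambda s: s["gene_a"] and not s["mirna_x"],  # miRNA repression
    "gene_c": lambda s: not s["gene_a"],
    "mirna_x": lambda s: s["gene_c"],
}

def step(state):
    return {node: int(rule(state)) for node, rule in rules.items()}

state = {"gene_a": 1, "gene_b": 0, "gene_c": 0, "mirna_x": 0}
seen = []
while state not in seen:      # the finite state space guarantees a repeat
    seen.append(state)
    state = step(state)
print("attractor entry:", state)
```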

  11. An extended gene protein/products boolean network model including post-transcriptional regulation

    Science.gov (United States)

    2014-01-01

    Background: Network biology allows the study of complex interactions between biological systems using formal, well structured, and computationally friendly models. Several different network models can be created, depending on the type of interactions that need to be investigated. Gene Regulatory Networks (GRN) are an effective model commonly used to study the complex regulatory mechanisms of a cell. Unfortunately, given their intrinsic complexity and non-discrete nature, the computational study of realistic-sized complex GRNs requires some abstraction. Boolean Networks (BNs), for example, are a reliable model that can be used to represent networks where the possible state of a node is a boolean value (0 or 1). Despite this strong simplification, BNs have been used to study both structural and dynamic properties of real as well as randomly generated GRNs. Results: In this paper we show how it is possible to include the post-transcriptional regulation mechanism (a key process mediated by small non-coding RNA molecules like the miRNAs) in the BN model of a GRN. The enhanced BN model is implemented in a software toolkit (EBNT) that allows the analysis of boolean GRNs from both a structural and a dynamic point of view. The open-source toolkit is compatible with available visualization tools like Cytoscape and allows detailed analysis of the network topology as well as of its attractors, trajectories, and state-space. In the paper, a small GRN built around the mTOR gene is used to demonstrate the main capabilities of the toolkit. Conclusions: The extended model proposed in this paper opens new opportunities in the study of gene regulation. Several of the successful studies that used BNs to understand high-level characteristics of regulatory networks can now be improved to better understand the role of post-transcriptional regulation, for example as a network-wide noise-reduction or stabilization mechanism. PMID:25080304

  12. Database Support for Workflow Management: The WIDE Project

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Pernici, B.; Sánchez, G.

    1999-01-01

    Database Support for Workflow Management: The WIDE Project presents the results of the ESPRIT WIDE project on advanced database support for workflow management. The book discusses the state of the art in combining database management and workflow management technology, especially in the areas of

  13. LHCb: LHCbDirac is a DIRAC extension to support LHCb specific workflows

    CERN Multimedia

    Stagni, Federico

    2012-01-01

    We present LHCbDIRAC, an extension of the DIRAC community Grid solution that handles the specific needs of LHCb. The DIRAC software was developed for many years within LHCb only. Nowadays it is generic software, used by many scientific communities worldwide. Each community wanting to take advantage of DIRAC has to develop an extension containing all the code necessary for handling its specific cases. LHCbDIRAC is an actively developed extension, implementing the LHCb computing model and workflows. LHCbDIRAC extends DIRAC to handle all the distributed computing activities of LHCb. Such activities include real data processing (reconstruction, stripping and streaming), Monte Carlo simulation and data replication. Other activities are group and user analysis, data management, resources management and monitoring, data provenance, and accounting for user and production jobs. LHCbDIRAC also provides extensions of the DIRAC interfaces, including a secure web client, python APIs and CLIs. While DIRAC and LHCbDIRAC f...

  14. Complexity Metrics for Workflow Nets

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    Process modeling languages such as EPCs, BPMN, flow charts, UML activity diagrams, Petri nets, etc., are used to model business processes and to configure process-aware information systems. It is known that users have problems understanding these diagrams; in fact, even process engineers and system analysts... It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined...
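
    The abstract does not define the three metrics; as a purely illustrative sketch, one simple control-flow complexity measure counts the split and join connectors of a net (an assumption for illustration, not necessarily one of the paper's metrics):

```python
# Counting split/join connectors as a crude control-flow complexity measure
# (illustrative only). A net is given as transition -> (inputs, outputs).
net = {
    "t1": (["start"], ["p1", "p2"]),   # AND-split
    "t2": (["p1"], ["p3"]),
    "t3": (["p2"], ["p4"]),
    "t4": (["p3", "p4"], ["end"]),     # AND-join
}

splits = sum(1 for _, outs in net.values() if len(outs) > 1)
joins = sum(1 for ins, _ in net.values() if len(ins) > 1)
print(f"splits={splits}, joins={joins}, connector count={splits + joins}")
```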

  15. General hypothesis and shell model for the synthesis of semiconductor nanotubes, including carbon nanotubes

    Science.gov (United States)

    Mohammad, S. Noor

    2010-09-01

    Semiconductor nanotubes, including carbon nanotubes, have vast potential for new technology development. The fundamental physics and growth kinetics of these nanotubes are still poorly understood. Various models developed to elucidate the growth suffer from limited applicability. An in-depth investigation of the fundamentals of nanotube growth has, therefore, been carried out. For this investigation, various features of nanotube growth, and the role of the foreign element catalytic agent (FECA) in this growth, have been considered. Observed growth anomalies have been analyzed. Based on this analysis, a new shell model and a general hypothesis have been proposed for the growth. The essential element of the shell model is the seed generated from segregation during growth. The seed structure has been defined, and the formation of a droplet from this seed has been described. A modified definition of the droplet, exhibiting adhesive properties, has also been presented. Various characteristics of the droplet, required for alignment and organization of atoms into tubular forms, have been discussed. Employing the shell model, plausible scenarios for the formation of carbon nanotubes, and the variation in the characteristics of these carbon nanotubes, have been articulated. Experimental evidence, for example for the formation of a shell around a core, the dipole characteristics of the seed, and the existence of nanopores in the seed, has been presented; it appears to justify the validity of the proposed model. The diversities of nanotube characteristics, the fundamentals underlying the creation of bamboo-shaped carbon nanotubes, and the impurity generation on the surface of carbon nanotubes have been elucidated. The catalytic action of FECA on growth has been quantified. The applicability of the proposed model to nanotube growth by a variety of mechanisms has been elaborated. These mechanisms include the vapor-liquid-solid mechanism, the oxide-assisted growth mechanism, the self

  16. Construction of biological networks from unstructured information based on a semi-automated curation workflow.

    Science.gov (United States)

    Szostak, Justyna; Ansari, Sam; Madan, Sumit; Fluck, Juliane; Talikka, Marja; Iskandar, Anita; De Leon, Hector; Hofmann-Apitius, Martin; Peitsch, Manuel C; Hoeng, Julia

    2015-06-17

    Capture and representation of scientific knowledge in a structured format are essential to improve the understanding of biological mechanisms involved in complex diseases. Biological knowledge and knowledge about standardized terminologies are difficult to capture from literature in a usable form. A semi-automated knowledge extraction workflow is presented that was developed to allow users to extract causal and correlative relationships from scientific literature and to transcribe them into the computable and human readable Biological Expression Language (BEL). The workflow combines state-of-the-art linguistic tools for recognition of various entities and extraction of knowledge from literature sources. Unlike most other approaches, the workflow outputs the results to a curation interface for manual curation and converts them into BEL documents that can be compiled to form biological networks. We developed a new semi-automated knowledge extraction workflow that was designed to capture and organize scientific knowledge and reduce the required curation skills and effort for this task. The workflow was used to build a network that represents the cellular and molecular mechanisms implicated in atherosclerotic plaque destabilization in an apolipoprotein-E-deficient (ApoE(-/-)) mouse model. The network was generated using knowledge extracted from the primary literature. The resultant atherosclerotic plaque destabilization network contains 304 nodes and 743 edges supported by 33 PubMed referenced articles. A comparison between the semi-automated and conventional curation processes showed similar results, but significantly reduced curation effort for the semi-automated process. Creating structured knowledge from unstructured text is an important step for the mechanistic interpretation and reusability of knowledge. Our new semi-automated knowledge extraction workflow reduced the curation skills and effort required to capture and organize scientific knowledge. The
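
    BEL statements encode causal edges of the form "X increases/decreases Y", and compiling them yields a directed network; a minimal sketch of that compilation step using networkx (the statements below are hypothetical, not taken from the paper's 304-node network):

```python
# Compiling BEL-like causal triples into a directed network
# (illustrative sketch; the statements are hypothetical).
import networkx as nx

statements = [
    ("p(MMP9)", "increases", "bp(plaque destabilization)"),
    ("p(TIMP1)", "decreases", "act(p(MMP9))"),
    ("a(oxLDL)", "increases", "p(MMP9)"),
]

g = nx.DiGraph()
for subj, rel, obj in statements:
    g.add_edge(subj, obj, relation=rel)

print(g.number_of_nodes(), "nodes,", g.number_of_edges(), "edges")
for u, v, d in g.edges(data=True):
    print(f"{u} -{d['relation']}-> {v}")
```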

  17. Analysis of electronic models for solar cells including energy resolved defect densities

    Energy Technology Data Exchange (ETDEWEB)

    Glitzky, Annegret

    2010-07-01

    We introduce an electronic model for solar cells including energy-resolved defect densities. The resulting drift-diffusion model corresponds to a generalized van Roosbroeck system with additional source terms, coupled with ODEs containing space and energy as parameters for all defect densities. The system has to be considered in heterostructures and with mixed boundary conditions from device simulation. We give a weak formulation of the problem. If the boundary data and the sources are compatible with thermodynamic equilibrium, the free energy along solutions decays monotonically. In other cases it may be increasing, but we estimate its growth. We establish boundedness and uniqueness results and prove the existence of a weak solution. This is done by considering a regularized problem, showing its solvability and the boundedness of its solutions independent of the regularization level.
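
    For orientation, the classical van Roosbroeck system that this model generalizes couples a Poisson equation for the electrostatic potential with carrier continuity equations (generic textbook form, shown as an assumption, not the paper's exact statement):

    \[
    -\nabla\cdot(\varepsilon\nabla\psi) = q\,(p - n + C), \qquad
    \frac{\partial n}{\partial t} - \frac{1}{q}\,\nabla\cdot J_n = -R, \qquad
    \frac{\partial p}{\partial t} + \frac{1}{q}\,\nabla\cdot J_p = -R,
    \]

    with drift-diffusion fluxes \(J_n = -q\mu_n n\nabla\psi + qD_n\nabla n\) and \(J_p = -q\mu_p p\nabla\psi - qD_p\nabla p\); the paper adds source terms and couples ODEs, parameterized by space and energy, for the defect densities.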

  18. Nonlinear Acoustics FDTD method including Frequency Power Law Attenuation for Soft Tissue Modeling

    CERN Document Server

    Jiménez, Noé; Sánchez-Morcillo, Víctor; Camarena, Francisco; Hou, Yi; Konofagou, Elisa E

    2014-01-01

    This paper describes a model for nonlinear acoustic wave propagation through absorbing and weakly dispersive media, and its numerical solution by means of the finite-difference time-domain (FDTD) method. The attenuation is based on multiple relaxation processes, and provides frequency-dependent absorption and dispersion without using computationally expensive convolutional operators. In this way, by using an optimization algorithm, the coefficients of the relaxation processes can be obtained in order to fit a frequency power law that agrees with the experimentally measured attenuation data for heterogeneous media over the typical frequency range for ultrasound medical applications. Our results show that two relaxation processes are enough to fit attenuation data for most soft tissues in this frequency range, including the fundamental and the first ten harmonics. Furthermore, this model can fit experimental attenuation data that do not follow exactly a frequency power law over the frequency range of interest. The main...
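
    The fitting step described here amounts to least-squares matching a sum of relaxation absorption terms to a power law alpha(f) = alpha0 * f^y; a minimal sketch under that assumption, with two relaxation processes and hypothetical tissue values (the functional form and parameters are assumptions, not the paper's):

```python
# Fitting two relaxation processes to a frequency power law attenuation
# (illustrative sketch; alpha0, y and the relaxation form are assumptions).
import numpy as np
from scipy.optimize import curve_fit

f = np.linspace(1e6, 11e6, 200)           # 1-11 MHz: fundamental + harmonics
alpha_target = 0.05 * (f / 1e6) ** 1.1    # hypothetical power law, Np/m

def relaxation_attenuation(f, a1, tau1, a2, tau2):
    w = 2 * np.pi * f
    return (a1 * w**2 * tau1 / (1 + (w * tau1)**2)
            + a2 * w**2 * tau2 / (1 + (w * tau2)**2))

p0 = [1e-9, 1e-7, 1e-9, 1e-8]             # rough initial guesses
popt, _ = curve_fit(relaxation_attenuation, f, alpha_target, p0=p0, maxfev=20000)
residual = np.max(np.abs(relaxation_attenuation(f, *popt) - alpha_target))
print("fitted params:", popt, "max residual:", residual)
```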

  19. Particle-based modeling of heterogeneous chemical kinetics including mass transfer

    Science.gov (United States)

    Sengar, A.; Kuipers, J. A. M.; van Santen, Rutger A.; Padding, J. T.

    2017-08-01

    Connecting the macroscopic world of continuous fields to the microscopic world of discrete molecular events is important for understanding several phenomena occurring at physical boundaries of systems. An important example is heterogeneous catalysis, where reactions take place at active surfaces, but the effective reaction rates are determined by transport limitations in the bulk fluid and reaction limitations on the catalyst surface. In this work we study the macro-micro connection in a model heterogeneous catalytic reactor by means of stochastic rotation dynamics. The model is able to resolve the convective and diffusive interplay between participating species, while including adsorption, desorption, and reaction processes on the catalytic surface. Here we apply the simulation methodology to a simple straight microchannel with a catalytic strip. Dimensionless Damkohler numbers are used to comment on the spatial concentration profiles of reactants and products near the catalyst strip and in the bulk. We end the discussion with an outlook on more complicated geometries and increasingly complex reactions.
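
    The Damkohler number mentioned here compares the surface reaction rate to the rate of diffusive transport; a minimal sketch of the estimate (the parameter values are hypothetical, not the paper's):

```python
# Damkohler number for a catalytic strip: Da = k_s * L / D
# (illustrative sketch; parameter values are hypothetical).
k_s = 1e-3   # surface reaction rate constant, m/s
L = 1e-4     # characteristic diffusion length (channel height), m
D = 1e-9     # reactant diffusivity, m^2/s

Da = k_s * L / D
regime = "transport-limited" if Da > 1 else "reaction-limited"
print(f"Da = {Da:.1f} -> {regime}")
```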

  20. Models of epidemics: when contact repetition and clustering should be included

    Directory of Open Access Journals (Sweden)

    Scholz Roland W

    2009-06-01

    Background: The spread of infectious disease is determined by biological factors, e.g. the duration of the infectious period, and social factors, e.g. the arrangement of potentially contagious contacts. Repetitiveness and clustering of contacts are known to be relevant factors influencing the transmission of droplet- or contact-transmitted diseases. However, we do not yet completely know under what conditions repetitiveness and clustering should be included to model disease spread realistically. Methods: We compare two different types of individual-based models: one assumes random mixing without repetition of contacts, whereas the other assumes that the same contacts repeat day by day. The latter exists in two variants, with and without clustering. We systematically test and compare how the total size of an outbreak differs between these model types depending on the key parameters transmission probability, number of contacts per day, duration of the infectious period, different levels of clustering and varying proportions of repetitive contacts. Results: The simulation runs under different parameter constellations provide the following results: the difference between both model types is highest for low numbers of contacts per day and low transmission probabilities. The number of contacts and the transmission probability have a higher influence on this difference than the duration of the infectious period. Even when only minor parts of the daily contacts are repetitive and clustered, there can be relevant differences compared to a purely random mixing model. Conclusion: We show that random mixing models provide acceptable estimates of the total outbreak size if the number of contacts per day is high or if the per-contact transmission probability is high, as seen in typical childhood diseases such as measles. In the case of very short infectious periods, for instance as in Norovirus, models assuming repeating contacts will also behave
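
    A minimal sketch contrasting the two model types compared here, daily contacts drawn at random versus a fixed, repeated contact set (toy parameters, not the paper's calibration; clustering is omitted):

```python
# Individual-based SIR with random vs. repeated daily contacts
# (illustrative sketch; parameters are toy values, not the paper's).
import random

def outbreak_size(n=500, contacts=5, p=0.05, inf_days=6, repeat=False, seed=1):
    random.seed(seed)
    partners = {i: random.sample([j for j in range(n) if j != i], contacts)
                for i in range(n)}          # fixed contact sets (repeat=True)
    status = {i: "S" for i in range(n)}
    status[0] = "I"
    days_left = {0: inf_days}
    while days_left:
        newly_infected = []
        for i in list(days_left):
            todays = partners[i] if repeat else random.sample(range(n), contacts)
            for j in todays:
                if status[j] == "S" and random.random() < p:
                    newly_infected.append(j)
            days_left[i] -= 1
            if days_left[i] == 0:
                del days_left[i]
                status[i] = "R"
        for j in newly_infected:
            if status[j] == "S":            # avoid double-counting
                status[j] = "I"
                days_left[j] = inf_days
    return sum(1 for s in status.values() if s != "S")

print("random mixing    :", outbreak_size(repeat=False))
print("repeated contacts:", outbreak_size(repeat=True))
```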