WorldWideScience

Sample records for bioinformatics process management

  1. Bioinformatics process management: information flow via a computational journal

    Directory of Open Access Journals (Sweden)

    Lushington Gerald

    2007-12-01

    This paper presents the Bioinformatics Computational Journal (BCJ), a framework for conducting and managing computational experiments in bioinformatics and computational biology. These experiments often involve series of computations, data searches, filters, and annotations which can benefit from a structured environment. Systems to manage computational experiments exist, ranging from libraries with standard data models to elaborate schemes to chain together input and output between applications. Yet, although such frameworks are available, their use is not widespread; ad hoc scripts are often required to bind applications together. The BCJ explores another solution to this problem through a computer-based environment suitable for on-site use, which builds on the traditional laboratory notebook paradigm. It provides an intuitive, extensible paradigm designed for expressive composition of applications. Extensive features facilitate sharing data, computational methods, and entire experiments. By focusing on the bioinformatics and computational biology domain, the scope of the computational framework was narrowed, permitting us to implement a capable set of features for this domain. This report discusses the features determined critical by our system and other projects, along with design issues. We illustrate the use of our implementation of the BCJ on two domain-specific examples.

  2. When process mining meets bioinformatics

    NARCIS (Netherlands)

    Jagadeesh Chandra Bose, R.P.; Aalst, van der W.M.P.; Nurcan, S.

    2011-01-01

    Process mining techniques can be used to extract non-trivial, process-related knowledge and thus generate interesting insights from event logs. Similarly, bioinformatics aims at increasing the understanding of biological processes through the analysis of information associated with biological…
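    As a minimal illustration of the kind of knowledge process mining extracts from event logs, the sketch below counts the directly-follows relation (which activity immediately follows which) per case. The event log structure and activity names are hypothetical stand-ins; a full discovery algorithm (e.g. the alpha miner) would build a process model on top of such counts.

        # Minimal sketch: derive a directly-follows relation from an event log.
        # The log layout (case id, activity, timestamp) is an illustrative assumption.
        from collections import Counter, defaultdict

        events = [  # (case_id, activity, timestamp)
            ("case1", "register sample", 1), ("case1", "run assay", 2), ("case1", "analyse", 3),
            ("case2", "register sample", 1), ("case2", "analyse", 2),
        ]

        traces = defaultdict(list)
        for case_id, activity, ts in sorted(events, key=lambda e: (e[0], e[2])):
            traces[case_id].append(activity)

        directly_follows = Counter()
        for trace in traces.values():
            for a, b in zip(trace, trace[1:]):
                directly_follows[(a, b)] += 1

        for (a, b), n in directly_follows.items():
            print(f"{a} -> {b}: {n}")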

  3. Bioinformatics

    DEFF Research Database (Denmark)

    Baldi, Pierre; Brunak, Søren

    …and medicine will be particularly affected by the new results and the increased understanding of life at the molecular level. Bioinformatics is the development and application of computer methods for analysis, interpretation, and prediction, as well as for the design of experiments. It has emerged...

  4. Exploiting graphics processing units for computational biology and bioinformatics.

    Science.gov (United States)

    Payne, Joshua L; Sinnott-Armstrong, Nicholas A; Moore, Jason H

    2010-09-01

    Advances in the video gaming industry have led to the production of low-cost, high-performance graphics processing units (GPUs) that possess more memory bandwidth and computational capability than central processing units (CPUs), the standard workhorses of scientific computing. With the recent release of general-purpose GPUs and NVIDIA's GPU programming language, CUDA, graphics engines are being adopted widely in scientific computing applications, particularly in the fields of computational biology and bioinformatics. The goal of this article is to concisely present an introduction to GPU hardware and programming, aimed at the computational biologist or bioinformaticist. To this end, we discuss the primary differences between GPU and CPU architecture, introduce the basics of the CUDA programming language, and discuss important CUDA programming practices, such as the proper use of coalesced reads, data types, and memory hierarchies. We highlight each of these topics in the context of computing the all-pairs distance between instances in a dataset, a common procedure in numerous disciplines of scientific computing. We conclude with a runtime analysis of the GPU and CPU implementations of the all-pairs distance calculation. We show our final GPU implementation to outperform the CPU implementation by a factor of 1700.
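    The all-pairs distance computation used as the running example above can be written compactly on the CPU with NumPy, as in the sketch below (assuming Euclidean distance and toy data sizes). A CUDA implementation would typically assign one output entry per thread and stage tiles of the data in shared memory to obtain the coalesced reads the article discusses.

        # Sketch of the all-pairs (Euclidean) distance matrix on the CPU with NumPy.
        # A GPU kernel parallelises the same computation, one matrix entry per thread.
        import numpy as np

        rng = np.random.default_rng(0)
        data = rng.random((1000, 20))              # 1000 instances x 20 features (toy sizes)

        # Broadcasting: pairwise differences have shape (n, n, d); sum over features.
        diff = data[:, None, :] - data[None, :, :]
        dist = np.sqrt((diff ** 2).sum(axis=-1))   # (n, n) distance matrix

        assert np.allclose(np.diag(dist), 0.0)     # distance of each instance to itself is 0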

  5. Graphics processing units in bioinformatics, computational biology and systems biology.

    Science.gov (United States)

    Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela

    2017-09-01

    Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), therefore limiting their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software, and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and the drawbacks in the use of these parallel architectures. The complete list of GPU-powered tools here reviewed is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.

  6. Bio-jETI: a service integration, design, and provisioning platform for orchestrated bioinformatics processes.

    Science.gov (United States)

    Margaria, Tiziana; Kubczak, Christian; Steffen, Bernhard

    2008-04-25

    With Bio-jETI, we introduce a service platform for interdisciplinary work on biological application domains and illustrate its use in a concrete application concerning statistical data processing in R and xcms for an LC/MS analysis of FAAH gene knockout. Bio-jETI uses the jABC environment for service-oriented modeling and design as a graphical process modeling tool and the jETI service integration technology for remote tool execution. As a service definition and provisioning platform, Bio-jETI has the potential to become a core technology in interdisciplinary service orchestration and technology transfer. Domain experts, like biologists not trained in computer science, directly define complex service orchestrations as process models and use efficient and complex bioinformatics tools in a simple and intuitive way.

  7. Bioinformatics strategies in life sciences: from data processing and data warehousing to biological knowledge extraction.

    Science.gov (United States)

    Thiele, Herbert; Glandorf, Jörg; Hufnagel, Peter

    2010-05-27

    With the large variety of Proteomics workflows, as well as the large variety of instruments and data-analysis software available, researchers today face major challenges validating and comparing their Proteomics data. Here we present a new generation of the ProteinScape bioinformatics platform, now enabling researchers to manage Proteomics data from data generation and warehousing to a central data repository, with a strong focus on the improved accuracy, reproducibility and comparability demanded by many researchers in the field. It addresses scientists' current needs in proteomics identification, quantification and validation. But producing large protein lists is not the end point in Proteomics, where one ultimately aims to answer specific questions about the biological condition or disease model of the analyzed sample. In this context, a new tool has been developed at the Spanish Centro Nacional de Biotecnologia Proteomics Facility termed PIKE (Protein Information and Knowledge Extractor) that allows researchers to control, filter and access specific information from genomics and proteomic databases, to understand the role and relationships of the proteins identified in the experiments. Additionally, an EU funded project, ProDac, has coordinated systematic data collection in public standards-compliant repositories like PRIDE. This will cover all aspects from generating MS data in the laboratory, assembling the whole annotation information and storing it together with identifications in a standardised format.

  8. Bioinformatics Strategies in Life Sciences: From Data Processing and Data Warehousing to Biological Knowledge Extraction

    Directory of Open Access Journals (Sweden)

    Thiele Herbert

    2010-03-01

    With the large variety of Proteomics workflows, as well as the large variety of instruments and data-analysis software available, researchers today face major challenges validating and comparing their Proteomics data. Here we present a new generation of the ProteinScape™ bioinformatics platform, now enabling researchers to manage Proteomics data from data generation and warehousing to a central data repository, with a strong focus on the improved accuracy, reproducibility and comparability demanded by many researchers in the field. It addresses scientists' current needs in proteomics identification, quantification and validation. But producing large protein lists is not the end point in Proteomics, where one ultimately aims to answer specific questions about the biological condition or disease model of the analyzed sample. In this context, a new tool has been developed at the Spanish Centro Nacional de Biotecnologia Proteomics Facility termed PIKE (Protein Information and Knowledge Extractor) that allows researchers to control, filter and access specific information from genomics and proteomic databases, to understand the role and relationships of the proteins identified in the experiments. Additionally, an EU funded project, ProDac, has coordinated systematic data collection in public standards-compliant repositories like PRIDE. This will cover all aspects from generating MS data in the laboratory, assembling the whole annotation information and storing it together with identifications in a standardised format.

  9. Nispero: a cloud-computing based Scala tool specially suited for bioinformatics data processing

    OpenAIRE

    Evdokim Kovach; Alexey Alekhin; Eduardo Pareja Tobes; Raquel Tobes; Eduardo Pareja; Marina Manrique

    2014-01-01

    Nowadays it is widely accepted that bioinformatics data analysis is a real bottleneck in many research activities related to life sciences. High-throughput technologies like Next Generation Sequencing (NGS) have completely reshaped the biology and bioinformatics landscape. Undoubtedly NGS has allowed important progress in many life-sciences related fields but has also presented interesting challenges in terms of computation capabilities and algorithms. Many kinds of tasks related to NGS...

  10. Bioinformatics and Cancer

    Science.gov (United States)

    Researchers take on challenges and opportunities to mine "Big Data" for answers to complex biological questions. Learn how bioinformatics uses advanced computing, mathematics, and technological platforms to store, manage, analyze, and understand data.

  11. Data mining for bioinformatics applications

    CERN Document Server

    Zengyou, He

    2015-01-01

    Data Mining for Bioinformatics Applications provides valuable information on the data mining methods that have been widely used for solving real bioinformatics problems, including problem definition, data collection, data preprocessing, modeling, and validation. The text uses an example-based method to illustrate how to apply data mining techniques to solve real bioinformatics problems, containing 45 bioinformatics problems that have been investigated in recent research. For each example, the entire data mining process is described, ranging from data preprocessing to modeling and result validation.
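    The data mining process described above (preprocessing, modeling, validation) can be sketched end to end with scikit-learn, as below. The feature matrix and labels are synthetic stand-ins for a real bioinformatics dataset, and the choice of a random-forest classifier is an illustrative assumption rather than the book's prescribed method.

        # Hedged sketch of a generic data mining workflow: preprocessing -> modeling -> validation.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(42)
        X = rng.normal(size=(200, 50))                   # e.g. 200 samples x 50 expression features
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # synthetic binary phenotype

        model = make_pipeline(StandardScaler(),          # preprocessing
                              RandomForestClassifier(n_estimators=100, random_state=0))  # modeling
        scores = cross_val_score(model, X, y, cv=5)      # validation
        print("5-fold accuracy:", scores.mean().round(3))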

  12. Deep learning in bioinformatics.

    Science.gov (United States)

    Min, Seonwoo; Lee, Byunghan; Yoon, Sungroh

    2017-09-01

    In the era of big data, transformation of biomedical big data into valuable knowledge has been one of the most important challenges in bioinformatics. Deep learning has advanced rapidly since the early 2000s and now demonstrates state-of-the-art performance in various fields. Accordingly, application of deep learning in bioinformatics to gain insight from data has been emphasized in both academia and industry. Here, we review deep learning in bioinformatics, presenting examples of current research. To provide a useful and comprehensive perspective, we categorize research both by the bioinformatics domain (i.e. omics, biomedical imaging, biomedical signal processing) and deep learning architecture (i.e. deep neural networks, convolutional neural networks, recurrent neural networks, emergent architectures) and present brief descriptions of each study. Additionally, we discuss theoretical and practical issues of deep learning in bioinformatics and suggest future research directions. We believe that this review will provide valuable insights and serve as a starting point for researchers to apply deep learning approaches in their bioinformatics studies. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
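    A common first step when applying the deep learning architectures mentioned above (e.g. convolutional networks) to omics data is to encode biological sequences numerically. The sketch below shows a minimal one-hot encoding of DNA; the downstream network itself is left to whichever framework is used, and the encoding scheme shown is one common convention, not the only one.

        # Minimal sketch: one-hot encode a DNA sequence as input for a neural network.
        import numpy as np

        ALPHABET = "ACGT"

        def one_hot(seq: str) -> np.ndarray:
            """Return a (len(seq), 4) matrix; unknown bases (e.g. N) become all-zero rows."""
            index = {base: i for i, base in enumerate(ALPHABET)}
            encoded = np.zeros((len(seq), len(ALPHABET)), dtype=np.float32)
            for pos, base in enumerate(seq.upper()):
                if base in index:
                    encoded[pos, index[base]] = 1.0
            return encoded

        print(one_hot("ACGTN"))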

  13. Process management practice

    International Nuclear Information System (INIS)

    Pyeon, In Beom

    1983-04-01

    This book describes the qualifying subjects and test scope, including production planning and control, economic feasibility, process management, quality management and operations research; industrial economics topics such as materials and marketing management; production management topics such as the meaning and goals of process management and production planning and control; basic economic concepts such as interest, equivalence, and depreciation; and OR concepts such as network analysis, PERT/CPM, and simulation.

  14. Interdisciplinary Introductory Course in Bioinformatics

    Science.gov (United States)

    Kortsarts, Yana; Morris, Robert W.; Utell, Janine M.

    2010-01-01

    Bioinformatics is a relatively new interdisciplinary field that integrates computer science, mathematics, biology, and information technology to manage, analyze, and understand biological, biochemical and biophysical information. We present our experience in teaching an interdisciplinary course, Introduction to Bioinformatics, which was developed…

  15. Project management process.

    Science.gov (United States)

    2007-03-01

    This course provides INDOT staff with foundational knowledge and skills in project management principles and methodologies. INDOT's project management processes provide the tools for interdisciplinary teams to efficiently and effectively deliver pr...

  16. Virtual Bioinformatics Distance Learning Suite

    Science.gov (United States)

    Tolvanen, Martti; Vihinen, Mauno

    2004-01-01

    Distance learning as a computer-aided concept allows students to take courses from anywhere at any time. In bioinformatics, computers are needed to collect, store, process, and analyze massive amounts of biological and biomedical data. We have applied the concept of distance learning in virtual bioinformatics to provide university course material…

  17. Management oriented process

    International Nuclear Information System (INIS)

    2004-01-01

    ANAV decided to implement process-oriented management by adopting the U.S. NEI (Nuclear Energy Institute) model. The article describes the initial phases of the project, its current status and future prospects. The project has been considered as an improvement in the areas of organization and human factors. Recently, IAEA draft standards have been including processes as an accepted management model. (Author)

  18. EURASIP journal on bioinformatics & systems biology

    National Research Council Canada - National Science Library

    2006-01-01

    "The overall aim of "EURASIP Journal on Bioinformatics and Systems Biology" is to publish research results related to signal processing and bioinformatics theories and techniques relevant to a wide...

  19. Business process quality management

    NARCIS (Netherlands)

    Reijers, H.A.; Mendling, J.; Recker, J.; Brocke, vom J.; Rosemann, M.

    2010-01-01

    Process modeling is a central element in any approach to Business Process Management (BPM). However, what hinders both practitioners and academics is the lack of support for assessing the quality of process models — let alone realizing high quality process models. Existing frameworks are…

  20. Bioinformatics for Exploration

    Science.gov (United States)

    Johnson, Kathy A.

    2006-01-01

    For the purpose of this paper, bioinformatics is defined as the application of computer technology to the management of biological information. It can be thought of as the science of developing computer databases and algorithms to facilitate and expedite biological research. This is a crosscutting capability that supports nearly all human health areas ranging from computational modeling, to pharmacodynamics research projects, to decision support systems within autonomous medical care. Bioinformatics serves to increase the efficiency and effectiveness of the life sciences research program. It provides data, information, and knowledge capture which further supports management of the bioastronautics research roadmap - identifying gaps that still remain and enabling the determination of which risks have been addressed.

  1. Introduction to bioinformatics.

    Science.gov (United States)

    Can, Tolga

    2014-01-01

    Bioinformatics is an interdisciplinary field mainly involving molecular biology and genetics, computer science, mathematics, and statistics. Data-intensive, large-scale biological problems are addressed from a computational point of view. The most common problems are modeling biological processes at the molecular level and making inferences from collected data. A bioinformatics solution usually involves the following steps: Collect statistics from biological data. Build a computational model. Solve a computational modeling problem. Test and evaluate a computational algorithm. This chapter gives a brief introduction to bioinformatics by first providing an introduction to biological terminology and then discussing some classical bioinformatics problems organized by the types of data sources. Sequence analysis is the analysis of DNA and protein sequences for clues regarding function and includes subproblems such as identification of homologs, multiple sequence alignment, searching sequence patterns, and evolutionary analyses. Protein structures are three-dimensional data and the associated problems are structure prediction (secondary and tertiary), analysis of protein structures for clues regarding function, and structural alignment. Gene expression data is usually represented as matrices and analysis of microarray data mostly involves statistical analysis, classification, and clustering approaches. Biological networks such as gene regulatory networks, metabolic pathways, and protein-protein interaction networks are usually modeled as graphs and graph theoretic approaches are used to solve associated problems such as construction and analysis of large-scale networks.
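    As a tiny, concrete instance of the sequence analysis described above, the sketch below computes the percent identity of two pre-aligned sequences, a basic building block behind homolog identification and alignment scoring. The toy sequences and the decision not to count gap positions as matches are illustrative assumptions.

        # Sketch: percent identity of two pre-aligned sequences, a basic sequence-analysis step.
        def percent_identity(a: str, b: str) -> float:
            if len(a) != len(b):
                raise ValueError("sequences must be aligned to equal length")
            matches = sum(1 for x, y in zip(a, b) if x == y and x != "-")
            return 100.0 * matches / len(a)

        print(percent_identity("ACGT-ACGT", "ACGTTACGA"))  # toy aligned sequences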

  2. Process Management Plans

    Directory of Open Access Journals (Sweden)

    Tomasz Miksa

    2014-07-01

    In the era of research infrastructures and big data, sophisticated data management practices are becoming essential building blocks of successful science. Most practices follow a data-centric approach, which does not take into account the processes that created, analysed and presented the data. This fact limits the possibilities for reliable verification of results. Furthermore, it does not guarantee the reuse of research, which is one of the key aspects of credible data-driven science. For that reason, we propose the introduction of the new concept of Process Management Plans, which focus on the identification, description, sharing and preservation of entire scientific processes. They enable verification and later reuse of result data and processes of scientific experiments. In this paper we describe the structure and explain the novelty of Process Management Plans by showing in what way they complement existing Data Management Plans. We also highlight key differences, major advantages, as well as references to tools and solutions that can facilitate the introduction of Process Management Plans.

  3. PROJECT SCOPE MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    Yana Derenskaya

    2018-01-01

    The purpose of the article is to define the essence of the project scope management process and its components, and to develop an algorithm for project scope management in pharmaceutical production. Methodology. To carry out the study, available information sources on project management standards in general, and on elements of project scope management in particular, were analysed. Methods of system and structural analysis and logical generalization are used to study the set of subprocesses of project scope management and their input and output documents. Methods of network planning are used to construct a precedence diagram of the project scope management process. The results show that the components of project scope management are managing the scope of the project product and managing the content of project work; it is the second component that is investigated in the presented work. Accordingly, the project scope management process is defined as substantiating and realizing the amount of work that ensures the successful implementation of the project (achievement of its goal and of the objectives of individual project participants). It is also determined that the process of managing the project scope covers planning, definition of the project scope, creation of the structure of project work, confirmation of the scope, and management of the project scope. Participants in these subprocesses are the customer, the investor, and other project participants: external organizations (contractors of the project), the project review committee, the project manager, and the project team. The key element of planning the project scope is the formation of the structure of project work, the justification of the number of works, and the sequence of their implementation. It is recommended to use the following sequence of stages for creating the structure of project work…

  4. Applications of Structural Biology and Bioinformatics in the Investigation of Oxidative Stress-Related Processes

    NARCIS (Netherlands)

    Bersch, Beate; Groves, Matthew; Johann, Klare; Torda, Andrew; Ortiz, Dario; Laher, I.

    2014-01-01

    Reactive oxygen species (ROS)-mediated dysfunction of certain biological processes is implicated in different diseases in humans, including cardiovascular, cancer, or neurodegenerative disorders. Not only human cells and tissues are affected by ROS but also all other biological systems, including

  5. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines … the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides … essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  6. Trends in IT Innovation to Build a Next Generation Bioinformatics Solution to Manage and Analyse Biological Big Data Produced by NGS Technologies

    Directory of Open Access Journals (Sweden)

    Alexandre G. de Brevern

    2015-01-01

    Sequencing the human genome began in 1994, and 10 years of work were necessary in order to provide a nearly complete sequence. Nowadays, NGS technologies allow sequencing of a whole human genome in a few days. This deluge of data challenges scientists in many ways, as they are faced with data management issues and analysis and visualization drawbacks due to the limitations of current bioinformatics tools. In this paper, we describe how the NGS Big Data revolution changes the way of managing and analysing data. We present how biologists are confronted with an abundance of methods, tools, and data formats. To overcome these problems, we focus on Big Data information technology innovations from the web and business intelligence. We highlight the advantages of NoSQL databases, which are much more efficient than relational databases. Since Big Data leads to a loss of interactivity with data during analysis due to high processing time, we describe solutions from business intelligence that allow one to regain interactivity whatever the volume of data. We illustrate this point with a focus on the Amadea platform. Finally, we discuss visualization challenges posed by Big Data and present the latest innovations with JavaScript graphic libraries.
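    As a concrete illustration of the document-oriented NoSQL approach advocated above, the sketch below stores a few variant calls in MongoDB via pymongo and queries a genomic interval. The connection string, database name, and document schema are hypothetical placeholders, not a scheme taken from the paper.

        # Hedged sketch: storing NGS variant calls in a document store (MongoDB via pymongo).
        # Connection string, database name and field names are illustrative assumptions.
        from pymongo import MongoClient, ASCENDING

        client = MongoClient("mongodb://localhost:27017")
        variants = client["ngs_demo"]["variants"]

        variants.create_index([("chrom", ASCENDING), ("pos", ASCENDING)])
        variants.insert_many([
            {"chrom": "chr7", "pos": 55249071, "ref": "C", "alt": "T", "sample": "S1"},
            {"chrom": "chr7", "pos": 140453136, "ref": "A", "alt": "T", "sample": "S2"},
        ])

        # Interval query: all variants on chr7 between two coordinates.
        for doc in variants.find({"chrom": "chr7", "pos": {"$gte": 55_000_000, "$lte": 56_000_000}}):
            print(doc["pos"], doc["ref"], ">", doc["alt"])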

  7. Trends in IT Innovation to Build a Next Generation Bioinformatics Solution to Manage and Analyse Biological Big Data Produced by NGS Technologies

    Science.gov (United States)

    de Brevern, Alexandre G.; Meyniel, Jean-Philippe; Fairhead, Cécile; Neuvéglise, Cécile; Malpertuy, Alain

    2015-01-01

    Sequencing the human genome began in 1994, and 10 years of work were necessary in order to provide a nearly complete sequence. Nowadays, NGS technologies allow sequencing of a whole human genome in a few days. This deluge of data challenges scientists in many ways, as they are faced with data management issues and analysis and visualization drawbacks due to the limitations of current bioinformatics tools. In this paper, we describe how the NGS Big Data revolution changes the way of managing and analysing data. We present how biologists are confronted with an abundance of methods, tools, and data formats. To overcome these problems, we focus on Big Data information technology innovations from the web and business intelligence. We highlight the advantages of NoSQL databases, which are much more efficient than relational databases. Since Big Data leads to a loss of interactivity with data during analysis due to high processing time, we describe solutions from business intelligence that allow one to regain interactivity whatever the volume of data. We illustrate this point with a focus on the Amadea platform. Finally, we discuss visualization challenges posed by Big Data and present the latest innovations with JavaScript graphic libraries. PMID:26125026

  8. Trends in IT Innovation to Build a Next Generation Bioinformatics Solution to Manage and Analyse Biological Big Data Produced by NGS Technologies.

    Science.gov (United States)

    de Brevern, Alexandre G; Meyniel, Jean-Philippe; Fairhead, Cécile; Neuvéglise, Cécile; Malpertuy, Alain

    2015-01-01

    Sequencing the human genome began in 1994, and 10 years of work were necessary in order to provide a nearly complete sequence. Nowadays, NGS technologies allow sequencing of a whole human genome in a few days. This deluge of data challenges scientists in many ways, as they are faced with data management issues and analysis and visualization drawbacks due to the limitations of current bioinformatics tools. In this paper, we describe how the NGS Big Data revolution changes the way of managing and analysing data. We present how biologists are confronted with an abundance of methods, tools, and data formats. To overcome these problems, we focus on Big Data information technology innovations from the web and business intelligence. We highlight the advantages of NoSQL databases, which are much more efficient than relational databases. Since Big Data leads to a loss of interactivity with data during analysis due to high processing time, we describe solutions from business intelligence that allow one to regain interactivity whatever the volume of data. We illustrate this point with a focus on the Amadea platform. Finally, we discuss visualization challenges posed by Big Data and present the latest innovations with JavaScript graphic libraries.

  9. TECHNOLOGY MANAGEMENT PROCESS FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Ikura Yamamoto

    2012-02-01

    The effective management of technology as a source of competitive advantage is of vital importance for many organizations. It is necessary to understand, communicate and integrate technology strategy with marketing, financial, operations and human resource strategies. This is of particular importance when one considers the increasing cost, pace and complexity of technology developments, combined with shortening product life cycles. A five-process model provides a framework within which technology management activities can be understood: identification, selection, acquisition, exploitation and protection. Based on this model, a technology management assessment procedure has been developed, using an 'action research' approach. This paper presents an industrial case study describing the first full application of the procedure within a high-volume manufacturing business. The impact of applying the procedure is assessed in terms of benefits to the participating business, together with improvements to the assessment procedure itself, in the context of the action research framework. Keywords: technology, strategy, management, assessment.

  10. Networked business process management

    NARCIS (Netherlands)

    Grefen, P.W.P.J.

    2013-01-01

    In the current economy, a shift can be seen from stand-alone business organizations to networks of tightly collaborating business organizations. To allow this tight collaboration, business process management in these collaborative networks is becoming increasingly important. This paper discusses

  11. Fundamentals of business process management

    NARCIS (Netherlands)

    Dumas, Marlon; La Rosa, Marcello; Mendling, Jan; Reijers, Hajo A.

    2018-01-01

    This textbook covers the entire Business Process Management (BPM) lifecycle, from process identification to process monitoring, covering along the way process modelling, analysis, redesign and automation. Concepts, methods and tools from business management, computer science and industrial

  12. Business process management: a survey

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Hofstede, ter A.H.M.; Weske, M.H.; Aalst, van der W.M.P.; Hofstede, ter A.H.M.; Weske, M.H.

    2003-01-01

    Business Process Management (BPM) includes methods, techniques, and tools to support the design, enactment, management, and analysis of operational business processes. It can be considered as an extension of classical Workflow Management (WFM) systems and approaches. Although the practical relevance

  13. Processes for managing pathogens.

    Science.gov (United States)

    Godfree, Alan; Farrell, Joseph

    2005-01-01

    Wastewater contains human, animal, and plant pathogens capable of causing viral, bacterial, or parasitic infections. There are several routes whereby sewage pathogens may affect human health, including direct contact, contamination of food crops, zoonoses, and vectors. The range and numbers of pathogens in municipal wastewater vary with the level of endemic disease in the community, discharges from commercial activities, and seasonal factors. Regulations to control pathogen risk in the United States and Europe arising from land application of biosolids are based on the concept of multiple barriers to the prevention of transmission. The barriers are (i) treatment to reduce pathogen content and vector attraction, (ii) restrictions on crops grown on land to which biosolids have been applied, and (iii) minimum intervals following application and grazing or harvesting. Wastewater treatment reduces the number of pathogens in the wastewater by concentrating them with the solids in the sludge. Although some treatment processes are designed specifically to inactivate pathogens, many are not, and the actual mechanisms of microbial inactivation are not fully understood for all processes. Vector attraction is reduced by stabilization (reduction of readily biodegradable material) and/or incorporation immediately following application. Concerns about health risks have renewed interest in the effects of treatment (on pathogens) and advanced treatment methods, and work performed in the United States suggests that Class A pathogen reduction can be achieved less expensively than previously thought. Effective pathogen risk management requires control of the complete chain of sludge treatment, biosolids handling and application, and post-application activities. This may be achieved by adherence to quality management systems based on hazard analysis critical control point (HACCP) principles.

  14. Taking Bioinformatics to Systems Medicine.

    Science.gov (United States)

    van Kampen, Antoine H C; Moerland, Perry D

    2016-01-01

    Systems medicine promotes a range of approaches and strategies to study human health and disease at a systems level with the aim of improving the overall well-being of (healthy) individuals, and preventing, diagnosing, or curing disease. In this chapter we discuss how bioinformatics critically contributes to systems medicine. First, we explain the role of bioinformatics in the management and analysis of data. In particular we show the importance of publicly available biological and clinical repositories to support systems medicine studies. Second, we discuss how the integration and analysis of multiple types of omics data through integrative bioinformatics may facilitate the determination of more predictive and robust disease signatures, lead to a better understanding of (patho)physiological molecular mechanisms, and facilitate personalized medicine. Third, we focus on network analysis and discuss how gene networks can be constructed from omics data and how these networks can be decomposed into smaller modules. We discuss how the resulting modules can be used to generate experimentally testable hypotheses, provide insight into disease mechanisms, and lead to predictive models. Throughout, we provide several examples demonstrating how bioinformatics contributes to systems medicine and discuss future challenges in bioinformatics that need to be addressed to enable the advancement of systems medicine.
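    The network analysis step described above (building a gene network from omics data and decomposing it into modules) can be sketched with NetworkX, as below. The toy expression matrix, the correlation threshold, and the use of greedy modularity optimisation for module detection are all illustrative assumptions rather than the authors' specific pipeline.

        # Hedged sketch: co-expression network from an expression matrix, decomposed into modules.
        import numpy as np
        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        rng = np.random.default_rng(1)
        expression = rng.normal(size=(30, 12))        # 30 genes x 12 samples (toy data)
        corr = np.corrcoef(expression)                # gene-gene correlation matrix

        G = nx.Graph()
        genes = [f"gene{i}" for i in range(expression.shape[0])]
        G.add_nodes_from(genes)
        threshold = 0.5                               # illustrative cut-off on |correlation|
        for i in range(len(genes)):
            for j in range(i + 1, len(genes)):
                if abs(corr[i, j]) >= threshold:
                    G.add_edge(genes[i], genes[j], weight=abs(corr[i, j]))

        modules = greedy_modularity_communities(G)    # candidate modules for follow-up analysis
        print([sorted(m) for m in modules])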

  15. Bioinformatics resource manager v2.3: an integrated software environment for systems biology with microRNA and cross-species analysis tools

    Directory of Open Access Journals (Sweden)

    Tilton Susan C

    2012-11-01

    Background MicroRNAs (miRNAs) are noncoding RNAs that direct post-transcriptional regulation of protein coding genes. Recent studies have shown miRNAs are important for controlling many biological processes, including nervous system development, and are highly conserved across species. Given their importance, computational tools are necessary for analysis, interpretation and integration of high-throughput (HTP) miRNA data in an increasing number of model species. The Bioinformatics Resource Manager (BRM) v2.3 is a software environment for data management, mining, integration and functional annotation of HTP biological data. In this study, we report recent updates to BRM for miRNA data analysis and cross-species comparisons across datasets. Results BRM v2.3 has the capability to query predicted miRNA targets from multiple databases, retrieve potential regulatory miRNAs for known genes, integrate experimentally derived miRNA and mRNA datasets, perform ortholog mapping across species, and retrieve annotation and cross-reference identifiers for an expanded number of species. Here we use BRM to show that developmental exposure of zebrafish to 30 µM nicotine from 6–48 hours post fertilization (hpf) results in behavioral hyperactivity in larval zebrafish and alteration of putative miRNA gene targets in whole embryos at developmental stages that encompass early neurogenesis. We show typical workflows for using BRM to integrate experimental zebrafish miRNA and mRNA microarray datasets with example retrievals for zebrafish, including pathway annotation and mapping to human orthologs. Functional analysis of differentially regulated (p < 0.05) gene targets in BRM indicates that nicotine exposure disrupts genes involved in neurogenesis, possibly through misregulation of nicotine-sensitive miRNAs. Conclusions BRM provides the ability to mine complex data for identification of candidate miRNAs or pathways that drive phenotypic outcome and, therefore, is a useful hypothesis generation tool for systems biology. The miRNA workflow in BRM allows for efficient processing of multiple miRNA and mRNA datasets in a single…

  16. Bioinformatics resource manager v2.3: an integrated software environment for systems biology with microRNA and cross-species analysis tools

    Science.gov (United States)

    2012-01-01

    Background MicroRNAs (miRNAs) are noncoding RNAs that direct post-transcriptional regulation of protein coding genes. Recent studies have shown miRNAs are important for controlling many biological processes, including nervous system development, and are highly conserved across species. Given their importance, computational tools are necessary for analysis, interpretation and integration of high-throughput (HTP) miRNA data in an increasing number of model species. The Bioinformatics Resource Manager (BRM) v2.3 is a software environment for data management, mining, integration and functional annotation of HTP biological data. In this study, we report recent updates to BRM for miRNA data analysis and cross-species comparisons across datasets. Results BRM v2.3 has the capability to query predicted miRNA targets from multiple databases, retrieve potential regulatory miRNAs for known genes, integrate experimentally derived miRNA and mRNA datasets, perform ortholog mapping across species, and retrieve annotation and cross-reference identifiers for an expanded number of species. Here we use BRM to show that developmental exposure of zebrafish to 30 µM nicotine from 6–48 hours post fertilization (hpf) results in behavioral hyperactivity in larval zebrafish and alteration of putative miRNA gene targets in whole embryos at developmental stages that encompass early neurogenesis. We show typical workflows for using BRM to integrate experimental zebrafish miRNA and mRNA microarray datasets with example retrievals for zebrafish, including pathway annotation and mapping to human orthologs. Functional analysis of differentially regulated (p < 0.05) gene targets in BRM indicates that nicotine exposure disrupts genes involved in neurogenesis, possibly through misregulation of nicotine-sensitive miRNAs. Conclusions BRM provides the ability to mine complex data for identification of candidate miRNAs or pathways that drive phenotypic outcome and, therefore, is a useful hypothesis generation tool for systems biology.

  17. Management of processes of electrochemical dimensional processing

    Science.gov (United States)

    Akhmetov, I. D.; Zakirova, A. R.; Sadykov, Z. B.

    2017-09-01

    In many industries, high-precision parts are produced from scarce, hard-to-machine materials. Such parts can often only be formed by non-contact processing, or with a minimum of mechanical force, which is achievable, for example, by electrochemical machining. At the present stage of development of metalworking, the management of electrochemical machining processes and their automation are important issues. This article provides some indicators and factors of the electrochemical machining process.

  18. Bioinformatics in translational drug discovery.

    Science.gov (United States)

    Wooller, Sarah K; Benstead-Hume, Graeme; Chen, Xiangrong; Ali, Yusuf; Pearl, Frances M G

    2017-08-31

    Bioinformatics approaches are becoming ever more essential in translational drug discovery both in academia and within the pharmaceutical industry. Computational exploitation of the increasing volumes of data generated during all phases of drug discovery is enabling key challenges of the process to be addressed. Here, we highlight some of the areas in which bioinformatics resources and methods are being developed to support the drug discovery pipeline. These include the creation of large data warehouses, bioinformatics algorithms to analyse 'big data' that identify novel drug targets and/or biomarkers, programs to assess the tractability of targets, and prediction of repositioning opportunities that use licensed drugs to treat additional indications. © 2017 The Author(s).

  19. Online Bioinformatics Tutorials | Office of Cancer Genomics

    Science.gov (United States)

    Bioinformatics is a scientific discipline that applies computer science and information technology to help understand biological processes. The NIH provides a list of free online bioinformatics tutorials, either generated by the NIH Library or other institutes, which includes introductory lectures and "how to" videos on using various tools.

  20. Air Quality Management Process Cycle

    Science.gov (United States)

    Air quality management comprises the activities a regulatory authority undertakes to protect human health and the environment from the harmful effects of air pollution. The process of managing air quality can be illustrated as a cycle of inter-related elements.

  1. Waste Management Process Improvement Project

    International Nuclear Information System (INIS)

    Atwood, J.; Borden, G.; Rangel, G. R.

    2002-01-01

    The Bechtel Hanford-led Environmental Restoration Contractor team's Waste Management Process Improvement Project is working diligently with the U.S. Department of Energy's (DOE) Richland Operations Office to improve the waste management process to meet DOE's need for an efficient, cost-effective program for the management of dangerous, low-level and mixed low-level waste. Additionally, the program must meet all applicable regulatory requirements. The need for improvement was highlighted when a change in the Groundwater/Vadose Zone Integration Project's waste management practices resulted in a larger amount of waste being generated than the waste management organization had been set up to handle.

  2. Comprehensive Environmental Management Process

    International Nuclear Information System (INIS)

    Hjeresen, D.L.; Roybal, S.L.

    1994-01-01

    This report contains information about Los Alamos National Laboratory's Comprehensive Environmental Management Plan. The topics covered include: waste minimization, waste generation, environmental concerns, public relations of the laboratory, and how this plan will help answer the demands on the laboratory as its mission changes.

  3. Biggest challenges in bioinformatics.

    Science.gov (United States)

    Fuller, Jonathan C; Khoueiry, Pierre; Dinkel, Holger; Forslund, Kristoffer; Stamatakis, Alexandros; Barry, Joseph; Budd, Aidan; Soldatos, Theodoros G; Linssen, Katja; Rajput, Abdul Mateen

    2013-04-01

    The third Heidelberg Unseminars in Bioinformatics (HUB) was held on 18th October 2012, at Heidelberg University, Germany. HUB brought together around 40 bioinformaticians from academia and industry to discuss the 'Biggest Challenges in Bioinformatics' in a 'World Café' style event.

  4. Biggest challenges in bioinformatics

    OpenAIRE

    Fuller, Jonathan C; Khoueiry, Pierre; Dinkel, Holger; Forslund, Kristoffer; Stamatakis, Alexandros; Barry, Joseph; Budd, Aidan; Soldatos, Theodoros G; Linssen, Katja; Rajput, Abdul Mateen

    2013-01-01

    The third Heidelberg Unseminars in Bioinformatics (HUB) was held in October at Heidelberg University in Germany. HUB brought together around 40 bioinformaticians from academia and industry to discuss the ‘Biggest Challenges in Bioinformatics' in a ‘World Café' style event.

  5. A bioinformatics potpourri.

    Science.gov (United States)

    Schönbach, Christian; Li, Jinyan; Ma, Lan; Horton, Paul; Sjaugi, Muhammad Farhan; Ranganathan, Shoba

    2018-01-19

    The 16th International Conference on Bioinformatics (InCoB) was held at Tsinghua University, Shenzhen from September 20 to 22, 2017. The annual conference of the Asia-Pacific Bioinformatics Network featured six keynotes, two invited talks, a panel discussion on big data driven bioinformatics and precision medicine, and 66 oral presentations of accepted research articles or posters. Fifty-seven articles comprising a topic assortment of algorithms, biomolecular networks, cancer and disease informatics, drug-target interactions and drug efficacy, gene regulation and expression, imaging, immunoinformatics, metagenomics, next generation sequencing for genomics and transcriptomics, ontologies, post-translational modification, and structural bioinformatics are the subject of this editorial for the InCoB2017 supplement issues in BMC Genomics, BMC Bioinformatics, BMC Systems Biology and BMC Medical Genomics. New Delhi will be the location of InCoB2018, scheduled for September 26-28, 2018.

  6. Bioinformatics and moonlighting proteins

    Directory of Open Access Journals (Sweden)

    Sergio eHernández

    2015-06-01

    Multitasking or moonlighting is the capability of some proteins to execute two or more biochemical functions. Usually, moonlighting proteins are revealed experimentally by serendipity. For this reason, it would be helpful if bioinformatics could predict this multifunctionality, especially given the large amounts of sequences from genome projects. In the present work, we analyse and describe several approaches that use sequences, structures, interactomics and current bioinformatics algorithms and programs to try to overcome this problem. Among these approaches are: (a) remote homology searches using Psi-Blast, (b) detection of functional motifs and domains, (c) analysis of data from protein-protein interaction databases (PPIs), (d) matching the query protein sequence to 3D databases (i.e., algorithms such as PISITE), (e) mutation correlation analysis between amino acids by algorithms such as MISTIC. Programs designed to identify functional motifs/domains detect mainly the canonical function but usually fail in the detection of the moonlighting one, Pfam and ProDom being the best methods. Remote homology search by Psi-Blast combined with data from interactomics databases (PPIs) has the best performance. Structural information and mutation correlation analysis can help us to map the functional sites. Mutation correlation analysis can only be used in very specific situations (it requires the existence of multialigned family protein sequences) but can suggest how the evolutionary process of second-function acquisition took place. The multitasking protein database MultitaskProtDB (http://wallace.uab.es/multitask/), previously published by our group, has been used as a benchmark for all of the analyses.

  7. Application of machine learning methods in bioinformatics

    Science.gov (United States)

    Yang, Haoyu; An, Zheng; Zhou, Haotian; Hou, Yawen

    2018-05-01

    Faced with the development of bioinformatics, high-throughput genomic technologies have enabled biology to enter the era of big data [1]. Bioinformatics is an interdisciplinary field covering the acquisition, management, analysis, interpretation and application of biological information. It derives from the Human Genome Project. The field of machine learning, which aims to develop computer algorithms that improve with experience, holds promise to enable computers to assist humans in the analysis of large, complex data sets [2]. This paper analyzes and compares various machine learning algorithms and their applications in bioinformatics.
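    As one small, concrete example of the machine learning methods surveyed above, the sketch below clusters synthetic gene expression profiles with k-means. The data, the number of clusters, and the choice of k-means itself are illustrative assumptions, not methods prescribed by the paper.

        # Hedged sketch: unsupervised clustering of expression profiles with k-means.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(7)
        # Three synthetic groups of samples with shifted expression means.
        profiles = np.vstack([
            rng.normal(loc=0.0, size=(20, 100)),
            rng.normal(loc=2.0, size=(20, 100)),
            rng.normal(loc=-2.0, size=(20, 100)),
        ])

        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(profiles)
        print(labels)   # cluster assignment for each of the 60 samples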

  8. Issues Management Process Course # 38401

    Energy Technology Data Exchange (ETDEWEB)

    Binion, Ula Marie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-01

    The purpose of this training is to advise Issues Management Coordinators (IMCs) on the revised Contractor Assurance System (CAS) Issues Management (IM) process. Terminal Objectives: Understand the Laboratory's IM process; Understand your role in the Laboratory's IM process. Learning Objectives: Describe the IM process within the context of the CAS; Describe the importance of implementing an institutional IM process at LANL; Describe the process flow for the Laboratory's IM process; Apply the definition of an issue; Use available resources to determine initial screening risk levels for issues; Describe the required major process steps for each risk level; Describe the personnel responsibilities for IM process implementation; Access available resources to support IM process implementation.

  9. A Bioinformatics Facility for NASA

    Science.gov (United States)

    Schweighofer, Karl; Pohorille, Andrew

    2006-01-01

    Building on an existing prototype, we have fielded a facility with bioinformatics technologies that will help NASA meet its unique requirements for biological research. This facility consists of a cluster of computers capable of performing computationally intensive tasks, software tools, databases and knowledge management systems. Novel computational technologies for analyzing and integrating new biological data and already existing knowledge have been developed. With continued development and support, the facility will fulfill NASA's strategic bioinformatics needs in astrobiology and space exploration. As a demonstration of these capabilities, we will present a detailed analysis of how spaceflight factors impact gene expression in the liver and kidney for mice flown aboard shuttle flight STS-108. We have found that many genes involved in signal transduction, cell cycle, and development respond to changes in microgravity, but that most metabolic pathways appear unchanged.

  10. Congestion Management System Process Report

    Science.gov (United States)

    1996-03-01

    In January 1995, the Indianapolis Metropolitan Planning Organization with the help of an interagency Study Review Committee began the process of developing a Congestion Management System (CMS) Plan resulting in this report. This report documents the ...

  11. When cloud computing meets bioinformatics: a review.

    Science.gov (United States)

    Zhou, Shuigeng; Liao, Ruiqi; Guan, Jihong

    2013-10-01

    In the past decades, with the rapid development of high-throughput technologies, biology research has generated an unprecedented amount of data. In order to store and process such a great amount of data, cloud computing and MapReduce were applied to many fields of bioinformatics. In this paper, we first introduce the basic concepts of cloud computing and MapReduce, and their applications in bioinformatics. We then highlight some problems challenging the applications of cloud computing and MapReduce to bioinformatics. Finally, we give a brief guideline for using cloud computing in biology research.
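    To make the MapReduce programming model mentioned above concrete, the sketch below counts k-mers with explicit map and reduce phases in plain Python. The reads and k-mer length are toy values; on an actual cluster (Hadoop, Spark, or a cloud service) the same two functions would run distributed over partitions of the reads.

        # Minimal sketch of the MapReduce programming model applied to k-mer counting.
        from collections import defaultdict
        from itertools import chain

        reads = ["ACGTACGT", "CGTACGTA", "TTACGTAC"]   # toy sequencing reads
        K = 4

        def map_phase(read):
            """Emit (k-mer, 1) pairs for one read."""
            return [(read[i:i + K], 1) for i in range(len(read) - K + 1)]

        def reduce_phase(pairs):
            """Sum the counts for each k-mer key."""
            counts = defaultdict(int)
            for kmer, n in pairs:
                counts[kmer] += n
            return dict(counts)

        kmer_counts = reduce_phase(chain.from_iterable(map_phase(r) for r in reads))
        print(sorted(kmer_counts.items(), key=lambda kv: -kv[1])[:5])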

  12. Computational biology and bioinformatics in Nigeria.

    Science.gov (United States)

    Fatumo, Segun A; Adoga, Moses P; Ojo, Opeolu O; Oluwagbemi, Olugbenga; Adeoye, Tolulope; Ewejobi, Itunuoluwa; Adebiyi, Marion; Adebiyi, Ezekiel; Bewaji, Clement; Nashiru, Oyekanmi

    2014-04-01

    Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. As a developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups comprised of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries.

  13. Computational biology and bioinformatics in Nigeria.

    Directory of Open Access Journals (Sweden)

    Segun A Fatumo

    2014-04-01

    Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. As a developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups comprised of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries.

  14. Management of Organizational Change Processes

    Directory of Open Access Journals (Sweden)

    Vladimir-Codrin Ionescu

    2015-12-01

    Contemporary organizations need to understand the meaning of change and to tackle it as a source for improving processes and activities, aiming at increasing performance and competitiveness. From this perspective, the paper presents approaches to organizational change and highlights the fundamental objectives which organizations set for themselves when designing and implementing organizational change programs. The conceptual framework of change management is defined and the stages of the change management process are presented. In the final part of the paper, the problem of resistance to change is highlighted by explaining the content of the stages that employees go through in the process of adapting to change within organizations.

  15. Electronic Handbooks Simplify Process Management

    Science.gov (United States)

    2012-01-01

    Getting a multitude of people to work together to manage processes across many organizations (for example, flight projects, research, technologies, or data centers) is not an easy task. Just ask Dr. Barry E. Jacobs, a research computer scientist at Goddard Space Flight Center. He helped NASA develop a process management solution that provided documenting tools for process developers and participants to help them quickly learn, adapt, test, and teach their views. Some of these tools included editable files for subprocess descriptions, document descriptions, role guidelines, manager worksheets, and references. First utilized for NASA's Headquarters Directives Management process, the approach led to the invention of a concept called the Electronic Handbook (EHB). This EHB concept was successfully applied to NASA's Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) programs, among other NASA programs. Several Federal agencies showed interest in the concept, so Jacobs and his team visited these agencies to show them how their specific processes could be managed by the methodology, as well as to create mockup versions of the EHBs.

  16. BioShaDock: a community driven bioinformatics shared Docker-based tools registry.

    Science.gov (United States)

    Moreews, François; Sallou, Olivier; Ménager, Hervé; Le Bras, Yvan; Monjeaud, Cyril; Blanchet, Christophe; Collin, Olivier

    2015-01-01

    Linux container technologies, as represented by Docker, provide an alternative to the complex and time-consuming installation processes needed for scientific software. The ease of deployment and the process isolation they enable, as well as the reproducibility they permit across environments and versions, are among the qualities that make them interesting candidates for the construction of bioinformatic infrastructures, at any scale from single workstations to high-throughput computing architectures. The Docker Hub is a public registry which can be used to distribute bioinformatic software as Docker images. However, its lack of curation and its genericity make it difficult for a bioinformatics user to find the most appropriate images. BioShaDock is a bioinformatics-focused Docker registry, which provides a local and fully controlled environment to build and publish bioinformatic software as portable Docker images. It provides a number of improvements over the base Docker registry in authentication and permissions management, which enable its integration into existing bioinformatic infrastructures such as computing platforms. The metadata associated with the registered images are domain-centric, including for instance concepts defined in the EDAM ontology, a shared and structured vocabulary of commonly used terms in bioinformatics. The registry also includes user-defined tags to facilitate its discovery, as well as a link to the tool description in the ELIXIR registry if it already exists. If it does not, BioShaDock will synchronize with the ELIXIR registry to create a new description there, based on the BioShaDock entry metadata. This link will help users get more information on the tool, such as its EDAM operations and its input and output types. This allows integration with the ELIXIR Tools and Data Services Registry, thus providing the appropriate visibility of such images to the bioinformatics community.
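
    As an illustration of the deployment pattern such a registry supports, the following minimal Python sketch pulls a containerized tool image and runs it on local data through the Docker command-line client. The registry host, image name and tool options are hypothetical placeholders, not actual BioShaDock entries.

        import subprocess

        # Hypothetical image coordinates -- replace with a real registry entry.
        REGISTRY = "registry.example.org"                # placeholder registry host
        IMAGE = REGISTRY + "/toolbox/aligner:1.0"        # hypothetical tool image

        def run_containerized_tool(input_fasta, output_dir):
            """Pull the image, then run the tool with local data mounted inside."""
            subprocess.run(["docker", "pull", IMAGE], check=True)
            subprocess.run(
                ["docker", "run", "--rm",
                 "-v", input_fasta + ":/data/input.fasta:ro",   # read-only input mount
                 "-v", output_dir + ":/results",                # writable results mount
                 IMAGE, "--in", "/data/input.fasta", "--out", "/results"],
                check=True,
            )

        if __name__ == "__main__":
            run_containerized_tool("/tmp/sequences.fasta", "/tmp/results")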

  17. Modelling Hospital Materials Management Processes

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-06-01

    integrated and detailed analysis and description model for hospital materials management data and tasks, which is able to tackle information from patient requirements to usage, from replenishment requests to supplying and handling activities. The model takes account of medical risk reduction, traceability and streamlined processes perspectives. Second, the paper translates this information into a business process model and mathematical formalization. The study provides a useful guide to the various relevant technology‐related, management and business issues, laying the foundations of an efficient reengineering of the supply chain to reduce healthcare costs and improve the quality of care.

  18. Incorporating Genomics and Bioinformatics across the Life Sciences Curriculum

    Energy Technology Data Exchange (ETDEWEB)

    Ditty, Jayna L.; Kvaal, Christopher A.; Goodner, Brad; Freyermuth, Sharyn K.; Bailey, Cheryl; Britton, Robert A.; Gordon, Stuart G.; Heinhorst, Sabine; Reed, Kelynne; Xu, Zhaohui; Sanders-Lorenz, Erin R.; Axen, Seth; Kim, Edwin; Johns, Mitrick; Scott, Kathleen; Kerfeld, Cheryl A.

    2011-08-01

    into courses or independent research projects requires infrastructure for organizing and assessing student work. Here, we present a new platform for faculty to keep current with the rapidly changing field of bioinformatics, the Integrated Microbial Genomes Annotation Collaboration Toolkit (IMG-ACT). It was developed by instructors from both research-intensive and predominately undergraduate institutions in collaboration with the Department of Energy-Joint Genome Institute (DOE-JGI) as a means to innovate and update undergraduate education and faculty development. The IMG-ACT program provides a cadre of tools, including access to a clearinghouse of genome sequences, bioinformatics databases, data storage, instructor course management, and student notebooks for organizing the results of their bioinformatic investigations. In the process, IMG-ACT makes it feasible to provide undergraduate research opportunities to a greater number and diversity of students, in contrast to the traditional mentor-to-student apprenticeship model for undergraduate research, which can be too expensive and time-consuming to provide for every undergraduate. The IMG-ACT serves as the hub for the network of faculty and students that use the system for microbial genome analysis. Open access of the IMG-ACT infrastructure to participating schools ensures that all types of higher education institutions can utilize it. With the infrastructure in place, faculty can focus their efforts on the pedagogy of bioinformatics, involvement of students in research, and use of this tool for their own research agenda. What the original faculty members of the IMG-ACT development team present here is an overview of how the IMG-ACT program has affected our development in terms of teaching and research with the hopes that it will inspire more faculty to get involved.

  19. Rough-fuzzy pattern recognition applications in bioinformatics and medical imaging

    CERN Document Server

    Maji, Pradipta

    2012-01-01

    Learn how to apply rough-fuzzy computing techniques to solve problems in bioinformatics and medical image processing Emphasizing applications in bioinformatics and medical image processing, this text offers a clear framework that enables readers to take advantage of the latest rough-fuzzy computing techniques to build working pattern recognition models. The authors explain step by step how to integrate rough sets with fuzzy sets in order to best manage the uncertainties in mining large data sets. Chapters are logically organized according to the major phases of pattern recognition systems dev
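
    To make the combination concrete, the short Python sketch below shows the two ingredients the book integrates: a fuzzy membership function and rough lower/upper approximations of a target set with respect to an equivalence partition. The numbers and the partition are invented toy data.

        def triangular_membership(x, a, b, c):
            """Fuzzy membership of x in a triangular set peaking at b, supported on [a, c]."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def rough_approximations(partition, target):
            """Rough-set lower and upper approximations of `target` given equivalence classes."""
            target = set(target)
            lower = set().union(*[set(c) for c in partition if set(c) <= target])
            upper = set().union(*[set(c) for c in partition if set(c) & target])
            return lower, upper

        # Toy example: six objects grouped by an indiscernibility relation.
        lower, upper = rough_approximations([{1, 2}, {3, 4}, {5, 6}], {1, 2, 3})
        print(lower, upper)                       # {1, 2} and {1, 2, 3, 4}
        print(triangular_membership(4.5, 2.0, 5.0, 8.0))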

  20. NCCDS configuration management process improvement

    Science.gov (United States)

    Shay, Kathy

    1993-01-01

    By concentrating on defining and improving specific Configuration Management (CM) functions, processes, procedures, personnel selection/development, and tools, internal and external customers received improved CM services. Job performance within the section increased in both satisfaction and output. Participation in achieving major improvements has led to the delivery of consistent quality CM products as well as significant decreases in every measured CM metrics category.

  1. Computational methods to study the structure and dynamics of biomolecules and biomolecular processes from bioinformatics to molecular quantum mechanics

    CERN Document Server

    2014-01-01

    Since the second half of the 20th century machine computations have played a critical role in science and engineering. Computer-based techniques have become especially important in molecular biology, since they often represent the only viable way to gain insights into the behavior of a biological system as a whole. The complexity of biological systems, which usually needs to be analyzed on different time- and size-scales and with different levels of accuracy, requires the application of different approaches, ranging from comparative analysis of sequences and structural databases, to the analysis of networks of interdependence between cell components and processes, through coarse-grained modeling to atomically detailed simulations, and finally to molecular quantum mechanics. This book provides a comprehensive overview of modern computer-based techniques for computing the structure, properties and dynamics of biomolecules and biomolecular processes. The twenty-two chapters, written by scientists from all over t...

  2. USAR managing and updating process

    International Nuclear Information System (INIS)

    Prah, M.; Spiler, J.

    1996-01-01

    In this paper, the basis and background of the FSAR (Final Safety Analysis Report) document and its conversion process to the USAR (Updated Safety Analysis Report) document are described. In addition, the internal and external reviews that make up the approval process are presented. The following is included in our new approach to managing USAR changes: initiating the USAR change, technical reviewing, preparing a safety evaluation, KSC (Krsko Safety Committee) and KOC (Krsko Operating Committee) review, ESD Director approval, and Regulatory Body review or approval. The intensive technological modification activities started in 1992 when the NEK Engineering Services Division was established. These activities are one of the most important reasons for the very intensive changes to USAR items. The other reason for its conversion to an electronic format is the possibility of easier and faster searching, updating and changing, and the introduction of a new systematic USAR management approach, as mentioned above. (author)

  3. OpenHelix: bioinformatics education outside of a different box.

    Science.gov (United States)

    Williams, Jennifer M; Mangan, Mary E; Perreault-Micale, Cynthia; Lathe, Scott; Sirohi, Neeraj; Lathe, Warren C

    2010-11-01

    The amount of biological data is increasing rapidly, and will continue to increase as new rapid technologies are developed. Professionals in every area of bioscience will have data management needs that require publicly available bioinformatics resources. Not all scientists desire a formal bioinformatics education but would benefit from more informal educational sources of learning. Effective bioinformatics education formats will address a broad range of scientific needs, will be aimed at a variety of user skill levels, and will be delivered in a number of different formats to address different learning styles. Informal sources of bioinformatics education that are effective are available, and will be explored in this review.

  4. Artificial intelligence and process management

    International Nuclear Information System (INIS)

    Epton, J.B.A.

    1989-01-01

    Techniques derived from work in artificial intelligence over the past few decades are beginning to change the approach in applying computers to process management. To explore this new approach and gain real practical experience of its potential a programme of experimental applications was initiated by Sira in collaboration with the process industry. This programme encompassed a family of experimental applications ranging from process monitoring, through supervisory control and troubleshooting to planning and scheduling. The experience gained has led to a number of conclusions regarding the present level of maturity of the technology, the potential for further developments and the measures required to secure the levels of system integrity necessary in on-line applications to critical processes. (author)

  5. Bioinformatics and its application in animal health: a review | Soetan ...

    African Journals Online (AJOL)

    Bioinformatics is an interdisciplinary subject, which uses computer application, statistics, mathematics and engineering for the analysis and management of biological information. It has become an important tool for basic and applied research in veterinary sciences. Bioinformatics has brought about advancements into ...

  6. Development of Bioinformatics Infrastructure for Genomics Research.

    Science.gov (United States)

    Mulder, Nicola J; Adebiyi, Ezekiel; Adebiyi, Marion; Adeyemi, Seun; Ahmed, Azza; Ahmed, Rehab; Akanle, Bola; Alibi, Mohamed; Armstrong, Don L; Aron, Shaun; Ashano, Efejiro; Baichoo, Shakuntala; Benkahla, Alia; Brown, David K; Chimusa, Emile R; Fadlelmola, Faisal M; Falola, Dare; Fatumo, Segun; Ghedira, Kais; Ghouila, Amel; Hazelhurst, Scott; Isewon, Itunuoluwa; Jung, Segun; Kassim, Samar Kamal; Kayondo, Jonathan K; Mbiyavanga, Mamana; Meintjes, Ayton; Mohammed, Somia; Mosaku, Abayomi; Moussa, Ahmed; Muhammd, Mustafa; Mungloo-Dilmohamud, Zahra; Nashiru, Oyekanmi; Odia, Trust; Okafor, Adaobi; Oladipo, Olaleye; Osamor, Victor; Oyelade, Jellili; Sadki, Khalid; Salifu, Samson Pandam; Soyemi, Jumoke; Panji, Sumir; Radouani, Fouzia; Souiai, Oussama; Tastan Bishop, Özlem

    2017-06-01

    Although pockets of bioinformatics excellence have developed in Africa, generally, large-scale genomic data analysis has been limited by the availability of expertise and infrastructure. H3ABioNet, a pan-African bioinformatics network, was established to build capacity specifically to enable H3Africa (Human Heredity and Health in Africa) researchers to analyze their data in Africa. Since the inception of the H3Africa initiative, H3ABioNet's role has evolved in response to changing needs from the consortium and the African bioinformatics community. H3ABioNet set out to develop core bioinformatics infrastructure and capacity for genomics research in various aspects of data collection, transfer, storage, and analysis. Various resources have been developed to address genomic data management and analysis needs of H3Africa researchers and other scientific communities on the continent. NetMap was developed and used to build an accurate picture of network performance within Africa and between Africa and the rest of the world, and Globus Online has been rolled out to facilitate data transfer. A participant recruitment database was developed to monitor participant enrollment, and data is being harmonized through the use of ontologies and controlled vocabularies. The standardized metadata will be integrated to provide a search facility for H3Africa data and biospecimens. Because H3Africa projects are generating large-scale genomic data, facilities for analysis and interpretation are critical. H3ABioNet is implementing several data analysis platforms that provide a large range of bioinformatics tools or workflows, such as Galaxy, the Job Management System, and eBiokits. A set of reproducible, portable, and cloud-scalable pipelines to support the multiple H3Africa data types are also being developed and dockerized to enable execution on multiple computing infrastructures. In addition, new tools have been developed for analysis of the uniquely divergent African data and for

  7. Bioinformatics: A History of Evolution "In Silico"

    Science.gov (United States)

    Ondrej, Vladan; Dvorak, Petr

    2012-01-01

    Bioinformatics, biological databases, and the worldwide use of computers have accelerated biological research in many fields, such as evolutionary biology. Here, we describe a primer of nucleotide sequence management and the construction of a phylogenetic tree with two examples; the two selected are from completely different groups of organisms:…

  8. XCPU2 process management system

    Energy Technology Data Exchange (ETDEWEB)

    Ionkov, Latchesar [Los Alamos National Laboratory; Van Hensbergen, Eric [IBM AUSTIN RESEARCH LAB

    2009-01-01

    Xcpu2 is a new process management system that allows users to specify a custom file system for a running job. Most cluster management systems enforce a single software distribution running on all nodes. Xcpu2 allows programs running on the cluster to work in an environment identical to the user's desktop, using the same versions of the libraries and tools the user installed locally, and accessing configuration files in the same places they are located on the desktop. Xcpu2 builds on our earlier work with the Xcpu system. Like Xcpu, Xcpu2's process management interface is represented as a set of files exported by a 9P file server. It supports heterogeneous clusters and multiple head nodes. Unlike Xcpu, it uses a pull instead of a push model. In this paper we describe the Xcpu2 clustering model, its operation, and how the per-job file system configuration can be used to solve some of the common problems that arise when running a cluster.

  9. Advance in structural bioinformatics

    CERN Document Server

    Wei, Dongqing; Zhao, Tangzhen; Dai, Hao

    2014-01-01

    This text examines in detail mathematical and physical modeling, computational methods and systems for obtaining and analyzing biological structures, using pioneering research cases as examples. As such, it emphasizes programming and problem-solving skills. It provides information on structure bioinformatics at various levels, with individual chapters covering introductory to advanced aspects, from fundamental methods and guidelines on acquiring and analyzing genomics and proteomics sequences, the structures of protein, DNA and RNA, to the basics of physical simulations and methods for conform

  10. Translational Bioinformatics and Clinical Research (Biomedical) Informatics.

    Science.gov (United States)

    Sirintrapun, S Joseph; Zehir, Ahmet; Syed, Aijazuddin; Gao, JianJiong; Schultz, Nikolaus; Cheng, Donavan T

    2015-06-01

    Translational bioinformatics and clinical research (biomedical) informatics are the primary domains related to informatics activities that support translational research. Translational bioinformatics focuses on computational techniques in genetics, molecular biology, and systems biology. Clinical research (biomedical) informatics involves the use of informatics in discovery and management of new knowledge relating to health and disease. This article details 3 projects that are hybrid applications of translational bioinformatics and clinical research (biomedical) informatics: The Cancer Genome Atlas, the cBioPortal for Cancer Genomics, and the Memorial Sloan Kettering Cancer Center clinical variants and results database, all designed to facilitate insights into cancer biology and clinical/therapeutic correlations. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Crowdsourcing for bioinformatics.

    Science.gov (United States)

    Good, Benjamin M; Su, Andrew I

    2013-08-15

    Bioinformatics is faced with a variety of problems that require human involvement. Tasks like genome annotation, image analysis, knowledge-base population and protein structure determination all benefit from human input. In some cases, people are needed in vast quantities, whereas in others, we need just a few with rare abilities. Crowdsourcing encompasses an emerging collection of approaches for harnessing such distributed human intelligence. Recently, the bioinformatics community has begun to apply crowdsourcing in a variety of contexts, yet few resources are available that describe how these human-powered systems work and how to use them effectively in scientific domains. Here, we provide a framework for understanding and applying several different types of crowdsourcing. The framework considers two broad classes: systems for solving large-volume 'microtasks' and systems for solving high-difficulty 'megatasks'. Within these classes, we discuss system types, including volunteer labor, games with a purpose, microtask markets and open innovation contests. We illustrate each system type with successful examples in bioinformatics and conclude with a guide for matching problems to crowdsourcing solutions that highlights the positives and negatives of different approaches.

  12. Phylogenetic trees in bioinformatics

    Energy Technology Data Exchange (ETDEWEB)

    Burr, Tom L [Los Alamos National Laboratory

    2008-01-01

    Genetic data is often used to infer evolutionary relationships among a collection of viruses, bacteria, animal or plant species, or other operational taxonomic units (OTU). A phylogenetic tree depicts such relationships and provides a visual representation of the estimated branching order of the OTUs. Tree estimation is unique for several reasons, including: the types of data used to represent each OTU; the use of probabilistic nucleotide substitution models; the inference goals involving both tree topology and branch length; and the huge number of possible trees for a given sample of even a very modest number of OTUs, which implies that finding the best tree(s) to describe the genetic data for each OTU is computationally demanding. Bioinformatics is too large a field to review here. We focus on that aspect of bioinformatics that includes study of similarities in genetic data from multiple OTUs. Although research questions are diverse, a common underlying challenge is to estimate the evolutionary history of the OTUs. Therefore, this paper reviews the role of phylogenetic tree estimation in bioinformatics, available methods and software, and identifies areas for additional research and development.
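
    As a concrete illustration of distance-based tree estimation, the sketch below uses Biopython (assumed to be installed) to compute pairwise distances from a multiple sequence alignment and build a neighbor-joining tree. The file name and the identity distance model are placeholders; likelihood or Bayesian methods would explore tree space more thoroughly.

        from Bio import AlignIO, Phylo
        from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

        # Load a multiple sequence alignment (file name is a placeholder).
        alignment = AlignIO.read("example.phy", "phylip")

        # Pairwise distances under a simple identity model.
        distance_matrix = DistanceCalculator("identity").get_distance(alignment)

        # Neighbor joining returns one estimate of topology and branch lengths.
        nj_tree = DistanceTreeConstructor().nj(distance_matrix)
        Phylo.draw_ascii(nj_tree)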

  13. What is Business Process Management?

    DEFF Research Database (Denmark)

    Møller, Charles; Tan, Rune Domino; Maack, Carsten Jessen

    2007-01-01

    Business Process Management (BPM) is an emerging new field in business. However, there is no academically agreed upon conceptual framework. The aim of this paper is to establish a conceptual framework grounded in the recent literature. The purpose of this work is to ensure a better foundation for future research and for discussion of the implications of BPM on Enterprise Information Systems (EIS). The starting point of this study is a focused literature review of the BPM concept. This literature review leads to the formulation of a conceptual framework for BPM which is evaluated using

  14. Environmental management of business processes

    Directory of Open Access Journals (Sweden)

    Vesna Čančer

    2000-01-01

    Full Text Available Since the decision-makers in enterprises will accept the goals of environmental management only if they are sufficiently motivated, comprehensible and useful tools should be provided to support environmentally oriented business decision-making. For that reason, a general optimisation model of the multiphase business process is presented in this paper. This model includes the possibilities for an integrated approach to environmental protection, so that it can be applied as a scenario in business process simulation for evaluating the effects of environmentally oriented business decisions on business performance. Furthermore, development and application possibilities of the presented model are introduced. Some measures of resource efficiency are developed using the presented optimisation model.
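
    A minimal sketch of what a single-objective linear version of such an optimisation model might look like, solved here with SciPy; the cost, resource-use and limit figures are invented purely for illustration and do not come from the paper.

        from scipy.optimize import linprog

        # Decision variables: production volumes of two process phases.
        cost = [4.0, 6.0]                       # illustrative unit costs to minimise

        # Environmental resource constraints (A_ub @ x <= b_ub).
        resource_use = [[2.0, 3.0],             # energy used per unit of each phase
                        [1.0, 2.0]]             # raw material used per unit of each phase
        limits = [100.0, 60.0]

        # Demand: total output of at least 25 units, rewritten as -x1 - x2 <= -25.
        constraints = resource_use + [[-1.0, -1.0]]
        rhs = limits + [-25.0]

        result = linprog(c=cost, A_ub=constraints, b_ub=rhs,
                         bounds=[(0, None), (0, None)])
        print(result.x, result.fun)             # optimal volumes and minimal total cost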

  15. Configuration Management Process Assessment Strategy

    Science.gov (United States)

    Henry, Thad

    2014-01-01

    Purpose: To propose a strategy for assessing the development and effectiveness of configuration management systems within Programs, Projects, and Design Activities performed by technical organizations and their supporting development contractors. Scope: The CM systems of various entities will be assessed depending on Project Scope (DDT&E), Support Services and Acquisition Agreements. Approach: A model-based approach structured around the assessed organization's CM requirements, including best-practice maturity criteria. The model is tailored to the entity being assessed, depending on its CM system. The assessment approach provides objective feedback to Engineering and Project Management on the observed maturity state of the CM system versus the ideal state of the configuration management processes and outcomes. It identifies strengths and risks rather than audit "gotchas" (findings/observations). It is used recursively and iteratively throughout the program life cycle at select points of need (typical assessment timing is post-PDR/post-CDR). Ideal-state criteria and maturity targets are reviewed with the assessed entity prior to an assessment (tailoring) and depend on the assessed phase of the CM system. The approach supports exit success criteria for Preliminary and Critical Design Reviews, and gives a comprehensive CM system assessment which ultimately supports configuration verification activities.

  16. Knowledge management: processes and systems | Igbinovia ...

    African Journals Online (AJOL)

    Knowledge management: processes and systems. ... Information Impact: Journal of Information and Knowledge Management ... observation, role reversal technique, and discussion forums as well as the forms of knowledge representation to include report writing, database management system and institutional repositories.

  17. The GMOD Drupal Bioinformatic Server Framework

    Science.gov (United States)

    Papanicolaou, Alexie; Heckel, David G.

    2010-01-01

    Motivation: Next-generation sequencing technologies have led to the widespread use of -omic applications. As a result, there is now a pronounced bioinformatic bottleneck. The general model organism database (GMOD) tool kit (http://gmod.org) has produced a number of resources aimed at addressing this issue. It lacks, however, a robust online solution that can deploy heterogeneous data and software within a Web content management system (CMS). Results: We present a bioinformatic framework for the Drupal CMS. It consists of three modules. First, GMOD-DBSF is an application programming interface module for the Drupal CMS that simplifies the programming of bioinformatic Drupal modules. Second, the Drupal Bioinformatic Software Bench (biosoftware_bench) allows for a rapid and secure deployment of bioinformatic software. An innovative graphical user interface (GUI) guides both use and administration of the software, including the secure provision of pre-publication datasets. Third, we present genes4all_experiment, which exemplifies how our work supports the wider research community. Conclusion: Given the infrastructure presented here, the Drupal CMS may become a powerful new tool set for bioinformaticians. The GMOD-DBSF base module is an expandable community resource that decreases development time of Drupal modules for bioinformatics. The biosoftware_bench module can already enhance biologists' ability to mine their own data. The genes4all_experiment module has already been responsible for archiving of more than 150 studies of RNAi from Lepidoptera, which were previously unpublished. Availability and implementation: Implemented in PHP and Perl. Freely available under the GNU Public License 2 or later from http://gmod-dbsf.googlecode.com Contact: alexie@butterflybase.org PMID:20971988

  18. The GMOD Drupal bioinformatic server framework.

    Science.gov (United States)

    Papanicolaou, Alexie; Heckel, David G

    2010-12-15

    Next-generation sequencing technologies have led to the widespread use of -omic applications. As a result, there is now a pronounced bioinformatic bottleneck. The general model organism database (GMOD) tool kit (http://gmod.org) has produced a number of resources aimed at addressing this issue. It lacks, however, a robust online solution that can deploy heterogeneous data and software within a Web content management system (CMS). We present a bioinformatic framework for the Drupal CMS. It consists of three modules. First, GMOD-DBSF is an application programming interface module for the Drupal CMS that simplifies the programming of bioinformatic Drupal modules. Second, the Drupal Bioinformatic Software Bench (biosoftware_bench) allows for a rapid and secure deployment of bioinformatic software. An innovative graphical user interface (GUI) guides both use and administration of the software, including the secure provision of pre-publication datasets. Third, we present genes4all_experiment, which exemplifies how our work supports the wider research community. Given the infrastructure presented here, the Drupal CMS may become a powerful new tool set for bioinformaticians. The GMOD-DBSF base module is an expandable community resource that decreases development time of Drupal modules for bioinformatics. The biosoftware_bench module can already enhance biologists' ability to mine their own data. The genes4all_experiment module has already been responsible for archiving of more than 150 studies of RNAi from Lepidoptera, which were previously unpublished. Implemented in PHP and Perl. Freely available under the GNU Public License 2 or later from http://gmod-dbsf.googlecode.com.

  19. Planning bioinformatics workflows using an expert system

    Science.gov (United States)

    Chen, Xiaoling; Chang, Jeffrey T.

    2017-01-01

    Abstract Motivation: Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. Results: To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprising a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. Availability and Implementation: https://github.com/jefftc/changlab Contact: jeffrey.t.chang@uth.tmc.edu PMID:28052928
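
    To convey the flavour of backwards chaining over a knowledge base of tool capabilities, here is a small Python sketch; the rules and data-type names are invented stand-ins and do not reflect BETSY's actual knowledge representation.

        # Each rule states that a tool produces an output data type from required inputs.
        RULES = [
            {"tool": "align_reads",   "inputs": {"fastq", "reference"}, "output": "bam"},
            {"tool": "call_variants", "inputs": {"bam", "reference"},   "output": "vcf"},
            {"tool": "annotate",      "inputs": {"vcf"},                "output": "annotated_vcf"},
        ]

        def plan(goal, available, steps=None):
            """Backwards-chain from the goal data type to data the user already has."""
            steps = [] if steps is None else steps
            if goal in available:
                return steps
            for rule in RULES:
                if rule["output"] != goal:
                    continue
                sub_steps = steps
                for needed in rule["inputs"]:
                    sub_steps = plan(needed, available, sub_steps)
                    if sub_steps is None:
                        break
                else:
                    return sub_steps + [rule["tool"]]
            return None                      # no rule chain can produce the goal

        # Starting from raw reads and a reference genome, derive annotated variants.
        print(plan("annotated_vcf", {"fastq", "reference"}))
        # -> ['align_reads', 'call_variants', 'annotate']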

  20. Management of Technology - a political process approach

    DEFF Research Database (Denmark)

    Koch, Christian

    1999-01-01

    Most management of technology writings fail to address enterprise developments as political processes, where visions, coalitions and emergence are central features. The paper reports on a participant observation study of management of technology processes.

  1. BioShaDock: a community driven bioinformatics shared Docker-based tools registry [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    François Moreews

    2015-12-01

    Full Text Available Linux container technologies, as represented by Docker, provide an alternative to the complex and time-consuming installation processes needed for scientific software. The ease of deployment and the process isolation they enable, as well as the reproducibility they permit across environments and versions, are among the qualities that make them interesting candidates for the construction of bioinformatic infrastructures, at any scale from single workstations to high-throughput computing architectures. The Docker Hub is a public registry which can be used to distribute bioinformatic software as Docker images. However, its lack of curation and its genericity make it difficult for a bioinformatics user to find the most appropriate images. BioShaDock is a bioinformatics-focused Docker registry, which provides a local and fully controlled environment to build and publish bioinformatic software as portable Docker images. It provides a number of improvements over the base Docker registry in authentication and permissions management, which enable its integration into existing bioinformatic infrastructures such as computing platforms. The metadata associated with the registered images are domain-centric, including for instance concepts defined in the EDAM ontology, a shared and structured vocabulary of commonly used terms in bioinformatics. The registry also includes user-defined tags to facilitate its discovery, as well as a link to the tool description in the ELIXIR registry if it already exists. If it does not, BioShaDock will synchronize with the ELIXIR registry to create a new description there, based on the BioShaDock entry metadata. This link will help users get more information on the tool, such as its EDAM operations and its input and output types. This allows integration with the ELIXIR Tools and Data Services Registry, thus providing the appropriate visibility of such images to the bioinformatics community.

  2. Sample and data management process description

    International Nuclear Information System (INIS)

    Kessner, J.H.

    2000-01-01

    The sample and data management process was initiated in 1994 as a result of a process improvement workshop. The purpose of the workshop was to develop a sample and data management process that would reduce cycle time and costs, simplify systems and procedures, and improve customer satisfaction for sampling, analytical services, and data management activities

  3. Dynamic process management for engineering environments

    NARCIS (Netherlands)

    Mentink, R.J.; van Houten, Frederikus J.A.M.; Kals, H.J.J.

    2003-01-01

    The research presented in this paper proposes a concept for dynamic process management as part of an integrated approach to engineering process support. The theory of information management is the starting point for the development of a process management system based on evolution of information

  4. COMPARISON OF POPULAR BIOINFORMATICS DATABASES

    OpenAIRE

    Abdulganiyu Abdu Yusuf; Zahraddeen Sufyanu; Kabir Yusuf Mamman; Abubakar Umar Suleiman

    2016-01-01

    Bioinformatics is the application of computational tools to capture and interpret biological data. It has wide applications in drug development, crop improvement, agricultural biotechnology and forensic DNA analysis. There are various databases available to researchers in bioinformatics. These databases are customized for specific needs and range in size, scope, and purpose. The main drawbacks of bioinformatics databases include redundant information, constant change, data spread over m...

  5. Bioinformatics-Aided Venomics

    Directory of Open Access Journals (Sweden)

    Quentin Kaas

    2015-06-01

    Full Text Available Venomics is a modern approach that combines transcriptomics and proteomics to explore the toxin content of venoms. This review will give an overview of computational approaches that have been created to classify and consolidate venomics data, as well as algorithms that have helped discovery and analysis of toxin nucleic acid and protein sequences, toxin three-dimensional structures and toxin functions. Bioinformatics is used to tackle specific challenges associated with the identification and annotations of toxins. Recognizing toxin transcript sequences among second generation sequencing data cannot rely only on basic sequence similarity because toxins are highly divergent. Mass spectrometry sequencing of mature toxins is challenging because toxins can display a large number of post-translational modifications. Identifying the mature toxin region in toxin precursor sequences requires the prediction of the cleavage sites of proprotein convertases, most of which are unknown or not well characterized. Tracing the evolutionary relationships between toxins should consider specific mechanisms of rapid evolution as well as interactions between predatory animals and prey. Rapidly determining the activity of toxins is the main bottleneck in venomics discovery, but some recent bioinformatics and molecular modeling approaches give hope that accurate predictions of toxin specificity could be made in the near future.
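
    One of the bottlenecks mentioned above, locating the mature toxin within a precursor, often starts with a scan for candidate proprotein convertase cleavage motifs. The Python sketch below flags dibasic (KR/RR/KK/RK) sites in a toy precursor; the sequence is invented, and real predictors use far richer models than a simple motif match.

        import re

        # Invented toy precursor: signal peptide, propeptide and putative mature toxin.
        precursor = "MKTLLLTLVVVTIVCLDLGYTRRDAECCNPACGRHYSC"

        # Classical proprotein convertases often cleave C-terminal to dibasic residues.
        DIBASIC = re.compile(r"(?=(KR|RR|KK|RK))")

        def candidate_cleavage_sites(seq):
            """Return 0-based positions immediately after each dibasic motif."""
            return [m.start() + 2 for m in DIBASIC.finditer(seq)]

        for pos in candidate_cleavage_sites(precursor):
            print("possible mature peptide starting at", pos, ":", precursor[pos:])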

  6. THE STRATEGIC PERFORMANCE MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    Radu Catalina

    2009-05-01

    Full Text Available Contemporary trends in global competition, rapid technological developments and increased use of management information systems and the Internet, developments in planning and control and management thinking, and changing demographics are putting pressures

  7. Adaptive Process Management with ADEPT2

    NARCIS (Netherlands)

    Reichert, M.U.; Rinderle, S.B.; Kreher, U; Dadam, P.

    2005-01-01

    This demo paper describes core functions of the ADEPT2 process management system. In the ADEPT project we have been working on the design and implementation of next-generation process management software. Based on a conceptual framework for dynamic process changes, on novel process support

  8. Human Resource Management in the Enhancement Processes of Knowledge Management

    Directory of Open Access Journals (Sweden)

    Didi Sundiman

    2017-11-01

    Full Text Available This research explored Human Resource Management (HRM) in the enhancement processes of knowledge management. It explored how HRM practice enhances the operation of knowledge management. Data were collected through a survey interviewing 12 informants from Small and Medium Enterprises (SMEs). The results show that HRM practice provides the initiative in the enhancement process of the knowledge management strategy applied to the company. It can be concluded that each sub-component of HRM affects the components of knowledge management, and that HRM is highly influential and has a positive effect on quality management processes, and vice versa, in the work environment.

  9. Thermal energy management process experiment

    Science.gov (United States)

    Ollendorf, S.

    1984-01-01

    The thermal energy management processes experiment (TEMP) will demonstrate that through the use of two-phase flow technology, thermal systems can be significantly enhanced by increasing heat transport capabilities at reduced power consumption while operating within narrow temperature limits. It has been noted that such phenomena as excess fluid puddling, priming, stratification, and surface tension effects all tend to mask the performance of two-phase flow systems in a 1-g field. The flight experiment approach would be to attach the experiment to an appropriate mounting surface with a 15 to 20 meter effective length and provide a heat input and output station in the form of heaters and a radiator. Using environmental data, the size, location, and orientation of the experiment can be optimized. The approach would be to provide a self-contained panel and mount it to the STEP through a frame. A small electronics package would be developed to interface with the STEP avionics for command and data handling. During the flight, heaters on the evaporator will be exercised to determine performance. Flight data will be evaluated against the ground tests to determine any anomalous behavior.

  10. Knowledge management vs business process management in contemporary enterprises

    Directory of Open Access Journals (Sweden)

    Bitkowska Agnieszka

    2016-06-01

    Full Text Available The main objective of this paper is to identify the system of knowledge management in contemporary process organizations from a business process perspective, especially with regard to technological and social conditions. The methodology is based on literature analysis and case studies. The integration of knowledge management technologies, concepts and methods into organizational business processes is a challenging research issue today. The concepts of knowledge management and business process management should be analyzed jointly in contemporary enterprises. Despite the growing interest among researchers and practitioners in the concept of knowledge management as it relates to business process management, there is a lack of articles in this area. An appropriate approach to the modelling of knowledge management processes, as well as the use of IT tools and a motivation system, is of key importance for the introduction of this solution in organizations.

  11. Emergent Computation Emphasizing Bioinformatics

    CERN Document Server

    Simon, Matthew

    2005-01-01

    Emergent Computation is concerned with recent applications of Mathematical Linguistics or Automata Theory. This subject has a primary focus upon "Bioinformatics" (the Genome and arising interest in the Proteome), but the closing chapter also examines applications in Biology, Medicine, Anthropology, etc. The book is composed of an organized examination of DNA, RNA, and the assembly of amino acids into proteins. Rather than examine these areas from a purely mathematical viewpoint (that excludes much of the biochemical reality), the author uses scientific papers written mostly by biochemists based upon their laboratory observations. Thus while DNA may exist in its double stranded form, triple stranded forms are not excluded. Similarly, while bases exist in Watson-Crick complements, mismatched bases and abasic pairs are not excluded, nor are Hoogsteen bonds. Just as there are four bases naturally found in DNA, the existence of additional bases is not ignored, nor amino acids in addition to the usual complement of...

  12. Optimization and standardization of pavement management processes.

    Science.gov (United States)

    2004-08-01

    This report addresses issues related to optimization and standardization of current pavement management processes in Kentucky. Historical pavement management records were analyzed, which indicates that standardization is necessary in future pavement ...

  13. Managing Process Variants in the Process Life Cycle

    NARCIS (Netherlands)

    Hallerbach, A.; Bauer, Th.; Reichert, M.U.

    2007-01-01

    When designing process-aware information systems, often variants of the same process have to be specified. Each variant then constitutes an adjustment of a particular process to specific requirements building the process context. Current Business Process Management (BPM) tools do not adequately

  14. Incident Management: Process into Practice

    Science.gov (United States)

    Isaac, Gayle; Moore, Brian

    2011-01-01

    Tornados, shootings, fires--these are emergencies that require fast action by school district personnel, but they are not the only incidents that require risk management. The authors have introduced the National Incident Management System (NIMS) and the Incident Command System (ICS) and assured that these systems can help educators plan for and…

  15. Business process management and IT management: The missing integration

    DEFF Research Database (Denmark)

    Rahimi, Fatemeh; Møller, Charles; Hvam, Lars

    2016-01-01

    The importance of business processes and the centrality of IT to contemporary organizations' performance calls for a specific focus on business process management and IT management. Despite the wide scope of business process management covering both business and IT domains, and the profound impact of IT on process innovations, the association between business process management and IT management is under-explored. Drawing on a literature analysis of the capabilities of business process and IT governance frameworks and findings from a case study, we propose the need for horizontal integration between the two management functions to enable strategic and operational business-IT alignment. We further argue that the role of IT in an organization influences the direction of integration between the two functions and thus the choice of integration mechanisms. Using case study findings, we propose

  16. Method for Business Process Management System Selection

    NARCIS (Netherlands)

    Thijs van de Westelaken; Bas Terwee; Pascal Ravesteijn

    2013-01-01

    In recent years business process management (BPM) and specifically information systems that support the analysis, design and execution of processes (also called business process management systems (BPMS)) are getting more attention. This has led to an increase in research on BPM and BPMS. However

  17. A new approach in Business Process Management

    OpenAIRE

    Hurbean, Luminita

    2008-01-01

    The new wave of BPM (Business Process Management) is not Business Process Reengineering, enterprise application integration, workflow management or another packaged application – it's the synthesis and extension of all these technologies and techniques into a unified whole. This unified whole becomes a new foundation upon which the enterprise is built, an enterprise more in tune with the true nature of business processes and their management. In a competitive economy, where margins continu...

  18. Process-based software project management

    CERN Document Server

    Goodman, F Alan

    2006-01-01

    Not connecting software project management (SPM) to actual, real-world development processes can lead to a complete divorcing of SPM from software engineering that can undermine any successful software project. By explaining how a layered process architectural model improves operational efficiency, Process-Based Software Project Management outlines a new method that is more effective than the traditional method when dealing with SPM. With a clear and easy-to-read approach, the book discusses the benefits of an integrated project management-process management connection. The described tight coup

  19. Business Process Management Theory and Applications

    CERN Document Server

    2013-01-01

    Business Process Management (BPM) has been in existence for decades. It  uses, complements, integrates and extends theories, methods and tools from  other scientific disciplines like: strategic management, information technology, managerial accounting, operations management etc. During this period the main focus themes of researchers and professionals in BPM  were: business process modeling, business process analysis, activity based costing, business process simulation, performance measurement, workflow management, the link between information technology and BPM for process automation etc. More recently the focus moved to subjects like Knowledge Management, Enterprise Resource Planning (ERP) Systems, Service Oriented Architectures (SOAs), Process Intelligence (PI) and even  Social Networks. In this collection of papers we present a review of the work and the outcomes achieved in the classic BPM fields as well as a deeper insight on recent advances in BPM. We present a review of business process modeling a...

  20. BUSINESS PROCESS MANAGEMENT SYSTEMS TECHNOLOGY COMPONENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Andrea Giovanni Spelta

    2007-05-01

    Full Text Available The information technology that supports the implementation of the business process management approach is called a Business Process Management System (BPMS). The main components of the BPMS solution framework are the process definition repository, process instances repository, transaction manager, connectors framework, process engine and middleware. In this paper we define and characterize the role and importance of the components of the BPMS framework. The research method adopted was the case study, through the analysis of the implementation of a BPMS solution in an insurance company called Chubb do Brasil. In the case study, the process "Manage Coinsured Events" is described and characterized, as well as the components of the BPMS solution adopted and implemented by Chubb do Brasil for managing this process.
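
    To give a feel for how the components listed above relate, here is a deliberately tiny Python sketch of a process engine reading a definition from a process definition repository and tracking state in a process instances repository. It is an illustrative toy, not the architecture of any particular BPMS product; the process and activity names are invented.

        # Process definition repository: named processes as ordered activity lists.
        definition_repository = {
            "manage_coinsured_events": ["register_event", "assess_coverage", "settle_claim"],
        }

        # Process instances repository: execution state of every started instance.
        instances_repository = {}

        def start_instance(process_name, instance_id):
            instances_repository[instance_id] = {"process": process_name, "completed": []}

        def complete_next_activity(instance_id):
            """Advance one instance by one activity; return the activity just completed."""
            instance = instances_repository[instance_id]
            steps = definition_repository[instance["process"]]
            done = instance["completed"]
            if len(done) == len(steps):
                return None                          # the instance has finished
            done.append(steps[len(done)])
            return done[-1]

        start_instance("manage_coinsured_events", "case-001")
        activity = complete_next_activity("case-001")
        while activity is not None:
            print("completed:", activity)
            activity = complete_next_activity("case-001")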

  1. Biowep: a workflow enactment portal for bioinformatics applications.

    Science.gov (United States)

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-03-08

    The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of researchers who lack these skills. A portal enabling them to profit from these new technologies is still missing. We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of workflows are annotated on the basis of their input and output, elaboration type and application domain, by using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software and the creation of
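
    A sketch of the kind of annotation-driven workflow selection described above: each workflow in a catalogue is annotated with its input and output data types and application domain, and a user profile filters the catalogue. The catalogue entries and profile fields are invented for illustration.

        # Invented workflow catalogue annotated by data types and application domain.
        WORKFLOWS = [
            {"name": "blast_and_annotate", "input": "protein_sequence", "output": "annotation",  "domain": "proteomics"},
            {"name": "variant_calling",    "input": "short_reads",      "output": "vcf",         "domain": "genomics"},
            {"name": "pathway_enrichment", "input": "gene_list",        "output": "pathway_set", "domain": "genomics"},
        ]

        def select_workflows(profile):
            """Return names of workflows matching the user's domain and available data type."""
            return [w["name"] for w in WORKFLOWS
                    if w["domain"] == profile.get("domain")
                    and w["input"] == profile.get("has_data")]

        print(select_workflows({"domain": "genomics", "has_data": "short_reads"}))
        # -> ['variant_calling']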

  2. Biowep: a workflow enactment portal for bioinformatics applications

    Directory of Open Access Journals (Sweden)

    Romano Paolo

    2007-03-01

    Full Text Available Abstract Background The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of researchers who lack these skills. A portal enabling them to profit from these new technologies is still missing. Results We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of workflows are annotated on the basis of their input and output, elaboration type and application domain, by using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. Conclusion We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical

  3. An Adaptive Hybrid Multiprocessor technique for bioinformatics sequence alignment

    KAUST Repository

    Bonny, Talal; Salama, Khaled N.; Zidan, Mohammed A.

    2012-01-01

    Sequence alignment algorithms such as the Smith-Waterman algorithm are among the most important applications in the development of bioinformatics. Sequence alignment algorithms must process large amounts of data which may take a long time. Here, we
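
    For reference, here is a plain, unoptimised Python version of the Smith-Waterman local alignment score; the match/mismatch/gap parameters are common illustrative defaults, and accelerated implementations such as the one discussed in this record parallelise exactly this dynamic-programming recurrence.

        def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
            """Best local alignment score of a and b in O(len(a) * len(b)) time and space."""
            rows, cols = len(a) + 1, len(b) + 1
            H = [[0] * cols for _ in range(rows)]
            best = 0
            for i in range(1, rows):
                for j in range(1, cols):
                    diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                    H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
                    best = max(best, H[i][j])
            return best

        print(smith_waterman_score("ACACACTA", "AGCACACA"))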

  4. Information management in process planning

    NARCIS (Netherlands)

    Lutters, Diederick; Wijnker, T.C.; Kals, H.J.J.

    1999-01-01

    A recently proposed reference model indicates the use of structured information as the basis for the control of design and manufacturing processes. The model is used as a basis to describe the integration of design and process planning. A differentiation is made between macro- and micro process

  5. Microbial bioinformatics 2020.

    Science.gov (United States)

    Pallen, Mark J

    2016-09-01

    Microbial bioinformatics in 2020 will remain a vibrant, creative discipline, adding value to the ever-growing flood of new sequence data, while embracing novel technologies and fresh approaches. Databases and search strategies will struggle to cope and manual curation will not be sustainable during the scale-up to the million-microbial-genome era. Microbial taxonomy will have to adapt to a situation in which most microorganisms are discovered and characterised through the analysis of sequences. Genome sequencing will become a routine approach in clinical and research laboratories, with fresh demands for interpretable user-friendly outputs. The "internet of things" will penetrate healthcare systems, so that even a piece of hospital plumbing might have its own IP address that can be integrated with pathogen genome sequences. Microbiome mania will continue, but the tide will turn from molecular barcoding towards metagenomics. Crowd-sourced analyses will collide with cloud computing, but eternal vigilance will be the price of preventing the misinterpretation and overselling of microbial sequence data. Output from hand-held sequencers will be analysed on mobile devices. Open-source training materials will address the need for the development of a skilled labour force. As we boldly go into the third decade of the twenty-first century, microbial sequence space will remain the final frontier! © 2016 The Author. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.

  6. BUSINESS PROCESS REENGINEERING AS THE METHOD OF PROCESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    O. Honcharova

    2013-09-01

    Full Text Available The article is devoted to the analysis of the process management approach. The main understandings of the process management approach are reviewed, and definitions of process and process management are given. The methods of business process improvement are also analyzed, among them fast analysis solution technology (FAST), benchmarking, reprojecting and reengineering. The main results of using business process improvement are described in terms of reduced cycle time, costs and errors. The tasks and main stages of business process reengineering are outlined, and its main efficiency results and success factors are determined.

  7. Strategic management process in hospitals.

    Science.gov (United States)

    Zovko, V

    2001-01-01

    Strategic management is concerned with strategic choices and strategic implementation; it provides the means by which organizations meet their objectives. In the case of hospitals it helps executives and all employees to understand the real purpose and long term goals of the hospital. Also, it helps the hospital find its place in the health care service provision chain, and enables the hospital to coordinate its activities with other organizations in the health care system. Strategic management is a tool, rather than a solution, that helps executives to identify root causes of major problems in the hospital.

  8. The process approach to service quality management

    OpenAIRE

    Kamila Kowalik; Dorota Klimecka-Tatar

    2018-01-01

    In this paper a model of service quality management based on the process approach has been presented. The first part of the article contains the theoretical framework of service quality and the process approach in management. Next, the quality of the service process is presented with reference to a process-based definition in the quoted literature. Finally, the outcomes of a customer questionnaire concerning the validity of particular quality attributes have been presented. The collected data in relat...

  9. Managing the training process in bodybuilding

    OpenAIRE

    Netík, Tomáš

    2011-01-01

    Annotation Title: Managing the training process in bodybuilding. This thesis describes the sport of bodybuilding and its training process. It also describes aspects of managing the training process and, based on the author's own research, determines what impact the applied training process has on changes in training state and performance. It also considers the impact of specific training methods and intensification. In the research for this thesis, we used different testing methods. We performed ...

  10. Engineering bioinformatics: building reliability, performance and productivity into bioinformatics software.

    Science.gov (United States)

    Lawlor, Brendan; Walsh, Paul

    2015-01-01

    There is a lack of software engineering skills in bioinformatic contexts. We discuss the consequences of this lack, examine existing explanations and remedies to the problem, point out their shortcomings, and propose alternatives. Previous analyses of the problem have tended to treat the use of software in scientific contexts as categorically different from the general application of software engineering in commercial settings. In contrast, we describe bioinformatic software engineering as a specialization of general software engineering, and examine how it should be practiced. Specifically, we highlight the difference between programming and software engineering, list elements of the latter and present the results of a survey of bioinformatic practitioners which quantifies the extent to which those elements are employed in bioinformatics. We propose that the ideal way to bring engineering values into research projects is to bring engineers themselves. We identify the role of Bioinformatic Engineer and describe how such a role would work within bioinformatic research teams. We conclude by recommending an educational emphasis on cross-training software engineers into life sciences, and propose research on Domain Specific Languages to facilitate collaboration between engineers and bioinformaticians.

  11. Engineering bioinformatics: building reliability, performance and productivity into bioinformatics software

    Science.gov (United States)

    Lawlor, Brendan; Walsh, Paul

    2015-01-01

    There is a lack of software engineering skills in bioinformatic contexts. We discuss the consequences of this lack, examine existing explanations and remedies to the problem, point out their shortcomings, and propose alternatives. Previous analyses of the problem have tended to treat the use of software in scientific contexts as categorically different from the general application of software engineering in commercial settings. In contrast, we describe bioinformatic software engineering as a specialization of general software engineering, and examine how it should be practiced. Specifically, we highlight the difference between programming and software engineering, list elements of the latter and present the results of a survey of bioinformatic practitioners which quantifies the extent to which those elements are employed in bioinformatics. We propose that the ideal way to bring engineering values into research projects is to bring engineers themselves. We identify the role of Bioinformatic Engineer and describe how such a role would work within bioinformatic research teams. We conclude by recommending an educational emphasis on cross-training software engineers into life sciences, and propose research on Domain Specific Languages to facilitate collaboration between engineers and bioinformaticians. PMID:25996054

  12. Technologies for Collaborative Business Process Management

    NARCIS (Netherlands)

    Sadiq, Shazia; Reichert, M.U.; Schulz, Karsten

    Business process management (BPM) has become an extensive area of research with several specialized aspects. BPM is viewed from highly diverse angles ranging from a management strategy to a software system. It is widely acknowledged that process enforcement technologies hold the potential to provide

  13. Designing XML schemas for bioinformatics.

    Science.gov (United States)

    Bruhn, Russel Elton; Burton, Philip John

    2003-06-01

    Data interchange between bioinformatics databases will, in the future, most likely take place using Extensible Markup Language (XML). The document structure will be described by an XML Schema rather than a document type definition (DTD). To ensure flexibility, the XML Schema must incorporate aspects of Object-Oriented Modeling. This impinges on the choice of the data model, which, in turn, is based on the organization of bioinformatics data by biologists. Thus, there is a need for the general bioinformatics community to be aware of the design issues relating to XML Schema. This paper, which is aimed at a general bioinformatics audience, uses examples to describe the differences between a DTD and an XML Schema and indicates how Unified Modeling Language diagrams may be used to incorporate Object-Oriented Modeling in the design of schemas.
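
    As a concrete illustration of the design issues discussed above, the sketch below defines a tiny XML Schema with a named, reusable complex type (something a plain DTD cannot express) and validates a document against it with Python's lxml library. The element names, attributes and sample record are invented for illustration and are not taken from the paper.

```python
# Hedged sketch: a hypothetical XSD with a named complex type, plus validation.
from lxml import etree

XSD = b"""<?xml version="1.0"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:complexType name="SequenceRecordType">
    <xs:sequence>
      <xs:element name="organism" type="xs:string"/>
      <xs:element name="sequence" type="xs:string"/>
    </xs:sequence>
    <xs:attribute name="accession" type="xs:string" use="required"/>
  </xs:complexType>
  <xs:element name="sequenceRecord" type="SequenceRecordType"/>
</xs:schema>"""

DOC = b"""<sequenceRecord accession="ABC123">
  <organism>Escherichia coli</organism>
  <sequence>ATGGCT</sequence>
</sequenceRecord>"""

schema = etree.XMLSchema(etree.fromstring(XSD))
document = etree.fromstring(DOC)
print(schema.validate(document))  # True when the document conforms to the schema
```

    Unlike a DTD, the named type above can be reused by several elements, which is the kind of object-oriented reuse that the UML-driven design described in the paper aims at.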

  14. MAPI: towards the integrated exploitation of bioinformatics Web Services.

    Science.gov (United States)

    Ramirez, Sergio; Karlsson, Johan; Trelles, Oswaldo

    2011-10-27

    Bioinformatics is commonly featured as a well-assorted list of available web resources. Although diversity of services is positive in general, the proliferation of tools, their dispersion and heterogeneity complicate the integrated exploitation of such data processing capacity. To facilitate the construction of software clients and make integrated use of this variety of tools, we present a modular programmatic application interface (MAPI) that provides the necessary functionality for the uniform representation of Web Service metadata descriptors, including the management and invocation protocols of the services they represent. This document describes the main functionality of the framework and how it can be used to facilitate the deployment of new software under a unified structure of bioinformatics Web Services. A notable feature of MAPI is the organization of the functionality into different modules associated with specific tasks. This means that only the modules needed for the client have to be installed, and that the module functionality can be extended without the need for rewriting the software client. The potential utility and versatility of the software library have been demonstrated by the implementation of several currently available clients that cover different aspects of integrated data processing, ranging from service discovery to service invocation, with advanced features such as workflow composition and asynchronous service calls to multiple types of Web Services, including those registered in repositories (e.g. GRID-based, SOAP, BioMOBY, R-bioconductor, and others).
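
    The modular client organization described above can be sketched in a few lines. The sketch below is written in Python purely for illustration (MAPI itself is a Java library), and every class, method and protocol handler name is hypothetical rather than part of MAPI's actual API; the point is only that service metadata is represented uniformly while protocol-specific invocation lives in pluggable modules.

```python
# Hypothetical, MAPI-inspired sketch: uniform service metadata plus pluggable
# per-protocol invocation modules. Names are invented, not MAPI's real API.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class ServiceDescriptor:
    """Uniform metadata for a remote tool, independent of its protocol."""
    name: str
    protocol: str  # e.g. "REST", "SOAP", "BioMOBY"
    endpoint: str


class InvocationRegistry:
    """Clients install only the protocol modules they actually need."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[ServiceDescriptor, dict], dict]] = {}

    def register(self, protocol: str,
                 handler: Callable[[ServiceDescriptor, dict], dict]) -> None:
        self._handlers[protocol] = handler

    def invoke(self, service: ServiceDescriptor, inputs: dict) -> dict:
        return self._handlers[service.protocol](service, inputs)


registry = InvocationRegistry()
registry.register("REST", lambda svc, data: {"status": f"stub call to {svc.endpoint}"})

blast = ServiceDescriptor("blast", "REST", "https://example.org/blast")
print(registry.invoke(blast, {"sequence": "ATGGCT"}))
```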

  15. Career management: understanding the process.

    Science.gov (United States)

    Mackowiak, J; Eckel, F M

    1985-02-01

    This article is the first of a three-part series on career management for hospital pharmacists. Work attitudes, life cycles, needs, and career trends are discussed. Three basic work attitudes exist. Some see work as punishment. Others believe work in itself is good, i.e., they have a strong work ethic. Some view work as a means to satisfy, at least partially, a range of needs. Attitudinal transition points are likely to occur at specific times in the adult life cycle. The stages of the life cycle can be labeled as leaving, reaching out, questioning, midlife crisis, settling down, and mellowing. A progression through each of these stages is required for normal adult psychological development. Every individual exhibits a blend of needs that changes throughout life. Jobs can fulfill existence, relatedness, and growth needs. Relatedness needs include the need for love, affiliation, social esteem, and power, and growth needs include the need for self-esteem, competence, achievement, and autonomy. Three important career trends are the changing opportunities for advancement, women in careers, and dual-career couples. The number of women pharmacists is increasing as is the number of two-career couples. Tips for managing two-career relationships are presented. Pharmacists can manage their careers more effectively by understanding their needs, identifying their basic attitude toward work, and being aware of the trends occurring in pharmacy.

  16. Cluster Flow: A user-friendly bioinformatics workflow tool [version 1; referees: 3 approved

    Directory of Open Access Journals (Sweden)

    Philip Ewels

    2016-12-01

    Full Text Available Pipeline tools are becoming increasingly important within the field of bioinformatics. Using a pipeline manager to manage and run workflows composed of multiple tools reduces workload and makes analysis results more reproducible. Existing tools require significant work to install and get running, typically needing pipeline scripts to be written from scratch before running any analysis. We present Cluster Flow, a simple and flexible bioinformatics pipeline tool designed to be quick and easy to install. Cluster Flow comes with 40 modules for common NGS processing steps, ready to work out of the box. Pipelines are assembled using these modules with a simple syntax that can be easily modified as required. Core helper functions automate many common NGS procedures, making running pipelines simple. Cluster Flow is available under a GNU GPLv3 license on GitHub. Documentation, examples and an online demo are available at http://clusterflow.io.
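
    The module-chaining idea behind pipeline managers such as Cluster Flow can be illustrated with a minimal, generic sketch. The code below is not Cluster Flow's actual module format or syntax (Cluster Flow modules are standalone scripts assembled by its command-line tool); it only shows, under that caveat, how a pipeline can be expressed as an ordered list of small, reusable processing steps.

```python
# Generic illustration of a pipeline as an ordered chain of modules.
# This is NOT Cluster Flow's real implementation; step names are placeholders.
from typing import Callable, List

Module = Callable[[List[str]], List[str]]  # maps input files to output files


def fastqc(files: List[str]) -> List[str]:
    return files  # quality control produces reports, passes files through


def trim(files: List[str]) -> List[str]:
    return [f.replace(".fastq.gz", ".trimmed.fastq.gz") for f in files]


def align(files: List[str]) -> List[str]:
    return [f.replace(".fastq.gz", ".bam") for f in files]


def run_pipeline(steps: List[Module], inputs: List[str]) -> List[str]:
    for step in steps:  # each module consumes the previous module's outputs
        inputs = step(inputs)
    return inputs


print(run_pipeline([fastqc, trim, align], ["sample1.fastq.gz"]))
```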

  17. A Process Management System for Networked Manufacturing

    Science.gov (United States)

    Liu, Tingting; Wang, Huifen; Liu, Linyan

    With the development of computer, communication and network technologies, networked manufacturing has become one of the main manufacturing paradigms of the 21st century. In a networked manufacturing environment, there exist a large number of cooperative tasks susceptible to alterations, conflicts caused by resources, and problems of cost and quality, which increases the complexity of administration. Process management is a technology used to design, enact, control, and analyze networked manufacturing processes. It supports efficient execution, effective management, conflict resolution, cost containment and quality control. In this paper we propose an integrated process management system for networked manufacturing. Requirements of process management are analyzed, the architecture of the system is presented, and a process model considering process cost and quality is developed. Finally, a case study is provided to explain how the system runs efficiently.

  18. Method for Business Process Management System Selection

    OpenAIRE

    Westelaken, van de, Thijs; Terwee, Bas; Ravesteijn, Pascal

    2013-01-01

    In recent years business process management (BPM), and specifically information systems that support the analysis, design and execution of processes (also called business process management systems, BPMS), have been getting more attention. This has led to an increase in research on BPM and BPMS. However, the research on BPMS is mostly focused on the architecture of the system and how to implement such systems. How to select a BPM system that fits the strategy and goals of a specific organization is ...

  19. Bioinformatics Education in Pathology Training: Current Scope and Future Direction

    Directory of Open Access Journals (Sweden)

    Michael R Clay

    2017-04-01

    Full Text Available Training anatomic and clinical pathology residents in the principles of bioinformatics is a challenging endeavor. Most residents receive little to no formal exposure to bioinformatics during medical education, and most of the pathology training is spent interpreting histopathology slides using light microscopy or focused on laboratory regulation, management, and interpretation of discrete laboratory data. At a minimum, residents should be familiar with data structure, data pipelines, data manipulation, and data regulations within clinical laboratories. Fellowship-level training should incorporate advanced principles unique to each subspecialty. Barriers to bioinformatics education include the clinical apprenticeship training model, ill-defined educational milestones, inadequate faculty expertise, and limited exposure during medical training. Online educational resources, case-based learning, and incorporation into molecular genomics education could serve as effective educational strategies. Overall, pathology bioinformatics training can be incorporated into pathology resident curricula, provided there is motivation to incorporate, institutional support, educational resources, and adequate faculty expertise.

  20. THE ANALYSIS OF RISK MANAGEMENT PROCESS WITHIN MANAGEMENT

    Directory of Open Access Journals (Sweden)

    ROMANESCU MARCEL LAURENTIU

    2016-10-01

    Full Text Available This article highlights risk analysis within management, focusing on how a company could practically integrate risk management into its existing leadership process. Subsequently, the way to manage risk effectively is exemplified, which gives numerous advantages to all firms, including improving their decision-making process. All this leads to the conclusion that the degree of risk specific to companies is very high, but if managers make the best decisions they can diminish it, so that business activity and income are not influenced by factors that could disturb them in a negative way.

  1. Bioinformatics on the Cloud Computing Platform Azure

    Science.gov (United States)

    Shanahan, Hugh P.; Owen, Anne M.; Harrison, Andrew P.

    2014-01-01

    We discuss the applicability of the Microsoft cloud computing platform, Azure, for bioinformatics. We focus on the usability of the resource rather than its performance. We provide an example of how R can be used on Azure to analyse a large amount of microarray expression data deposited at the public database ArrayExpress. We provide a walk through to demonstrate explicitly how Azure can be used to perform these analyses in Appendix S1 and we offer a comparison with a local computation. We note that the use of the Platform as a Service (PaaS) offering of Azure can represent a steep learning curve for bioinformatics developers who will usually have a Linux and scripting language background. On the other hand, the presence of an additional set of libraries makes it easier to deploy software in a parallel (scalable) fashion and explicitly manage such a production run with only a few hundred lines of code, most of which can be incorporated from a template. We propose that this environment is best suited for running stable bioinformatics software by users not involved with its development. PMID:25050811
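
    The scalable, parallel deployment pattern mentioned above (independent analyses fanned out over chunks of a large expression matrix) can be sketched generically. The example below uses only the Python standard library rather than the authors' R code or Azure's libraries, and the per-chunk analysis is a deliberately trivial placeholder.

```python
# Generic scatter/gather sketch: split an expression matrix into chunks and
# analyse the chunks in parallel. Placeholder analysis, not the paper's code.
from concurrent.futures import ProcessPoolExecutor
from statistics import mean
from typing import Dict, List


def summarise(chunk: Dict[str, List[float]]) -> Dict[str, float]:
    """Placeholder per-chunk analysis: mean expression value per probe."""
    return {probe: mean(values) for probe, values in chunk.items()}


def split(matrix: Dict[str, List[float]], n_chunks: int) -> List[Dict[str, List[float]]]:
    probes = list(matrix)
    size = max(1, len(probes) // n_chunks)
    return [{p: matrix[p] for p in probes[i:i + size]}
            for i in range(0, len(probes), size)]


if __name__ == "__main__":
    matrix = {f"probe_{i}": [float(i), float(i) + 1.0] for i in range(8)}
    with ProcessPoolExecutor() as pool:
        results: Dict[str, float] = {}
        for partial in pool.map(summarise, split(matrix, 4)):
            results.update(partial)
    print(results)
```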

  2. Process Management Practices In Healthcare Institutions

    Directory of Open Access Journals (Sweden)

    Şükrü Kılıç

    2015-09-01

    Full Text Available Healthcare institutions differ from other service businesses by their “matrix organizational structure” and “error-free output” requirement. However, the processes stay the same for all organizational activities at different levels. One of the post-modern management approaches is to focus on the necessary processes and fundamental organizational changes. This case study aims first to explain the characteristics of healthcare institutions and the basic conceptual properties of process and process management. Then the effect of the “management through processes approach” on the organization will be discussed. Finally, process management at healthcare institutions, the scope of health care and examples of other post-modern approaches will be examined together with their outputs.

  3. The growing need for microservices in bioinformatics

    Directory of Open Access Journals (Sweden)

    Christopher L Williams

    2016-01-01

    Full Text Available Objective: Within the information technology (IT industry, best practices and standards are constantly evolving and being refined. In contrast, computer technology utilized within the healthcare industry often evolves at a glacial pace, with reduced opportunities for justified innovation. Although the use of timely technology refreshes within an enterprise′s overall technology stack can be costly, thoughtful adoption of select technologies with a demonstrated return on investment can be very effective in increasing productivity and at the same time, reducing the burden of maintenance often associated with older and legacy systems. In this brief technical communication, we introduce the concept of microservices as applied to the ecosystem of data analysis pipelines. Microservice architecture is a framework for dividing complex systems into easily managed parts. Each individual service is limited in functional scope, thereby conferring a higher measure of functional isolation and reliability to the collective solution. Moreover, maintenance challenges are greatly simplified by virtue of the reduced architectural complexity of each constitutive module. This fact notwithstanding, rendered overall solutions utilizing a microservices-based approach provide equal or greater levels of functionality as compared to conventional programming approaches. Bioinformatics, with its ever-increasing demand for performance and new testing algorithms, is the perfect use-case for such a solution. Moreover, if promulgated within the greater development community as an open-source solution, such an approach holds potential to be transformative to current bioinformatics software development. Context: Bioinformatics relies on nimble IT framework which can adapt to changing requirements. Aims: To present a well-established software design and deployment strategy as a solution for current challenges within bioinformatics Conclusions: Use of the microservices framework
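
    A minimal example makes the "limited functional scope" idea above concrete: one small service, one narrowly defined endpoint. The sketch below assumes Flask is installed; the endpoint name and payload format are invented here and are not taken from the article.

```python
# Hedged sketch of a single-purpose bioinformatics microservice using Flask.
from flask import Flask, jsonify, request

app = Flask(__name__)


@app.route("/gc-content", methods=["POST"])
def gc_content():
    """One narrowly scoped service: GC fraction of a posted DNA sequence."""
    seq = request.get_json(force=True).get("sequence", "").upper()
    if not seq:
        return jsonify(error="no sequence supplied"), 400
    gc = sum(base in "GC" for base in seq) / len(seq)
    return jsonify(sequence_length=len(seq), gc_fraction=round(gc, 4))


if __name__ == "__main__":
    # Each microservice runs as its own small process, typically behind an
    # API gateway or service registry that composes the overall pipeline.
    app.run(port=5001)
```

    A pipeline is then assembled by composing such services over HTTP, so an individual module can be replaced or scaled without touching the rest of the system.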

  4. The growing need for microservices in bioinformatics.

    Science.gov (United States)

    Williams, Christopher L; Sica, Jeffrey C; Killen, Robert T; Balis, Ulysses G J

    2016-01-01

    Within the information technology (IT) industry, best practices and standards are constantly evolving and being refined. In contrast, computer technology utilized within the healthcare industry often evolves at a glacial pace, with reduced opportunities for justified innovation. Although the use of timely technology refreshes within an enterprise's overall technology stack can be costly, thoughtful adoption of select technologies with a demonstrated return on investment can be very effective in increasing productivity and at the same time, reducing the burden of maintenance often associated with older and legacy systems. In this brief technical communication, we introduce the concept of microservices as applied to the ecosystem of data analysis pipelines. Microservice architecture is a framework for dividing complex systems into easily managed parts. Each individual service is limited in functional scope, thereby conferring a higher measure of functional isolation and reliability to the collective solution. Moreover, maintenance challenges are greatly simplified by virtue of the reduced architectural complexity of each constitutive module. This fact notwithstanding, rendered overall solutions utilizing a microservices-based approach provide equal or greater levels of functionality as compared to conventional programming approaches. Bioinformatics, with its ever-increasing demand for performance and new testing algorithms, is the perfect use-case for such a solution. Moreover, if promulgated within the greater development community as an open-source solution, such an approach holds potential to be transformative to current bioinformatics software development. Bioinformatics relies on nimble IT framework which can adapt to changing requirements. To present a well-established software design and deployment strategy as a solution for current challenges within bioinformatics. Use of the microservices framework is an effective methodology for the fabrication and

  5. The growing need for microservices in bioinformatics

    Science.gov (United States)

    Williams, Christopher L.; Sica, Jeffrey C.; Killen, Robert T.; Balis, Ulysses G. J.

    2016-01-01

    Objective: Within the information technology (IT) industry, best practices and standards are constantly evolving and being refined. In contrast, computer technology utilized within the healthcare industry often evolves at a glacial pace, with reduced opportunities for justified innovation. Although the use of timely technology refreshes within an enterprise's overall technology stack can be costly, thoughtful adoption of select technologies with a demonstrated return on investment can be very effective in increasing productivity and at the same time, reducing the burden of maintenance often associated with older and legacy systems. In this brief technical communication, we introduce the concept of microservices as applied to the ecosystem of data analysis pipelines. Microservice architecture is a framework for dividing complex systems into easily managed parts. Each individual service is limited in functional scope, thereby conferring a higher measure of functional isolation and reliability to the collective solution. Moreover, maintenance challenges are greatly simplified by virtue of the reduced architectural complexity of each constitutive module. This fact notwithstanding, rendered overall solutions utilizing a microservices-based approach provide equal or greater levels of functionality as compared to conventional programming approaches. Bioinformatics, with its ever-increasing demand for performance and new testing algorithms, is the perfect use-case for such a solution. Moreover, if promulgated within the greater development community as an open-source solution, such an approach holds potential to be transformative to current bioinformatics software development. Context: Bioinformatics relies on nimble IT framework which can adapt to changing requirements. Aims: To present a well-established software design and deployment strategy as a solution for current challenges within bioinformatics Conclusions: Use of the microservices framework is an effective

  6. Relational XES: Data management for process mining

    NARCIS (Netherlands)

    Dongen, van B.F.; Shabani, S.; Grabis, J.; Sandkuhl, K.

    2015-01-01

    Information systems log data during the execution of business processes in so called "event logs". Process mining aims to improve business processes by extracting knowledge from event logs. Currently, the de-facto standard for storing and managing event data, XES, is tailored towards sequential
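
    The core idea of storing event data relationally instead of in monolithic XML files can be sketched with a single table keyed by the standard XES attributes (case identifier, activity name, timestamp). The schema below is a simplified illustration, assuming SQLite, and is not the actual RXES schema from the paper.

```python
# Minimal sketch of a relational event log for process mining (not RXES itself).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE event (
        case_id   TEXT,   -- the trace (process instance) the event belongs to
        activity  TEXT,   -- concept:name in XES terms
        timestamp TEXT    -- time:timestamp in XES terms
    )""")
conn.executemany(
    "INSERT INTO event VALUES (?, ?, ?)",
    [("order-1", "register request", "2015-01-05T09:00:00"),
     ("order-1", "check ticket", "2015-01-05T10:30:00"),
     ("order-1", "decide", "2015-01-06T11:00:00")])

# Reconstructing a trace in temporal order is the basic access pattern that
# discovery and conformance-checking algorithms rely on.
for (activity,) in conn.execute(
        "SELECT activity FROM event WHERE case_id = ? ORDER BY timestamp",
        ("order-1",)):
    print(activity)
```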

  7. Relational XES : data management for process mining

    NARCIS (Netherlands)

    Dongen, van B.F.; Shabani, S.

    2015-01-01

    Information systems log data during the execution of business processes in so called "event logs". Process mining aims to improve business processes by extracting knowledge from event logs. Currently, the de-facto standard for storing and managing event data, XES, is tailored towards sequential

  8. PubData: search engine for bioinformatics databases worldwide

    OpenAIRE

    Vand, Kasra; Wahlestedt, Thor; Khomtchouk, Kelly; Sayed, Mohammed; Wahlestedt, Claes; Khomtchouk, Bohdan

    2016-01-01

    We propose a search engine and file retrieval system for all bioinformatics databases worldwide. PubData searches biomedical data in a user-friendly fashion similar to how PubMed searches biomedical literature. PubData is built on novel network programming, natural language processing, and artificial intelligence algorithms that can patch into the file transfer protocol servers of any user-specified bioinformatics database, query its contents, retrieve files for download, and adapt to the use...

  9. MOWServ: a web client for integration of bioinformatic resources

    Science.gov (United States)

    Ramírez, Sergio; Muñoz-Mérida, Antonio; Karlsson, Johan; García, Maximiliano; Pérez-Pulido, Antonio J.; Claros, M. Gonzalo; Trelles, Oswaldo

    2010-01-01

    The productivity of any scientist is affected by cumbersome, tedious and time-consuming tasks that try to make the heterogeneous web services compatible so that they can be useful in their research. MOWServ, the bioinformatic platform offered by the Spanish National Institute of Bioinformatics, was released to provide integrated access to databases and analytical tools. Since its release, the number of available services has grown dramatically, and it has become one of the main contributors of registered services in the EMBRACE Biocatalogue. The ontology that enables most of the web-service compatibility has been curated, improved and extended. The service discovery has been greatly enhanced by Magallanes software and biodataSF. User data are securely stored on the main server by an authentication protocol that enables the monitoring of current or already-finished user’s tasks, as well as the pipelining of successive data processing services. The BioMoby standard has been greatly extended with the new features included in the MOWServ, such as management of additional information (metadata such as extended descriptions, keywords and datafile examples), a qualified registry, error handling, asynchronous services and service replication. All of them have increased the MOWServ service quality, usability and robustness. MOWServ is available at http://www.inab.org/MOWServ/ and has a mirror at http://www.bitlab-es.com/MOWServ/. PMID:20525794

  10. Integrating Process Management with Archival Management Systems: Lessons Learned

    Directory of Open Access Journals (Sweden)

    J. Gordon Daines, III

    2009-03-01

    Full Text Available The Integrated Digital Special Collections (INDI) system is a prototype of a database-driven Web application designed to automate and manage archival workflow for large institutions and consortia. This article discusses how the INDI project enabled the successful implementation of a process to manage large technology projects in the Harold B. Lee Library at Brigham Young University. It highlights how the scope of these technology projects is set and how the major deliverables for each project are defined. The article also talks about how the INDI system followed the process and still failed to be completed. It examines why the process itself is successful and why the INDI project failed. It further underscores the importance of process management in archival management systems.

  11. Socialization as key process in knowledge management

    Directory of Open Access Journals (Sweden)

    Francisco José GARCÍA-PEÑALVO

    2016-07-01

    Full Text Available The editorial of this second issue of volume 17, corresponding to 2016, is devoted to the socialization process in knowledge management, in order to complement the special section about Social Networks and Education.

  12. Crew Transportation Technical Management Processes

    Science.gov (United States)

    Mckinnie, John M. (Compiler); Lueders, Kathryn L. (Compiler)

    2013-01-01

    Under the guidance of processes provided by Crew Transportation Plan (CCT-PLN-1100), this document, with its sister documents, International Space Station (ISS) Crew Transportation and Services Requirements Document (CCT-REQ-1130), Crew Transportation Technical Standards and Design Evaluation Criteria (CCT-STD-1140), Crew Transportation Operations Standards (CCT STD-1150), and ISS to Commercial Orbital Transportation Services Interface Requirements Document (SSP 50808), provides the basis for a National Aeronautics and Space Administration (NASA) certification for services to the ISS for the Commercial Provider. When NASA Crew Transportation System (CTS) certification is achieved for ISS transportation, the Commercial Provider will be eligible to provide services to and from the ISS during the services phase.

  13. Investigating road safety management processes in Europe.

    NARCIS (Netherlands)

    Jähi, H. Muhlrad, N. Buttler, I. Gitelman, V. Bax, C. Dupont, E. Giustiniani, G. Machata, K. Martensen, H. Papadimitriou, E. Persia, L. Talbot, R. Vallet, G. & Yannis, G.

    2012-01-01

    The work package 1 of the EC FP7 project DaCoTA investigates road safety management processes in Europe. It has drafted a model to investigate the state of the art of road safety policy-making and management at the national level and to define “good practice”. The DaCoTA “good practice”

  14. Building an Identity Management Governance Process

    Science.gov (United States)

    Berg, Joanne E.; Kraemer, Ron; Raatz, Carla; Devoti, Steve

    2009-01-01

    A particular challenge in any campus environment is determining how requests for access to services and resources are managed. Who decides the technology, infrastructure, policy, business process and procedure? The involvement of key institutional leaders and stakeholders in identity management governance is the driving force behind the way the…

  15. The Management-Business Process: Cultural Considerations.

    Science.gov (United States)

    Ruiz, Reynaldo

    The effect of culture on the business management process in a Hispanic setting is explored for the benefit of persons in business in Latin America or with Hispanic groups in the United States. Understanding of cultural differences is important for business managers who work with Spanish speaking employees or clients because of the wide-ranging and…

  16. Assessment of Navy Contract Management Processes

    Science.gov (United States)

    2016-04-30

    Survey-based assessment of Navy contract management processes, framed by a Contract Management Maturity Model. Recoverable details from the record: 85% of quality problems are related to processes, while only 15% of problems are controlled by individual workers (Deming, 1986); eligible participants: 369; total surveys completed: 185; response rate: 50%. At maturity level 5 of the Contract Management Maturity Model (Measurement), organizations systematically use performance metrics to measure the quality and evaluate the effectiveness of contract management.

  17. Generalized Centroid Estimators in Bioinformatics

    Science.gov (United States)

    Hamada, Michiaki; Kiryu, Hisanori; Iwasaki, Wataru; Asai, Kiyoshi

    2011-01-01

    In a number of estimation problems in bioinformatics, accuracy measures of the target problem are usually given, and it is important to design estimators that are suited to those accuracy measures. However, there is often a discrepancy between the estimator employed and the given accuracy measure of the problem. In this study, we introduce a general class of efficient estimators for estimation problems on high-dimensional binary spaces, which represent many fundamental problems in bioinformatics. Theoretical analysis reveals that the proposed estimators generally fit commonly used accuracy measures (e.g. sensitivity, PPV, MCC and F-score), can be computed efficiently in many cases, and cover a wide range of problems in bioinformatics from the viewpoint of the principle of maximum expected accuracy (MEA). It is also shown that some important algorithms in bioinformatics can be interpreted in a unified manner. Not only does the concept presented in this paper give a useful framework for designing MEA-based estimators, but it is also highly extendable and sheds new light on many problems in bioinformatics. PMID:21365017
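
    The maximum expected accuracy principle referred to above can be stated compactly. The notation below is generic rather than copied from the paper: y ranges over a high-dimensional binary space, G is the gain (accuracy) function chosen to match the evaluation measure, and the expectation is taken under the model's distribution over the unknown true configuration.

```latex
% Generic form of a maximum-expected-accuracy (MEA) estimator on a binary space
% Y \subseteq \{0,1\}^n, with gain function G and posterior p(\theta \mid D):
\hat{y} \;=\; \operatorname*{arg\,max}_{y \in Y}\;
  \mathbb{E}_{p(\theta \mid D)}\bigl[\, G(\theta, y) \,\bigr]
\;=\; \operatorname*{arg\,max}_{y \in Y} \sum_{\theta} G(\theta, y)\, p(\theta \mid D).
% The simple centroid estimator is the special case in which G counts the
% positions where \theta and y agree; richer gain functions align the estimator
% with measures such as sensitivity, PPV, MCC or the F-score.
```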

  18. Using Intelligent Agents to Manage Business Processes

    OpenAIRE

    Jennings, N. R.; Faratin, P.; Johnson, M. J.; O'Brien, P.; Wiegand, M. E.

    1996-01-01

    Management of the business process requires pertinent, consistent and up-to-date information gathering and information dissemination. These complex and time consuming tasks prompt organizations to develop an Information Technology system to assist with the management of various aspects of their business processes. Intelligent agents are the strongest solution candidates because of their many advantages, namely: autonomy, social ability, responsiveness and proactiveness. Given these characteri...

  19. Chapter 16: text mining for translational bioinformatics.

    Science.gov (United States)

    Cohen, K Bretonnel; Hunter, Lawrence E

    2013-04-01

    Text mining for translational bioinformatics is a new field with tremendous research potential. It is a subfield of biomedical natural language processing that concerns itself directly with the problem of relating basic biomedical research to clinical practice, and vice versa. Applications of text mining fall both into the category of T1 translational research-translating basic science results into new interventions-and T2 translational research, or translational research for public health. Potential use cases include better phenotyping of research subjects, and pharmacogenomic research. A variety of methods for evaluating text mining applications exist, including corpora, structured test suites, and post hoc judging. Two basic principles of linguistic structure are relevant for building text mining applications. One is that linguistic structure consists of multiple levels. The other is that every level of linguistic structure is characterized by ambiguity. There are two basic approaches to text mining: rule-based, also known as knowledge-based; and machine-learning-based, also known as statistical. Many systems are hybrids of the two approaches. Shared tasks have had a strong effect on the direction of the field. Like all translational bioinformatics software, text mining software for translational bioinformatics can be considered health-critical and should be subject to the strictest standards of quality assurance and software testing.
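
    A toy example makes the rule-based half of that dichotomy concrete. The pattern below is a single hand-written rule for gene-symbol-like tokens, invented for illustration only; real systems combine many such rules, or replace them with trained statistical models, precisely because simple patterns are ambiguous (this one would also match abbreviations such as "DNA").

```python
# Toy rule-based matcher for gene-symbol-like mentions (illustration only).
import re

# An uppercase letter followed by 2-7 uppercase letters or digits, e.g. BRCA1.
GENE_LIKE = re.compile(r"\b[A-Z][A-Z0-9]{2,7}\b")

sentence = ("Germline mutations in BRCA1 and TP53 are associated with "
            "hereditary cancer syndromes.")

print(GENE_LIKE.findall(sentence))  # ['BRCA1', 'TP53']
```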

  20. Improvement of Construction Project Management Processes

    Directory of Open Access Journals (Sweden)

    Nazarko, J.

    2017-07-01

    Full Text Available The common denominator of the five papers published in the current edition of the Journal of Engineering, Project, and Production Management is the improvement of construction project management processes for effective use of resources. Execution of proper project management processes is widely recognized as a key success factor influencing the likelihood of project success (Alleman, 2014). It is noticeable that four out of five papers in this issue of the Journal are authored or co-authored by Iranian researchers from the same institute, but their conclusions bear importance that cannot be limited to the authors’ region.

  1. From Project Management to Process Management - Effectively Organising Transdisciplinary Projects

    OpenAIRE

    Moschitz, Heidrun

    2013-01-01

    In transdisciplinary projects, the roles of researchers change. In addition to being a source of knowledge, they are required to engage in knowledge exchange processes. This results in an alteration at project level: researchers need to creatively manage projects as group processes.

  2. Bioinformatics Training Network (BTN): a community resource for bioinformatics trainers

    DEFF Research Database (Denmark)

    Schneider, Maria V.; Walter, Peter; Blatter, Marie-Claude

    2012-01-01

    and clearly tagged in relation to target audiences, learning objectives, etc. Ideally, they would also be peer reviewed, and easily and efficiently accessible for downloading. Here, we present the Bioinformatics Training Network (BTN), a new enterprise that has been initiated to address these needs and review...

  3. Implementing process safety management in gas processing operations

    International Nuclear Information System (INIS)

    Rodman, D.L.

    1992-01-01

    The Occupational Safety and Health Administration (OSHA) standard entitled Process Safety Management of Highly Hazardous Chemicals; Explosives and Blasting Agents was finalized February 24, 1992. The purpose of the standard is to prevent or minimize consequences of catastrophic releases of toxic, flammable, or explosive chemicals. OSHA believes that its rule will accomplish this goal by requiring a comprehensive management program that integrates technologies, procedures, and management practices. Gas Processors Association (GPA) member companies are significantly impacted by this major standard, the requirements of which are extensive and complex. The purpose of this paper is to review the requirements of the standard and to discuss the elements to consider in developing and implementing a viable long term Process Safety Management Program

  4. Demand-side management process evaluations - the management perspective

    International Nuclear Information System (INIS)

    Perrault, G.A.; Barrett, L.B.

    1993-01-01

    A demand-side management (DSM) process evaluation is a qualitative, expert assessment of how a utility marketing program is being conducted. It reviews the efficiency and effectiveness with which a utility plans, manages, executes, and monitors the delivery of DSM programs to its marketplace. Process evaluations, which include load impact, customer satisfaction and cost-effectiveness analysis, are becoming an increasingly significant component. The process evaluation focus is on the program planning and delivery process as opposed to the energy impacts resulting from the specific measures or products of the program. Because of this process-oriented focus, such evaluations can identify important opportunities for improving the cost-effectiveness of a program without significantly changing product lines. The evaluation may identify administrative or delivery process improvements. In addition, the evaluation may identify ways of improving the degree to which the customer is satisfied with the program or the utility. Since process evaluations are usually conducted as part of a utility's mandated DSM measurement and evaluation plan, they tend to focus mainly on the stated needs of the regulator as opposed to company management. This can be a problem. Although the regulatory perspective is important, in an increasingly competitive business environment, utilities must not overlook management's business and operational needs for specific information regarding DSM program planning, control, execution, and evaluation. This paper discusses some of the conflicts that exist between the regulator's and management's needs for DSM program evaluation results and presents some approaches for assuring that both needs are met. It is organized to first discuss the scope of a process evaluation, then the evaluation issues, the management concerns, and finally the reporting of results.

  5. GREEN BUSINESS PROCESS MANAGEMENT: A RESEARCH AGENDA

    Directory of Open Access Journals (Sweden)

    Aditya Ghose

    2010-01-01

    Full Text Available There is a global consensus on the need to reduce our collective carbon footprint. While much research attention has focused on developing alternative energy sources, automotive technologies or waste disposal techniques, we often ignore the fact that the ability to optimize (existing) operations to reduce their emissions impact is fundamental to this exercise. Business process management (BPM) technology, with its focus on understanding, modelling and improving/optimizing business processes, is a key starting point. Process modelling technology has applications beyond what we would traditionally describe as business processes - we can also model and improve manufacturing and other "physical" processes. This paper describes the contours of the emerging research landscape in green business process management and presents some early results in this area.

  6. A portfolio management system in the strategic management process

    Directory of Open Access Journals (Sweden)

    Qifan Huang

    2017-02-01

    Full Text Available Strategic management is the process of understanding “what we are” and deciding and implementing “what we intend to be and how we are going to get there.” Strategy describes how an organization intends to compete with the resources available in the existing and perceived future environment. “Project management” is ancient yet still emerging: our ancestors left numerous miracles with us, including the pyramids, the statue of Zeus and the Lighthouse of Alexandria, which are brilliant pages in the history of project management, and today project management is a branch of the discipline of management. A project is a plan to solve a problem and effectively accomplish an established goal, so strategic management must be applied to the project. Policy management is the means of achieving its objectives, including planning, implementation and control processes. Strategic management gathers staff with different functions to form the project team. Because the team combines a variety of functions, management should be flexible, adapted to each function and responsive to the changing internal and external environment. Managing functions means managing roles and performance; it is not about punishment, but about channelling and guiding. The difficulty lies not in the occurrence and discovery of issues, but in finding solutions to problems through observation and recommendations. Functional managers need to recognize their own responsibilities and help the company develop more long-term, rational thinking.

  7. Development of advanced spent fuel management process. System analysis of advanced spent fuel management process

    International Nuclear Information System (INIS)

    Ro, S.G.; Kang, D.S.; Seo, C.S.; Lee, H.H.; Shin, Y.J.; Park, S.W.

    1999-03-01

    The system analysis of an advanced spent fuel management process, intended to establish a non-proliferation model for long-term spent fuel management, is performed by comparing several dry processes, such as a salt transport process, a lithium process, the IFR process developed in America, and the DDP developed in Russia. In our system analysis, the non-proliferation concept is focused on the separation factor between uranium and plutonium and the decontamination factors of products in each process, and a non-proliferation model for long-term spent fuel management has finally been introduced. (Author). 29 refs., 17 tabs., 12 figs

  8. Peer Mentoring for Bioinformatics presentation

    OpenAIRE

    Budd, Aidan

    2014-01-01

    A handout used in a HUB (Heidelberg Unseminars in Bioinformatics) meeting focused on career development for bioinformaticians. It describes an activity used to help introduce the idea of peer mentoring, potentially acting as an opportunity to create peer-mentoring groups.

  9. Reproducible Bioinformatics Research for Biologists

    Science.gov (United States)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  10. Taking Bioinformatics to Systems Medicine

    NARCIS (Netherlands)

    van Kampen, Antoine H. C.; Moerland, Perry D.

    2016-01-01

    Systems medicine promotes a range of approaches and strategies to study human health and disease at a systems level with the aim of improving the overall well-being of (healthy) individuals, and preventing, diagnosing, or curing disease. In this chapter we discuss how bioinformatics critically

  11. Bioinformatics and the Undergraduate Curriculum

    Science.gov (United States)

    Maloney, Mark; Parker, Jeffrey; LeBlanc, Mark; Woodard, Craig T.; Glackin, Mary; Hanrahan, Michael

    2010-01-01

    Recent advances involving high-throughput techniques for data generation and analysis have made familiarity with basic bioinformatics concepts and programs a necessity in the biological sciences. Undergraduate students increasingly need training in methods related to finding and retrieving information stored in vast databases. The rapid rise of…

  12. Bioinformatics of genomic association mapping

    NARCIS (Netherlands)

    Vaez Barzani, Ahmad

    2015-01-01

    In this thesis we present an overview of bioinformatics-based approaches for genomic association mapping, with emphasis on human quantitative traits and their contribution to complex diseases. We aim to provide a comprehensive walk-through of the classic steps of genomic association mapping

  13. Information processing among high-performance managers

    Directory of Open Access Journals (Sweden)

    S.C. Garcia-Santos

    2010-01-01

    Full Text Available The purpose of this study was to evaluate the information processing of 43 business managers with superior professional performance. The theoretical framework considers three models: Henry Mintzberg's Theory of Managerial Roles, the Theory of Information Processing, and John Exner's Rorschach Response Process Model. The participants were evaluated with the Rorschach method. The results show that these managers are able to collect data, evaluate them and establish rankings properly. At the same time, they are capable of being objective and accurate in assessing problems. This information-processing style permits an interpretation of the surrounding world on the basis of a very personal and characteristic processing mode, or cognitive style.

  14. RISK MANAGEMENT PROCESSES IN SUPPLY CHAINS

    Directory of Open Access Journals (Sweden)

    Aleksandar Aleksić

    2009-06-01

    Full Text Available One of the keys to successful business in recent years is dealing effectively with risks in every sense of the word. At a time when the world economic crisis largely limits business, successful risk management is the only way of survival for a large number of business systems. This paper presents the processes of risk management in supply chains that are in accordance with the standards ISO 28000 and ISO 31000. By implementing a holistic, enterprise-wide supply chain risk management program, companies can also uphold their commitment to providing strong corporate governance on behalf of stakeholders and increase their market value.

  15. Process management and controlling in diagnostic radiology

    International Nuclear Information System (INIS)

    Gocke, P.; Debatin, J.F.; Duerselen, L.F.J.

    2002-01-01

    Systematic process management and efficient quality control are rapidly gaining importance in our healthcare system. What does this mean for diagnostic radiology departments? To improve efficiency, quality and productivity, the workflow within the department of diagnostic and interventional radiology at the University Hospital of Essen was restructured over the last two years. Furthermore, a controlling system was established. One of the pursued aims was to create a quality management system as a basis for the subsequent certification according to the ISO EN 9001:2000 norm. Central to the success of the workflow reorganisation was the training of selected members of the department's staff in process and quality management theory. Thereafter, a dedicated working group was created to prepare the reorganisation and the subsequent ISO certification with the support of a consulting partner. To assure a smooth implementation of the restructured workflow and create acceptance for the required ISO-9001 documentation, the entire staff was familiarized with the basic ideas of process and quality management in several training sessions. This manuscript summarizes the basic concepts of process and quality management as they were taught to our staff. A direct relationship to diagnostic radiology is maintained throughout the text. (orig.) [de

  16. Ontological Model of Business Process Management Systems

    Science.gov (United States)

    Manoilov, G.; Deliiska, B.

    2008-10-01

    The activities which constitute business process management (BPM) can be grouped into five categories: design, modeling, execution, monitoring and optimization. Dedicated software packages for business process management systems (BPMS) are available on the market, but the efficiency of their exploitation depends on the ontological model used at development time and at run time of the system. In this article an ontological model of BPMS in the area of the software industry is investigated. The model building is preceded by conceptualization of the domain and a taxonomy of BPMS development. On the basis of the taxonomy, a simple online thesaurus is created.

  17. Bioinformatics for cancer immunotherapy target discovery

    DEFF Research Database (Denmark)

    Olsen, Lars Rønn; Campos, Benito; Barnkob, Mike Stein

    2014-01-01

    therapy target discovery in a bioinformatics analysis pipeline. We describe specialized bioinformatics tools and databases for three main bottlenecks in immunotherapy target discovery: the cataloging of potentially antigenic proteins, the identification of potential HLA binders, and the selection epitopes...

  18. Talent Management: Working lines and key processes

    Directory of Open Access Journals (Sweden)

    Alvaro Alonso

    2014-12-01

    ways are established talent management as the treatment of these dimensions. Furthermore, we conclude that any talent plan includes processes or phases: attraction, selection, identification, development and retention. Value: Unlike other proposals, we consider it necessary to incorporate an additional study around key positions in the organization. Thus, a characterization and cataloguing of the previous literature is provided, as well as a comparative analysis of heterogeneous definitions of talent and talent management. Furthermore, we propose that talent management is not just a tool for the implementation of the strategy, but is situated from the start of the strategic process, that is, from strategy formulation. Thus, we focus on the approach advocated by the third line of study.

  19. Teaching bioinformatics in concert.

    Directory of Open Access Journals (Sweden)

    Anya L Goodman

    2014-11-01

    Full Text Available Can biology students without programming skills solve problems that require computational solutions? They can if they learn to cooperate effectively with computer science students. The goal of the in-concert teaching approach is to introduce biology students to computational thinking by engaging them in collaborative projects structured around the software development process. Our approach emphasizes development of interdisciplinary communication and collaboration skills for both life science and computer science students.

  20. 77 FR 13585 - Electricity Subsector Cybersecurity Risk Management Process Guideline

    Science.gov (United States)

    2012-03-07

    ... DEPARTMENT OF ENERGY Electricity Subsector Cybersecurity Risk Management Process Guideline AGENCY... Electricity Subsector Cybersecurity Risk Management Process guideline. The guideline describes a risk... Cybersecurity Risk Management Process Guideline. The primary goal of this guideline is to describe a risk...

  1. Using color management in color document processing

    Science.gov (United States)

    Nehab, Smadar

    1995-04-01

    Color Management Systems have been used for several years in Desktop Publishing (DTP) environments. While this development hasn't matured yet, we are already experiencing the next generation of the color imaging revolution-Device Independent Color for the small office/home office (SOHO) environment. Though there are still open technical issues with device independent color matching, they are not the focal point of this paper. This paper discusses two new and crucial aspects in using color management in color document processing: the management of color objects and their associated color rendering methods; a proposal for a precedence order and handshaking protocol among the various software components involved in color document processing. As color peripherals become affordable to the SOHO market, color management also becomes a prerequisite for common document authoring applications such as word processors. The first color management solutions were oriented towards DTP environments whose requirements were largely different. For example, DTP documents are image-centric, as opposed to SOHO documents that are text and charts centric. To achieve optimal reproduction on low-cost SOHO peripherals, it is critical that different color rendering methods are used for the different document object types. The first challenge in using color management of color document processing is the association of rendering methods with object types. As a result of an evolutionary process, color matching solutions are now available as application software, as driver embedded software and as operating system extensions. Consequently, document processing faces a new challenge, the correct selection of the color matching solution while avoiding duplicate color corrections.
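
    The first challenge named above, associating rendering methods with object types, amounts to a small lookup that a driver or application consults per object. The mapping below is a plausible default in the spirit of ICC rendering intents, not a specification taken from the paper; the type names and choices are illustrative.

```python
# Illustrative mapping of document object types to ICC-style rendering intents.
RENDERING_INTENT = {
    "text": "saturation",               # keep text and line art crisp and vivid
    "chart": "saturation",              # business graphics favour vividness
    "photograph": "perceptual",         # preserve overall pictorial appearance
    "logo": "relative_colorimetric",    # reproduce brand colours as exactly as possible
}


def intent_for(object_type: str) -> str:
    """Pick a rendering intent for a document object, with a safe fallback."""
    return RENDERING_INTENT.get(object_type, "perceptual")


print(intent_for("photograph"))   # perceptual
print(intent_for("spreadsheet"))  # perceptual (fallback for unknown types)
```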

  2. A process framework for information security management

    Directory of Open Access Journals (Sweden)

    Knut Haufe

    2016-01-01

    Full Text Available Securing sensitive organizational data has become increasingly vital to organizations. An Information Security Management System (ISMS) is a systematic approach for establishing, implementing, operating, monitoring, reviewing, maintaining and improving an organization's information security. Key elements of the operation of an ISMS are ISMS processes. However, and in spite of their importance, an ISMS process framework with a description of ISMS processes and their interaction, as well as their interaction with other management processes, is not available in the literature. Cost-benefit analyses of information security investments, regarding both single measures protecting information and ISMS processes, are not the focus of current research, which concentrates mostly on economics. This article aims to fill this research gap by proposing such an ISMS process framework as the main contribution, based on a set of agreed-upon ISMS processes in existing standards like the ISO 27000 series, COBIT and ITIL. Within the framework, the identified processes are described and their interactions and interfaces are specified. This framework helps to focus on the operation of the ISMS, instead of focusing on measures and controls. By this, as a main finding, the systemic character of the ISMS, consisting of processes, and the perception of the relevant roles of the ISMS are strengthened.

  3. Process Management Practices In Healthcare Institutions

    OpenAIRE

    Şükrü Kılıç; Cumhur Aydınlı

    2015-01-01

    Healthcare institutions differ from other service businesses by their “matrix organizational structure” and “error-free output” requirement. However, the processes stay the same for all organizational activities at different levels. One of the post-modern management approaches is to focus on the necessary processes and fundamental organizational changes. This case study aims first to explain the characteristics of healthcare institutions and the ba...

  4. Preface to Introduction to Structural Bioinformatics

    NARCIS (Netherlands)

    Feenstra, K. Anton; Abeln, Sanne

    2018-01-01

    While many good textbooks are available on Protein Structure, Molecular Simulations, Thermodynamics and Bioinformatics methods in general, there is no good introductory level book for the field of Structural Bioinformatics. This book aims to give an introduction into Structural Bioinformatics, which

  5. KNOWLEDGE MANAGEMENT PROCESSES AND INTELLECTUAL PROPERTY MANAGEMENT PROCESSES: AN INTEGRATED CONCEPTUAL FRAMEWORK

    OpenAIRE

    HENAO-CALAD, MONICA; RIVERA-MONTOYA, PAULA; URIBE-OCHOA, BEATRIZ

    2017-01-01

    ABSTRACT Intellectual property management and knowledge management are disciplines that have been treated independently, both in academia and in the organizational field. Through the legal discipline of intellectual property, the former manages intangible assets that are eligible for protection (copyright, patents and trademarks, among others), leaving aside those assets that cannot be protected in this way. The latter is devoted to the processes of knowledge management in general, namely, the know...

  6. NEW PERSPECTIVES ON STRATEGIC MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    POP Zenovia Cristiana

    2013-07-01

    Full Text Available For developing economies, the development of enterprises should be a strategic goal; this way of thinking may become viable only as a result of a combination of judicious analysis based on specific local economic aspects and a set of managerial actions to correct any slippage or amplify existing development trends. Better leadership would unequivocally lead to a better strategy, but sometimes the lack of information (first of all about the external environment, which is continuously undergoing quick and radical changes), political problems, and the complexity and costs of implementing the strategy are not taken into consideration. Therefore managers have two options: to establish strategies that would lead to the achievement of the objectives and evaluate them on the basis of economic efficiency, or to identify an already existing strategy and adapt it to the changes in the environment in which the enterprise carries on its activity. This paper aims at discussing and explaining, from a theoretical perspective, the evolution of the strategic management process and its advantages and disadvantages, in order to offer managers a way to achieve competitiveness and evaluate the position of the firm. In the first section, we explain the necessity of the strategic management process. In the second section we present the different evolution stages. The third section presents our conclusions regarding the advantages and disadvantages of the strategic management process, fundamental for the strategy's success. The financial crisis did affect the Romanian economy and Romanian enterprises early in 2009; its impact is reflected in the need for managers to rethink their strategies, improve their management skills and reconsider their perspectives on the role of employees after the crisis. In this paper we try to underline the evolution stages of the strategic management process with its own characteristics by which both

  7. Implementation of a formulary management process.

    Science.gov (United States)

    Karel, Lauren I; Delisle, Dennis R; Anagnostis, Ellena A; Wordell, Cindy J

    2017-08-15

    The application of lean methodology in an initiative to redesign the formulary maintenance process at an academic medical center is described. Maintaining a hospital formulary requires clear communication and coordination among multiple members of the pharmacy department. Using principles of lean methodology, pharmacy department personnel within a multihospital health system launched a multifaceted initiative to optimize formulary management systemwide. The ongoing initiative began with creation of a formulary maintenance redesign committee consisting of pharmacy department personnel with expertise in informatics, automation, purchasing, drug information, and clinical pharmacy services. The committee met regularly and used lean methodology to design a standardized process for management of formulary additions and deletions and changes to medications' formulary status. Through value stream analysis, opportunities for process and performance improvement were identified; staff suggestions on process streamlining were gathered during a series of departmental kaizen events. A standardized template for development and dissemination of monographs associated with formulary additions and status changes was created. In addition, a shared Web-based checklist was developed to facilitate information sharing and timely initiation and completion of tasks involved in formulary status changes, and a permanent formulary maintenance committee was established to monitor and refine the formulary management process. A clearly defined, standardized process within the pharmacy department was developed for tracking necessary steps in enacting formulary changes to encourage safe and efficient workflow. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  8. Modern bioinformatics meets traditional Chinese medicine.

    Science.gov (United States)

    Gu, Peiqin; Chen, Huajun

    2014-11-01

    Traditional Chinese medicine (TCM) is gaining increasing attention with the emergence of integrative medicine and personalized medicine, characterized by pattern differentiation on individual variance and treatments based on natural herbal synergism. Investigating the effectiveness and safety of the potential mechanisms of TCM and the combination principles of drug therapies will bridge the cultural gap with Western medicine and improve the development of integrative medicine. Dealing with rapidly growing amounts of biomedical data and their heterogeneous nature are two important tasks among modern biomedical communities. Bioinformatics, as an emerging interdisciplinary field of computer science and biology, has become a useful tool for easing the data deluge pressure by automating the computation processes with informatics methods. Using these methods to retrieve, store and analyze the biomedical data can effectively reveal the associated knowledge hidden in the data, and thus promote the discovery of integrated information. Recently, these techniques of bioinformatics have been used for facilitating the interactional effects of both Western medicine and TCM. The analysis of TCM data using computational technologies provides biological evidence for the basic understanding of TCM mechanisms, safety and efficacy of TCM treatments. At the same time, the carrier and targets associated with TCM remedies can inspire the rethinking of modern drug development. This review summarizes the significant achievements of applying bioinformatics techniques to many aspects of the research in TCM, such as analysis of TCM-related '-omics' data and techniques for analyzing biological processes and pharmaceutical mechanisms of TCM, which have shown certain potential of bringing new thoughts to both sides. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  9. Making Bioinformatics Projects a Meaningful Experience in an Undergraduate Biotechnology or Biomedical Science Programme

    Science.gov (United States)

    Sutcliffe, Iain C.; Cummings, Stephen P.

    2007-01-01

    Bioinformatics has emerged as an important discipline within the biological sciences that allows scientists to decipher and manage the vast quantities of data (such as genome sequences) that are now available. Consequently, there is an obvious need to provide graduates in biosciences with generic, transferable skills in bioinformatics. We present…

  10. Accumulating project management knowledge through process theory

    NARCIS (Netherlands)

    Niederman, Fred; March, Salvatore T.; Mueller, Benjamin

    2014-01-01

    This paper describes how the general notion of process theory can provide a foundational component in a portfolio of project management theories. The paper begins by outlining a variety of views pertaining to the nature of theory and theory development. This forms a basis for understanding how

  11. Error management process for power stations

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Takeda, Daisuke; Fujimoto, Junzo; Nagasaka, Akihiko

    2016-01-01

    The purpose of this study is to establish an 'error management process for power stations' for systematizing activities for human error prevention and for fostering continuous improvement of these activities. The following are proposed by deriving concepts concerning the error management process from existing knowledge and realizing them through application and evaluation of their effectiveness at a power station: an entire picture of the error management process that facilitates four functions requisite for managing human error prevention effectively (1. systematizing human error prevention tools, 2. identifying problems based on incident reports and taking corrective actions, 3. identifying good practices and potential problems for taking proactive measures, 4. prioritizing human error prevention tools based on identified problems); detailed steps for each activity (i.e. developing an annual plan for human error prevention, reporting and analyzing incidents and near misses) based on a model of human error causation; procedures and examples of items for identifying gaps between current and desired levels of execution and outputs of each activity; and stages for introducing and establishing the above proposed error management process at a power station. By giving shape to the above proposals at a power station, systematization and continuous improvement of activities for human error prevention in line with the actual situation of the power station can be expected. (author)

  12. Managing performance through business processes from BPM to the practice of process management

    CERN Document Server

    Thiault, Dominique

    2012-01-01

    Centred on the performance of the company, this book is a practical guide that organises Business Process Management (BPM) around these major subjects, such as process management, process governance, or the setting up a successful process approach in the company. Each of these subjects is introduced didactically, alternating examples and in-depth information. Relying on implementation recommendations, practical sheets, and illustrations, managing performance through processes can be used, first, to increase the chances that the objectives will be reached and to improve company performance (industrial organisation, service organisation, private or public sector) and to offer methods, rules, models, and practical supports that can easily be reused. The result of a combination of several experiences in the field and of rewarding encounters with managers, experts, and high-level executives, this book durably places the processes in their managerial dimension. It also presents an opening to a systematic approach t...

  13. The development and application of bioinformatics core competencies to improve bioinformatics training and education.

    Science.gov (United States)

    Mulder, Nicola; Schwartz, Russell; Brazas, Michelle D; Brooksbank, Cath; Gaeta, Bruno; Morgan, Sarah L; Pauley, Mark A; Rosenwald, Anne; Rustici, Gabriella; Sierk, Michael; Warnow, Tandy; Welch, Lonnie

    2018-02-01

    Bioinformatics is recognized as part of the essential knowledge base of numerous career paths in biomedical research and healthcare. However, there is little agreement in the field over what that knowledge entails or how best to provide it. These disagreements are compounded by the wide range of populations in need of bioinformatics training, with divergent prior backgrounds and intended application areas. The Curriculum Task Force of the International Society of Computational Biology (ISCB) Education Committee has sought to provide a framework for training needs and curricula in terms of a set of bioinformatics core competencies that cut across many user personas and training programs. The initial competencies developed based on surveys of employers and training programs have since been refined through a multiyear process of community engagement. This report describes the current status of the competencies and presents a series of use cases illustrating how they are being applied in diverse training contexts. These use cases are intended to demonstrate how others can make use of the competencies and engage in the process of their continuing refinement and application. The report concludes with a consideration of remaining challenges and future plans.

  14. The development and application of bioinformatics core competencies to improve bioinformatics training and education

    Science.gov (United States)

    Brooksbank, Cath; Morgan, Sarah L.; Rosenwald, Anne; Warnow, Tandy; Welch, Lonnie

    2018-01-01

    Bioinformatics is recognized as part of the essential knowledge base of numerous career paths in biomedical research and healthcare. However, there is little agreement in the field over what that knowledge entails or how best to provide it. These disagreements are compounded by the wide range of populations in need of bioinformatics training, with divergent prior backgrounds and intended application areas. The Curriculum Task Force of the International Society of Computational Biology (ISCB) Education Committee has sought to provide a framework for training needs and curricula in terms of a set of bioinformatics core competencies that cut across many user personas and training programs. The initial competencies developed based on surveys of employers and training programs have since been refined through a multiyear process of community engagement. This report describes the current status of the competencies and presents a series of use cases illustrating how they are being applied in diverse training contexts. These use cases are intended to demonstrate how others can make use of the competencies and engage in the process of their continuing refinement and application. The report concludes with a consideration of remaining challenges and future plans. PMID:29390004

  15. Credit Risk Management - Loan Approval Process

    Directory of Open Access Journals (Sweden)

    Lulzim Rashiti

    2016-03-01

    Full Text Available The aim of this study is to understand the international regulations issued under Basel I, Basel II and Basel III for best supervising and managing credit risk management policies. Part of the paper will focus on the description and impacts of the regulations and the pivotal role they play in providing a sound banking system. Credit risk represents another important element that will be analysed, considering that it lays the foundation for the loan consideration and approval process. The paper will also explain in detail the procedures and responsibilities shared along the process of loan acceptance by a banker. To sum up, the overall process from application to loan approval or denial will be explained, pointing out the implications that are faced along the way

  16. Establishing bioinformatics research in the Asia Pacific

    OpenAIRE

    Ranganathan, Shoba; Tammi, Martti; Gribskov, Michael; Tan, Tin Wee

    2006-01-01

    Abstract In 1998, the Asia Pacific Bioinformatics Network (APBioNet), Asia's oldest bioinformatics organisation was set up to champion the advancement of bioinformatics in the Asia Pacific. By 2002, APBioNet was able to gain sufficient critical mass to initiate the first International Conference on Bioinformatics (InCoB) bringing together scientists working in the field of bioinformatics in the region. This year, the InCoB2006 Conference was organized as the 5th annual conference of the Asia-...

  17. GOBLET: the Global Organisation for Bioinformatics Learning, Education and Training.

    Science.gov (United States)

    Attwood, Teresa K; Atwood, Teresa K; Bongcam-Rudloff, Erik; Brazas, Michelle E; Corpas, Manuel; Gaudet, Pascale; Lewitter, Fran; Mulder, Nicola; Palagi, Patricia M; Schneider, Maria Victoria; van Gelder, Celia W G

    2015-04-01

    In recent years, high-throughput technologies have brought big data to the life sciences. The march of progress has been rapid, leaving in its wake a demand for courses in data analysis, data stewardship, computing fundamentals, etc., a need that universities have not yet been able to satisfy--paradoxically, many are actually closing "niche" bioinformatics courses at a time of critical need. The impact of this is being felt across continents, as many students and early-stage researchers are being left without appropriate skills to manage, analyse, and interpret their data with confidence. This situation has galvanised a group of scientists to address the problems on an international scale. For the first time, bioinformatics educators and trainers across the globe have come together to address common needs, rising above institutional and international boundaries to cooperate in sharing bioinformatics training expertise, experience, and resources, aiming to put ad hoc training practices on a more professional footing for the benefit of all.

  18. [Reflections on the management of deinstitutionalization process].

    Science.gov (United States)

    Lucena, Marcela Adriana da Silva; Bezerra, Adriana Falangola Benjamin

    2012-09-01

    This study addresses mental health and, based on a conceptual review, offers considerations on the management of deinstitutionalization processes regarding individuals interned in long-stay psychiatric institutions. Elements concerning asylum formation and logic are discussed, along with the mechanisms necessary for the effective change in paradigm and practices, with deinstitutionalization and psychosocial rehabilitation as the core issues. Reflections are offered regarding management actions committed to the psychosocial model, linking such actions to the application of the components of care and going beyond the articulation of the tools of mental health policy. Theoretical reflection offers suggestions referring to the qualification processes of mental health professionals, deinstitutionalization in the management of the Unified Health System and tripartite action with co-accountability in actions and financing. The final considerations recognize the bureaucratic obstacles in the public realm and propose facing these challenges as a management challenge, along with processes of change that can radically commit to the lives of people, thereby broadening the discussion to the ethical realm.

  19. Intermediaries in the Management Process of Innovation

    DEFF Research Database (Denmark)

    Gretzinger, Susanne; Hinz, Holger; Matiaske, Wenzel

    2012-01-01

    Information is a critical resource in innovation processes. SMEs are therefore advised to draw on consulting in innovation processes, as they cannot ensure the necessary information flow internally due to lesser resources. From the strategic point of view, the involvement of intermediaries is accompanied by the risk of losing specific knowledge to the business environment. But the other way around: to neglect the integration of consultancies could mean a deficit in the process of information management.

  20. Process management - critical safety issues with focus on risk management

    International Nuclear Information System (INIS)

    Sanne, Johan M.

    2005-12-01

    Organizational changes focused on process orientation are taking place among Swedish nuclear power plants, aiming at improving the operation. The Swedish Nuclear Power Inspectorate has identified a need for increased knowledge within the area for its regulatory activities. In order to analyze what process orientation implies for nuclear power plant safety, a number of questions must be asked: 1. How is safety in nuclear power production created currently? What significance does the functional organization play? 2. How can organizational forms be analysed? What consequences does quality management have for work and for the enterprise? 3. Why should nuclear power plants be process oriented? Who are the customers and what are their customer values? Which customers are expected to benefit from process orientation? 4. What can one learn from process orientation in other safety critical systems? What is the effect on those features that currently create safety? 5. Could customer values increase for one customer without decreasing for other customers? What is the relationship between economic and safety interests from an increased process orientation? The deregulation of the electricity market has caused an interest in increased economic efficiency, which is the motivation for the interest in process orientation, among other means. It is the nuclear power plants' owners and the distributors (often the same corporations) that have the strongest interest in process orientation. If the functional organization and associated practices are decomposed, the prerequisites of the risk management regime change, perhaps deteriorating its functionality. When nuclear power operators consider the introduction of process orientation, the Nuclear Power Inspectorate should require that 1. The operators perform a risk analysis beforehand concerning the potential consequences that process orientation might convey: the analysis should contain a model specifying how safety is currently

  1. Process safety management for highly hazardous chemicals

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The purpose of this document is to assist US DOE contractors who work with threshold quantities of highly hazardous chemicals (HHCs), flammable liquids or gases, or explosives in successfully implementing the requirements of the OSHA Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119). The purpose of this rule is to prevent releases of HHCs that have the potential to cause catastrophic fires, explosions, or toxic exposures.

  2. 77 FR 30517 - Electricity Subsector Cybersecurity Risk Management Process

    Science.gov (United States)

    2012-05-23

    ... DEPARTMENT OF ENERGY Electricity Subsector Cybersecurity Risk Management Process AGENCY: Office of... Electricity Subsector Cybersecurity Risk Management Process guideline. The guideline describes a risk... Management Process. The primary goal of this guideline is to describe a risk management process that is...

  3. Managing Cultural Variation in Software Process Improvement

    DEFF Research Database (Denmark)

    Kræmmergaard, Pernille; Müller, Sune Dueholm; Mathiassen, Lars

    The scale and complexity of change in software process improvement (SPI) are considerable and managerial attention to organizational culture during SPI can therefore potentially contribute to successful outcomes. However, we know little about the impact of variations in organizational subculture ... organizations can have important implications for SPI outcomes. Furthermore, it provides insights into how software managers can practically assess subcultures to inform decisions about and help prepare plans for SPI initiatives.

  4. THORP - the management of the design process

    International Nuclear Information System (INIS)

    Thorpe, E.; Thurrell, B.H.; Varey, L.S.

    1991-01-01

    This Paper sets out to describe the organization of the design of the Thermal Oxide Reprocessing Plant (THORP) head end and chemical separation building. This posed many challenges not only because the building itself is a complicated engineering entity, but also because of the logistical aspects of administering the large number of engineers and draughtsmen -600 in total at peak- employed on the project. The effects of the necessary iterative design process, both technical and logistical, are outlined, together with a description of the manner in which the whole design process was managed. (author)

  5. Towards emergence phenomenon in business process management

    Directory of Open Access Journals (Sweden)

    Koryl Maciej

    2017-06-01

    Full Text Available A standard solution for business process management automation in enterprises is the use of workflow management systems that follow the Rule-Based Reasoning approach. In such systems, the process model, which is designed entirely before implementation, has to meet all needs deriving from the business activity of the organization. In practice, this means that great limitations arise in process control abilities, especially in a dynamic business environment. Therefore, new kinds of workflow systems that typically work in a more agile way, e.g. following the Case-Based Reasoning approach, may help. The paper shows another possible solution: the use of emergence theory, which indicates, among other required conditions, stimulation of the system (for example the business environment) to run grass-roots processes that lead to the arising of new, more sophisticated organizing forms. The paper also points to the opportunity of using techniques such as complex event processing to fulfill the key conditions identified by emergence theory.
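
    The complex event processing the author points to can be illustrated with a minimal, hedged example that is not taken from the paper: a composite event is raised when several related low-level events fall inside a time window, which a workflow engine could then use as the trigger for a grass-roots process. The event names, window length and threshold below are invented.

        # Hedged sketch of a tiny complex-event-processing rule: emit a
        # composite "surge" event when >= 3 order events for one customer
        # arrive within a 60-second window (all values illustrative).
        from collections import defaultdict, deque

        WINDOW, THRESHOLD = 60, 3
        recent = defaultdict(deque)  # customer -> timestamps of recent events

        def on_event(customer, timestamp):
            q = recent[customer]
            q.append(timestamp)
            while q and timestamp - q[0] > WINDOW:
                q.popleft()
            if len(q) >= THRESHOLD:
                print(f"composite event: order surge for {customer}")

        for customer, t in [("acme", 0), ("acme", 20), ("acme", 45), ("beta", 50)]:
            on_event(customer, t)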

  6. MODELING OF MANAGEMENT PROCESSES IN AN ORGANIZATION

    Directory of Open Access Journals (Sweden)

    Stefan Iovan

    2016-05-01

    Full Text Available When driving any major change within an organization, strategy and execution are intrinsic to a project’s success. Nevertheless, closing the gap between strategy and execution remains a challenge for many organizations [1]. Companies tend to focus more on execution than strategy for quick results, instead of taking the time needed to understand the parts that make up the whole, so the right execution plan can be put in place to deliver the best outcomes. A large part of this is understanding that business operations don’t fit neatly within the traditional organizational hierarchy. Business processes are often messy, collaborative efforts that cross teams, departments and systems, making them difficult to manage within a hierarchical structure [2]. Business process management (BPM) fills this gap by redefining an organization according to its end-to-end processes, so opportunities for improvement can be identified and processes streamlined for growth, revenue and transformation. This white paper provides guidelines on what to consider when using business process applications to solve your BPM initiatives, and the unique capabilities software systems provide that can help ensure both your project’s success and the success of your organization as a whole. This applies to the majority of medium and small businesses, big companies and even some governmental organizations [2].

  7. Process-based project proposal risk management

    Directory of Open Access Journals (Sweden)

    Alok Kumar

    2016-12-01

    Full Text Available We are all aware of organizational omnipresence. Projects within organizations are ubiquitous too. Projects achieve their goals successfully if they are planned, scheduled, controlled and implemented well. The project lifecycle of initiating, planning, scheduling, controlling and implementing is very well planned by project managers and the organizations. Successful projects have well-developed risk management plans to deal with situations impacting projects. Like any other organisation, a university does try to access funds for different purposes too. For such organisations, running a project is not the issue; rather, getting a project proposal approved to fund a project is the key. Project proposal processing is done by the nodal office in every organisation. Usually, these nodal offices help in administration and submission of a project proposal for accessing funds. Seldom do these nodal project offices within the organizations facilitate project proposal approval by proactively reaching out to the project managers. And as project managers prepare project proposals, little or no attention is paid to preparing a project proposal risk plan so as to maximise project acquisition. Risk plans are submitted while preparing proposals, but these risk plans cater to a requirement to address actual projects upon approval. Hence, a risk management plan for the project proposal is either missing or very little effort is made to treat the risks inherent in project acquisition. This paper is an attempt to highlight the importance of risk treatment at the project proposal stage as an extremely important step in preparing the risk management plan made for projects corresponding to their lifecycle phases. Several tools and techniques have been proposed in the paper to help and guide either the project owner (proposer) or the main organisational unit responsible for project management. Development of tools and techniques to further enhance project

  8. Color management: printing processes - opportunities and limitations

    Science.gov (United States)

    Ingram, Samuel T.

    2002-06-01

    Digital tools have impacted traditional methods employed to reproduce color images during the past decade. The shift from a purely photomechanical process in color reproduction to colorimetric reproduction offers tremendous opportunity in the graphic arts industry. But good things do not necessarily come to all in the same package. Printing processes possess different reproduction attributes: tone reproduction, gray balance and color correction requirements are as different as the ingredient sets selected for color reproduction. This paper will provide insight toward understanding advantages and limitations offered by the new digital technologies in printing, publishing and packaging. Over the past five years, the Clemson University Graphic Communications Department has conducted numerous color projects using the new digital colorimetric tools. Several approaches have been used, including experimental research and typical production workflows. The use of colorimetric data in color reproduction has given an opportunity to realize real gains in color use, predictability and consistency. Meeting an image's separation and reproduction requirements for a specified printing process can involve disruption of the anticipated workflow. Understanding the printing process requirements and the fit within the specifications of a colorimetric workflow are critical to the successful adoption of a color managed workflow. The paper will also provide an insight into the issues and challenges experienced with a color managed workflow. The printing processes used include offset litho, narrow and wide-web flexography (paper, liner board, corrugated and film), screen printing (paper board and polycarbonates), and digital imaging with toner, ink and inkjet systems. A proposal for technology integration will be the focus of the presentation drawn from documented experiences in over 300 applications of color management tools. Discussion will include the structure of

  9. Live testing of the SCAT management process

    International Nuclear Information System (INIS)

    Martin, V.; Duhaime, C.; Boule, M.; Lamarche, A.

    2002-01-01

    The techniques developed by Environment Canada's Shoreline Cleanup Assessment Team (SCAT) have become the world standard for consistency in remedial efforts following an oil spill. This paper presented the results of a workshop that was aimed at testing the management process developed by two different agencies, Environment Canada and Eastern Canada Response Corporation (ECRC), following a spill incident in 1999 in which 150 km of Quebec's north shore near Havre-Saint-Pierre was polluted with 49 tonnes of bunker oil 180 from an ore ship. The issues of specific concern included fishing, mollusc harvesting, tourism, hunting and sites of environmental interest in the Mingan National Park. Both agencies realized they had to use the SCAT approach, but for different reasons. Environment Canada had to identify environmental impacts, while ECRC had to plan methods for shoreline treatment. Both agencies had to document the pollution using the SCAT method; therefore, they joined efforts and pooled their expertise to optimize resources. The newly developed management structure was aimed at determining how the SCAT approach should be planned, how the data quality could be secured, and how the information should be managed. The main benefits of the joint structure were a flow chart and description of the different functions, and a list of deliverables to be produced by those in charge of managing the SCAT approach. It was determined that the new management process is efficient. A SCAT assessment and situation report were both produced within a prescribed time frame. Working in partnership allowed participants to acquire a common understanding of the SCAT approach. 2 refs., 2 tabs., 3 figs

  10. Bioinformatics for Next Generation Sequencing Data

    Directory of Open Access Journals (Sweden)

    Alberto Magi

    2010-09-01

    Full Text Available The emergence of next-generation sequencing (NGS) platforms imposes increasing demands on statistical methods and bioinformatic tools for the analysis and the management of the huge amounts of data generated by these technologies. Even at the early stages of their commercial availability, a large number of software tools already exist for analyzing NGS data. These tools fall into several general categories, including alignment of sequence reads to a reference, base-calling and/or polymorphism detection, de novo assembly from paired or unpaired reads, structural variant detection and genome browsing. This manuscript aims to guide readers in the choice of the available computational tools that can be used to address the several steps of the data analysis workflow.
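
    Most of the tool categories above start from raw reads, so a small pre-processing step makes the workflow concrete. The sketch below is a hedged illustration and is not taken from the reviewed software: it filters a FASTQ file by mean base quality with Biopython, and the file names and the Phred cutoff of 20 are arbitrary choices.

        # Hedged sketch: quality-filter NGS reads with Biopython.
        # File names and the cutoff are illustrative only.
        from Bio import SeqIO

        def mean_quality(record):
            quals = record.letter_annotations["phred_quality"]
            return sum(quals) / len(quals)

        kept = 0
        with open("reads.fastq") as fin, open("reads.filtered.fastq", "w") as fout:
            for record in SeqIO.parse(fin, "fastq"):
                if mean_quality(record) >= 20:
                    SeqIO.write(record, fout, "fastq")
                    kept += 1
        print(f"kept {kept} reads with mean Phred quality >= 20")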

  11. Development of advanced spent fuel management process

    International Nuclear Information System (INIS)

    Park, Seong Won; Shin, Y. J.; Cho, S. H.

    2004-03-01

    The research on spent fuel management focuses on the maximization of the disposal efficiency by a volume reduction, the improvement of the environmental friendliness by the partitioning and transmutation of the long lived nuclides, and the recycling of the spent fuel for an efficient utilization of the uranium source. In the second phase, which started in 2001, the performance test of the advanced spent fuel management process consisting of voloxidation, reduction of spent fuel and the lithium recovery process has been completed successfully on a laboratory scale. The world's first spent fuel reduction hot test at a 5 kgHM/batch scale was performed successfully through joint research with Russia, and valuable data on the actinide and FP material balance and the characteristics of the metal product were obtained, together with experience to help design an engineering-scale reduction system. The electrolytic reduction technology, which integrates uranium oxide reduction in a molten LiCl-Li2O system and Li2O electrolysis, has been developed and a unique reaction system has also been devised. Design data such as the treatment capacity, current density and mass transfer behavior obtained from the performance test of a 5 kgU/batch electrolytic reduction system pave the way for the third phase of the hot cell demonstration of the advanced spent fuel management technology

  12. Data mining in bioinformatics using Weka.

    Science.gov (United States)

    Frank, Eibe; Hall, Mark; Trigg, Len; Holmes, Geoffrey; Witten, Ian H

    2004-10-12

    The Weka machine learning workbench provides a general-purpose environment for automatic classification, regression, clustering and feature selection, which are common data mining problems in bioinformatics research. It contains an extensive collection of machine learning algorithms and data pre-processing methods complemented by graphical user interfaces for data exploration and the experimental comparison of different machine learning techniques on the same problem. Weka can process data given in the form of a single relational table. Its main objectives are to (a) assist users in extracting useful information from data and (b) enable them to easily identify a suitable algorithm for generating an accurate predictive model from it. http://www.cs.waikato.ac.nz/ml/weka.
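
    The workflow Weka supports (load a single relational table of instances, choose a learning algorithm, and compare models by cross-validation) can be sketched in a few lines. The sketch below is a hedged illustration only: it uses Python and scikit-learn rather than Weka's Java API, and the file name "expression_table.csv" and its "label" column are hypothetical placeholders.

        # Hedged sketch: an analogous classification workflow in scikit-learn,
        # not Weka itself. The CSV file and "label" column are invented examples
        # of a single relational table of instances.
        import pandas as pd
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier

        data = pd.read_csv("expression_table.csv")   # one row per instance
        X = data.drop(columns=["label"])             # feature columns
        y = data["label"]                            # class to predict

        # A decision tree stands in for a Weka classifier such as J48;
        # 10-fold cross-validation mirrors the experimental comparison step.
        clf = DecisionTreeClassifier(max_depth=5, random_state=0)
        scores = cross_val_score(clf, X, y, cv=10)
        print(f"mean accuracy: {scores.mean():.3f} (std {scores.std():.3f})")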

  13. Coping with transition: improving the management process

    International Nuclear Information System (INIS)

    Griffin, J.; McAlister, J.

    1985-01-01

    It goes without saying that the industry is indeed in transition. Not only do expectations from regulators and the public continue to grow in intensity and complexity, but our ability to make appropriate responses seems to be becoming exceedingly more difficult as well. At AP and L, the energy supply department has some 2,000 employees and operates (in addition to providing general office engineering, technical, and administrative support) all of AP and L's power plants. These include two nuclear units and four coal units as well as hydro, oil and gas plants. In January 1984 the company began an effort with our senior departmental management to try to improve the management process itself. The ultimate goal is to create a climate conducive to improved productivity and quality without the initial (and sometimes risky) across-the-board implementation of techniques such as quality circles

  14. Learning Genetics through an Authentic Research Simulation in Bioinformatics

    Science.gov (United States)

    Gelbart, Hadas; Yarden, Anat

    2006-01-01

    Following the rationale that learning is an active process of knowledge construction as well as enculturation into a community of experts, we developed a novel web-based learning environment in bioinformatics for high-school biology majors in Israel. The learning environment enables the learners to actively participate in a guided inquiry process…

  15. Ergatis: a web interface and scalable software system for bioinformatics workflows

    Science.gov (United States)

    Orvis, Joshua; Crabtree, Jonathan; Galens, Kevin; Gussman, Aaron; Inman, Jason M.; Lee, Eduardo; Nampally, Sreenath; Riley, David; Sundaram, Jaideep P.; Felix, Victor; Whitty, Brett; Mahurkar, Anup; Wortman, Jennifer; White, Owen; Angiuoli, Samuel V.

    2010-01-01

    Motivation: The growth of sequence data has been accompanied by an increasing need to analyze data on distributed computer clusters. The use of these systems for routine analysis requires scalable and robust software for data management of large datasets. Software is also needed to simplify data management and make large-scale bioinformatics analysis accessible and reproducible to a wide class of target users. Results: We have developed a workflow management system named Ergatis that enables users to build, execute and monitor pipelines for computational analysis of genomics data. Ergatis contains preconfigured components and template pipelines for a number of common bioinformatics tasks such as prokaryotic genome annotation and genome comparisons. Outputs from many of these components can be loaded into a Chado relational database. Ergatis was designed to be accessible to a broad class of users and provides a user friendly, web-based interface. Ergatis supports high-throughput batch processing on distributed compute clusters and has been used for data management in a number of genome annotation and comparative genomics projects. Availability: Ergatis is an open-source project and is freely available at http://ergatis.sourceforge.net Contact: jorvis@users.sourceforge.net PMID:20413634
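
    Ergatis pipelines themselves are assembled through its web interface, but the underlying idea (components with declared dependencies executed in order and monitored) can be sketched generically. The Python sketch below is not Ergatis code; the component names, commands and use of graphlib are illustrative assumptions.

        # Hedged sketch of a generic pipeline runner (not Ergatis): each
        # component is a shell command with declared dependencies, executed
        # in dependency order. Commands here are stand-in echo statements.
        import subprocess
        from graphlib import TopologicalSorter  # Python 3.9+

        pipeline = {
            "gene_prediction": {"cmd": "echo predict genes", "deps": []},
            "blast_search":    {"cmd": "echo run blast",     "deps": ["gene_prediction"]},
            "load_database":   {"cmd": "echo load results",  "deps": ["blast_search"]},
        }

        order = TopologicalSorter({name: step["deps"] for name, step in pipeline.items()})
        for name in order.static_order():
            print(f"[pipeline] running {name}")
            subprocess.run(pipeline[name]["cmd"], shell=True, check=True)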

  16. ARTISTIC AND SCIENTIFIC CREATION PROCESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Ioan Tudor

    2013-12-01

    Full Text Available Cultural creation is not ex nihilo, but a conversion of old “material” into a new form or a re-signifying, or a new combination of preexisting elements. Art, as well as science, has been characterized as acts of dominating and transforming nature. In art, the process is real and in science it is virtual. This process is characterized by an efficient management of the means of expression which must be subordinated to the message that the piece of art wants to transmit and the scientist exploits facts in order to make them significant as support, expression and exemplification of the laws of nature. There is also an ontogenesis of management: individual evolution which takes place by virtue of a program, thanks to devices with self-regulating capabilities. This aspect may be particularly interesting for cyberneticists. In contemporary civilization the transfer of certain aspects of the creation process to automated machines implies programming, algorithms. Man is an algorithmic being.

  17. Fair process: managing in the knowledge economy.

    Science.gov (United States)

    Kim, W C; Mauborgne, R

    1997-01-01

    Unlike the traditional factors of production--land, labor, and capital--knowledge is a resource that can't be forced out of people. But creating and sharing knowledge is essential to fostering innovation, the key challenge of the knowledge-based economy. To create a climate in which employees volunteer their creativity and expertise, managers need to look beyond the traditional tools at their disposal. They need to build trust. The authors have studied the links between trust, idea sharing, and corporate performance for more than a decade. They have explored the question of why managers of local subsidiaries so often fail to share information with executives at headquarters. They have studied the dynamics of idea sharing in product development teams, joint ventures, supplier partnerships, and corporate transformations. They offer an explanation for why people resist change even when it would benefit them directly. In every case, the decisive factor was what the authors call fair process--fairness in the way a company makes and executes decisions. The elements of fair process are simple: Engage people's input in decisions that directly affect them. Explain why decisions are made the way they are. Make clear what will be expected of employees after the changes are made. Fair process may sound like a soft issue, but it is crucial to building trust and unlocking ideas. Without it, people are apt to withhold their full cooperation and their creativity. The results are costly: ideas that never see daylight and initiatives that are never seized.

  18. Methods of process management in radiology

    International Nuclear Information System (INIS)

    Teichgraeber, U.K.M.; Gillessen, C.; Neumann, F.

    2003-01-01

    The main emphasis in health care has been on quality and availability but increasing cost pressure has made cost efficiency ever more relevant for nurses, technicians, and physicians. Within a hospital, the radiologist considerably influences the patient's length of stay through the availability of service and diagnostic information. Therefore, coordinating and timing radiologic examinations become increasingly more important. Physicians are not taught organizational management during their medical education and residency training, and the necessary expertise in economics is generally acquired through the literature or specialized courses. Beyond the medical service, the physicians are increasingly required to optimize their work flow according to economic factors. This review introduces various tools for process management and its application in radiology. By means of simple paper-based methods, the work flow of most processes can be analyzed. For more complex work flow, it is suggested to choose a method that allows for an exact qualitative and quantitative prediction of the effect of variations. This review introduces network planning technique and process simulation. (orig.) [de
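
    The network planning technique mentioned above amounts to a critical-path calculation over the steps of a workflow. The sketch below is a generic, hedged example: the imaging workflow steps, durations and dependencies are invented for illustration and are not taken from the cited review.

        # Hedged sketch: earliest-finish (critical-path) calculation for a
        # made-up imaging workflow; durations are in minutes.
        from graphlib import TopologicalSorter

        tasks = {  # name: (duration, dependencies)
            "registration":   (5,  []),
            "positioning":    (10, ["registration"]),
            "acquisition":    (15, ["positioning"]),
            "reconstruction": (10, ["acquisition"]),
            "reporting":      (20, ["reconstruction"]),
        }

        finish = {}
        for name in TopologicalSorter({k: v[1] for k, v in tasks.items()}).static_order():
            duration, deps = tasks[name]
            start = max((finish[d] for d in deps), default=0)
            finish[name] = start + duration

        print("total throughput time:", max(finish.values()), "minutes")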

  19. The Management of Law Firms Using Business Process Management, Document Management and Web Services Integration

    OpenAIRE

    Roxana Maria Petculet

    2012-01-01

    The aim of this paper is to present the technical solution implemented in the present context for the management of law firms. The informational system consists of the automation of business processes using a BPM engine and electronic archiving using a DMS. The communication between the two modules is made by invoking web services. The whole system integrates modules like: project management, contract management, invoice management, collection, CRM, reporting.

  20. Establishing bioinformatics research in the Asia Pacific

    Directory of Open Access Journals (Sweden)

    Tammi Martti

    2006-12-01

    Full Text Available Abstract In 1998, the Asia Pacific Bioinformatics Network (APBioNet), Asia's oldest bioinformatics organisation, was set up to champion the advancement of bioinformatics in the Asia Pacific. By 2002, APBioNet was able to gain sufficient critical mass to initiate the first International Conference on Bioinformatics (InCoB), bringing together scientists working in the field of bioinformatics in the region. This year, the InCoB2006 Conference was organized as the 5th annual conference of the Asia-Pacific Bioinformatics Network, on Dec. 18–20, 2006 in New Delhi, India, following a series of successful events in Bangkok (Thailand), Penang (Malaysia), Auckland (New Zealand) and Busan (South Korea). This Introduction provides a brief overview of the peer-reviewed manuscripts accepted for publication in this Supplement. It exemplifies a typical snapshot of the growing research excellence in bioinformatics of the region as we embark on a trajectory of establishing a solid bioinformatics research culture in the Asia Pacific that is able to contribute fully to the global bioinformatics community.

  1. Emerging strengths in Asia Pacific bioinformatics.

    Science.gov (United States)

    Ranganathan, Shoba; Hsu, Wen-Lian; Yang, Ueng-Cheng; Tan, Tin Wee

    2008-12-12

    The 2008 annual conference of the Asia Pacific Bioinformatics Network (APBioNet), Asia's oldest bioinformatics organisation set up in 1998, was organized as the 7th International Conference on Bioinformatics (InCoB), jointly with the Bioinformatics and Systems Biology in Taiwan (BIT 2008) Conference, Oct. 20-23, 2008 at Taipei, Taiwan. Besides bringing together scientists from the field of bioinformatics in this region, InCoB is actively involving researchers from the area of systems biology, to facilitate greater synergy between these two groups. Marking the 10th Anniversary of APBioNet, this InCoB 2008 meeting followed on from a series of successful annual events in Bangkok (Thailand), Penang (Malaysia), Auckland (New Zealand), Busan (South Korea), New Delhi (India) and Hong Kong. Additionally, tutorials and the Workshop on Education in Bioinformatics and Computational Biology (WEBCB) immediately prior to the 20th Federation of Asian and Oceanian Biochemists and Molecular Biologists (FAOBMB) Taipei Conference provided ample opportunity for inducting mainstream biochemists and molecular biologists from the region into a greater level of awareness of the importance of bioinformatics in their craft. In this editorial, we provide a brief overview of the peer-reviewed manuscripts accepted for publication herein, grouped into thematic areas. As the regional research expertise in bioinformatics matures, the papers fall into thematic areas, illustrating the specific contributions made by APBioNet to global bioinformatics efforts.

  2. The ATLAS Data Management Software Engineering Process

    CERN Document Server

    Lassnig, M; The ATLAS collaboration; Stewart, G A; Barisits, M; Beermann, T; Vigne, R; Serfon, C; Goossens, L; Nairz, A; Molfetas, A

    2014-01-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also hi...

  3. The ATLAS Data Management Software Engineering Process

    CERN Document Server

    Lassnig, M; The ATLAS collaboration; Stewart, G A; Barisits, M; Beermann, T; Vigne, R; Serfon, C; Goossens, L; Nairz, A

    2013-01-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also hi...

  4. Development of advanced spent fuel management process

    International Nuclear Information System (INIS)

    Ro, Seung Gy; Shin, Y. J.; Do, J. B.; You, G. S.; Seo, J. S.; Lee, H. G.

    1998-03-01

    This study is to develop an advanced spent fuel management process for countries which have not yet decided a back-end nuclear fuel cycle policy. The aims of this process development, based on the pyroreduction technology of PWR spent fuels with molten lithium, are to reduce the storage volume by a quarter and to reduce the storage cooling load in half by the preferential removal of highly radioactive decay-heat elements such as Cs-137 and Sr-90 only. From the experimental results which confirm the feasibility of the metallization technology, it is concluded that there are no problems in terms of reaction kinetics and equilibrium. However, the operating performance test of each piece of equipment on an engineering scale still remains and will be conducted in 1999. (author). 21 refs., 45 tabs., 119 figs

  5. The NASA Continuous Risk Management Process

    Science.gov (United States)

    Pokorny, Frank M.

    2004-01-01

    As an intern this summer in the GRC Risk Management Office, I have become familiar with the NASA Continuous Risk Management Process. In this process, risk is considered in terms of the probability that an undesired event will occur and the impact of the event, should it occur (ref. NASA-NPG: 7120.5). Risk management belongs in every part of every project and should be ongoing from start to finish. Another key point is that a risk is not a problem until it has happened. With that in mind, there is a six-step cycle for continuous risk management that prevents risks from becoming problems. The steps are: identify, analyze, plan, track, control, and communicate & document. Incorporated in the first step are several methods to identify risks, such as brainstorming and using lessons learned. Once a risk is identified, a risk statement is made on a risk information sheet consisting of a single condition and one or more consequences. There can also be a context section where the risk is explained in more detail. Additionally there are three main goals of analyzing a risk, which are evaluate, classify, and prioritize. Here is where a value is given to the attributes of a risk (i.e., probability, impact, and timeframe) based on a multi-level classification system (e.g., low, medium, high). It is important to keep in mind that the definitions of these levels are probably different for each project. Furthermore the risks can be combined into groups. Then, the risks are prioritized to see what risk is necessary to mitigate first. After the risks are analyzed, a plan is made to mitigate as many risks as feasible. Each risk should be assigned to someone in the project with knowledge in the area of the risk. Then the possible approaches to choose from are: research, accept, watch, or mitigate. Next, all risks, mitigated or not, are tracked either individually or in groups. As the plan is executed, risks are re-evaluated, and the attribute values are adjusted as necessary. Metrics
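
    To make the analyze and prioritize steps concrete, the following hedged sketch (not NASA code) scores each risk as probability times impact and sorts the list so mitigation planning starts with the highest scores. The low/medium/high-to-number mapping and the example risks are assumptions; as the text notes, each project defines its own levels.

        # Hedged sketch (not a NASA tool): rank risks by probability x impact
        # using an assumed low/medium/high -> 1/2/3 mapping for both attributes.
        LEVEL = {"low": 1, "medium": 2, "high": 3}

        risks = [
            {"id": "R1", "condition": "vendor delivery slips",  "probability": "medium", "impact": "high"},
            {"id": "R2", "condition": "test facility conflict", "probability": "low",    "impact": "medium"},
            {"id": "R3", "condition": "key staff turnover",     "probability": "high",   "impact": "high"},
        ]

        for r in risks:
            r["score"] = LEVEL[r["probability"]] * LEVEL[r["impact"]]

        # Highest-scoring risks feed the plan/track/control steps first.
        for r in sorted(risks, key=lambda r: r["score"], reverse=True):
            print(f"{r['id']} score={r['score']}: {r['condition']}")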

  6. Integrating risk management into the baselining process

    International Nuclear Information System (INIS)

    Jennett, N.; Tonkinson, A.

    1994-01-01

    These processes work together in building the project (comprised of the technical, schedule, and cost baselines) against which performance is measured and changes to the scope, schedule and cost of a project are managed and controlled. Risk analysis is often performed as the final element of the scheduling or estimating processes, a precursor to establishing cost and schedule contingency. However, best business practices dictate that information that may be crucial to the success of a project be analyzed and incorporated into project planning as soon as it is available and usable. The purpose of risk management is not to eliminate risk. Neither is it intended to suggest wholesale re-estimating and re-scheduling of a project. Rather, the intent is to make provisions to reduce and control the schedule and/or cost ramifications of risk by anticipating events and conditions that cannot be reliably planned for and which have the potential to negatively impact accomplishment of the technical objectives and requirements of the project

  7. BioWarehouse: a bioinformatics database warehouse toolkit

    Directory of Open Access Journals (Sweden)

    Stringer-Calvert David WJ

    2006-03-01

    Full Text Available Abstract Background This article addresses the problem of interoperation of heterogeneous bioinformatics databases. Results We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and JAVA languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. Conclusion BioWarehouse embodies significant progress on the

  8. BioWarehouse: a bioinformatics database warehouse toolkit.

    Science.gov (United States)

    Lee, Thomas J; Pouliot, Yannick; Wagner, Valerie; Gupta, Priyanka; Stringer-Calvert, David W J; Tenenbaum, Jessica D; Karp, Peter D

    2006-03-23

    This article addresses the problem of interoperation of heterogeneous bioinformatics databases. We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL) but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and JAVA languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. BioWarehouse embodies significant progress on the database integration problem for bioinformatics.
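
    The enzyme-coverage question answered above is a good example of the multi-table SQL queries a warehouse makes possible. The sketch below is hedged: it uses SQLite for brevity rather than MySQL or Oracle, and the table and column names ("enzyme_activity", "protein_sequence", "ec_number") are hypothetical placeholders, not BioWarehouse's actual schema.

        # Hedged sketch of a warehouse-style query: what fraction of
        # EC-numbered enzyme activities have no associated sequence?
        # Schema names are invented for illustration.
        import sqlite3

        conn = sqlite3.connect("warehouse.db")
        total, unsequenced = conn.execute("""
            SELECT COUNT(*),
                   SUM(CASE WHEN NOT EXISTS (
                           SELECT 1 FROM protein_sequence s
                           WHERE s.ec_number = a.ec_number)
                       THEN 1 ELSE 0 END)
            FROM enzyme_activity a
        """).fetchone()
        print(f"{unsequenced / total:.0%} of enzyme activities lack a sequence")
        conn.close()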

  9. Development of Advanced Spent Fuel Management Process

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Chung Seok; Choi, I. K.; Kwon, S. G. (and others)

    2007-06-15

    As a part of research efforts to develop an advanced spent fuel management process, this project focused on the electrochemical reduction technology which can replace the original Li reduction technology of ANL, and we have successfully built a 20 kgHM/batch scale demonstration system. The performance tests of the system in the ACPF hot cell showed more than a 99% reduction yield of SIMFUEL, a current density of 100 mA/cm² and a current efficiency of 80%. For an optimization of the process, the prevention of a voltage drop in an integrated cathode, a minimization of the anodic effect and an improvement of the hot cell operability by a modulation and simplification of the unit apparatuses were achieved. Basic research using a bench-scale system was also carried out by focusing on a measurement of the electrochemical reduction rate of the surrogates, an elucidation of the reaction mechanism, collecting data on the partition coefficients of the major nuclides, quantitative measurement of mass transfer rates and diffusion coefficients of oxygen and metal ions in molten salts. When compared to the PYROX process of INL, the electrochemical reduction system developed in this project has comparative advantages in its application of a flexible reaction mechanism, relatively short reaction times and increased process yields.

  10. Development of Advanced Spent Fuel Management Process

    International Nuclear Information System (INIS)

    Seo, Chung Seok; Choi, I. K.; Kwon, S. G.

    2007-06-01

    As a part of research efforts to develop an advanced spent fuel management process, this project focused on the electrochemical reduction technology which can replace the original Li reduction technology of ANL, and we have successfully built a 20 kgHM/batch scale demonstration system. The performance tests of the system in the ACPF hot cell showed more than a 99% reduction yield of SIMFUEL, a current density of 100 mA/cm² and a current efficiency of 80%. For an optimization of the process, the prevention of a voltage drop in an integrated cathode, a minimization of the anodic effect and an improvement of the hot cell operability by modularization and simplification of the unit apparatuses were achieved. Basic research using a bench-scale system was also carried out, focusing on measurement of the electrochemical reduction rate of the surrogates, elucidation of the reaction mechanism, collection of data on the partition coefficients of the major nuclides, and quantitative measurement of mass transfer rates and diffusion coefficients of oxygen and metal ions in molten salts. When compared to the PYROX process of INL, the electrochemical reduction system developed in this project has comparative advantages in its application of a flexible reaction mechanism, relatively short reaction times and increased process yields.

  11. 23 CFR 450.320 - Congestion management process in transportation management areas.

    Science.gov (United States)

    2010-04-01

    § 450.320 Congestion management process in transportation management areas. (a) The transportation planning process in a TMA shall address congestion management through a process that provides for safe and...

  12. The secondary metabolite bioinformatics portal

    DEFF Research Database (Denmark)

    Weber, Tilmann; Kim, Hyun Uk

    2016-01-01

    Natural products are among the most important sources of lead molecules for drug discovery. With the development of affordable whole-genome sequencing technologies and other ‘omics tools, the field of natural products research is currently undergoing a shift in paradigms. While, for decades, mainly analytical and chemical methods gave access to this group of compounds, nowadays genomics-based methods offer complementary approaches to find, identify and characterize such molecules. This paradigm shift also resulted in a high demand for computational tools to assist researchers in their daily work. In this context, this review gives a summary of tools and databases that currently are available to mine, identify and characterize natural product biosynthesis pathways and their producers based on ‘omics data, including a web portal called Secondary Metabolite Bioinformatics Portal (SMBP at http…).

  13. Configuration and Data Management Process and the System Safety Professional

    Science.gov (United States)

    Shivers, Charles Herbert; Parker, Nelson C. (Technical Monitor)

    2001-01-01

    This article presents a discussion of the configuration management (CM) and data management (DM) functions and provides a perspective on the importance of configuration and data management processes to the success of system safety activities. The article addresses the basic requirements of configuration and data management, based generally on NASA configuration and data management policies and practices, although the concepts are likely to represent the processes of any public or private organization's well-designed configuration and data management program.

  14. Business process of reputation management of food industry enterprises

    OpenAIRE

    Derevianko Olena. H.

    2014-01-01

    The goal of the article is to develop a methodical basis for reputation management, aimed at formalising theoretical provisions and explaining how to organise reputation management at food industry enterprises. The article shows the potential of using the Business Process Management concept in reputation management. Using the diagram of the Reputation Management business process environment, the article shows its key participants (suppliers and clients of the business process) a...

  15. Initial perspectives on process threat management

    International Nuclear Information System (INIS)

    Whiteley, James R. Rob; Mannan, M. Sam

    2004-01-01

    Terrorist and criminal acts are now considered credible risks in the process industries. Deliberate attacks on the nation's petroleum refineries and chemical plants would pose a significant threat to public welfare, national security, and the US economy. To date, the primary response of government and industry has focused on improved security to prevent attacks and the associated consequences. While prevention is clearly preferred, the potential for successful attacks must be addressed. If plant security is breached, the extent of the inflicted damage is determined by the available plant safety systems and procedures. We refer to this 'inside the gate' response as process threat management. The authors have initiated a joint industry/academia study to address: - the level of safety provided by existing plant equipment and safety systems in response to a terrorist act; and - identification of process (rather than security) needs or opportunities to address this new safety concern. This paper describes the initial perspectives and issues identified by the team at the beginning of the study.

  16. Management of investment processes on Finnish farms

    Directory of Open Access Journals (Sweden)

    T. MATTILA

    2008-12-01

    Full Text Available Structural change in agriculture means a continuous need for investing in farm production. It is essential for the sustainable operations and the economy of the farm that such investments are successful. In this research, different stages of the investment process on farms were studied, as well as the use of information and the success perceived during the investment process. The study was carried out with mail surveys and telephone interviews on the Finnish Farm Accountancy Data Network (FADN) farms. The most challenging investments were in animal husbandry buildings and, for these investments, the comparison of alternatives was the most challenging stage. For most investments, the planning phase was considered more challenging than the implementation. Before making the decision, farmers acquired information from many sources, of which the opinion of the main customer and the experiences of fellow farmers were the most valued. Some of the products considered were so new on the market that it was not easy to get adequate information and, furthermore, the information given by suppliers was not always accurate. Decision-making was supported by calculations, but qualitative factors had a dominating role. Large basic decisions were made relatively quickly, while details needed a longer time to process. In general, farm managers were satisfied with their investments. Improvements in work quality and quantity were especially mentioned and generally qualitative factors were the ones first in mind when evaluating the successfulness of the investment.

  17. Biology in 'silico': The Bioinformatics Revolution.

    Science.gov (United States)

    Bloom, Mark

    2001-01-01

    Explains the Human Genome Project (HGP) and efforts to sequence the human genome. Describes the role of bioinformatics in the project and considers it the genetics Swiss Army Knife, with many different uses in forensic science, medicine, agriculture, and environmental sciences. Discusses the use of bioinformatics in the high school…

  18. Using "Arabidopsis" Genetic Sequences to Teach Bioinformatics

    Science.gov (United States)

    Zhang, Xiaorong

    2009-01-01

    This article describes a new approach to teaching bioinformatics using "Arabidopsis" genetic sequences. Several open-ended and inquiry-based laboratory exercises have been designed to help students grasp key concepts and gain practical skills in bioinformatics, using "Arabidopsis" leucine-rich repeat receptor-like kinase (LRR…

  19. A Mathematical Optimization Problem in Bioinformatics

    Science.gov (United States)

    Heyer, Laurie J.

    2008-01-01

    This article describes the sequence alignment problem in bioinformatics. Through examples, we formulate sequence alignment as an optimization problem and show how to compute the optimal alignment with dynamic programming. The examples and sample exercises have been used by the author in a specialized course in bioinformatics, but could be adapted…
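
    To make the dynamic-programming idea concrete, here is a minimal global-alignment score computation in the spirit of Needleman-Wunsch; the scoring scheme and the two sequences are chosen arbitrarily for illustration and are not taken from the article.

```python
def global_alignment_score(a, b, match=1, mismatch=-1, gap=-2):
    """Fill the dynamic-programming table and return the optimal global alignment score."""
    rows, cols = len(a) + 1, len(b) + 1
    dp = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):
        dp[i][0] = i * gap                         # prefix of a aligned against gaps
    for j in range(1, cols):
        dp[0][j] = j * gap                         # prefix of b aligned against gaps
    for i in range(1, rows):
        for j in range(1, cols):
            s = match if a[i - 1] == b[j - 1] else mismatch
            dp[i][j] = max(dp[i - 1][j - 1] + s,   # align a[i-1] with b[j-1]
                           dp[i - 1][j] + gap,     # gap in b
                           dp[i][j - 1] + gap)     # gap in a
    return dp[-1][-1]

print(global_alignment_score("GATTACA", "GCATGCT"))
```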

  20. Fuzzy Logic in Medicine and Bioinformatics

    Directory of Open Access Journals (Sweden)

    Angela Torres

    2006-01-01

    Full Text Available The purpose of this paper is to present a general view of the current applications of fuzzy logic in medicine and bioinformatics. We particularly review the medical literature using fuzzy logic. We then recall the geometrical interpretation of fuzzy sets as points in a fuzzy hypercube and present two concrete illustrations in medicine (drug addictions) and in bioinformatics (comparison of genomes).
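
    The hypercube picture mentioned in the abstract is easy to make concrete: a fuzzy set over n elements is a point in the unit hypercube [0, 1]^n, and the standard fuzzy operations act coordinate-wise. The membership values below are invented purely for illustration.

```python
# A fuzzy set over n elements is a point in [0, 1]^n; these membership values are invented.
risk_a = {"patient1": 0.8, "patient2": 0.3, "patient3": 0.5}
risk_b = {"patient1": 0.6, "patient2": 0.7, "patient3": 0.2}

union        = {k: max(risk_a[k], risk_b[k]) for k in risk_a}   # fuzzy OR
intersection = {k: min(risk_a[k], risk_b[k]) for k in risk_a}   # fuzzy AND
complement_a = {k: 1.0 - risk_a[k] for k in risk_a}             # fuzzy NOT

print(union)         # {'patient1': 0.8, 'patient2': 0.7, 'patient3': 0.5}
print(intersection)  # {'patient1': 0.6, 'patient2': 0.3, 'patient3': 0.2}
print(complement_a)
```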

  1. Software Process Improvement Journey: IBM Australia Application Management Services

    Science.gov (United States)

    2005-03-01

    This Software Engineering Institute (Carnegie Mellon) report, presented by Robyn Nichols, describes the software process improvement journey of IBM Australia Application Management Services, covering Client Relationship Management (CRM) processes (specifically, Solution Design and Solution Delivery), Worldwide Project Management, complex systems life-cycle management, rapid solutions development, custom development, package selection and implementation, and maintenance.

  2. Rising Strengths Hong Kong SAR in Bioinformatics.

    Science.gov (United States)

    Chakraborty, Chiranjib; George Priya Doss, C; Zhu, Hailong; Agoramoorthy, Govindasamy

    2017-06-01

    Hong Kong's bioinformatics sector is attaining new heights in combination with its economic boom and the predominance of the working-age group in its population. Factors such as a knowledge-based and free-market economy have contributed towards a prominent position on the world map of bioinformatics. In this review, we consider the educational measures, landmark research activities, the achievements of bioinformatics companies and the role of the Hong Kong government in establishing bioinformatics as a strength. However, several hurdles remain. New government policies will assist computational biologists to overcome these hurdles and further raise the profile of the field. There is a high expectation that bioinformatics in Hong Kong will be a promising area for the next generation.

  3. Bioinformatics clouds for big data manipulation

    Directory of Open Access Journals (Sweden)

    Dai Lin

    2012-11-01

    Full Text Available Abstract As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. Reviewers This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor.

  4. The 2016 Bioinformatics Open Source Conference (BOSC).

    Science.gov (United States)

    Harris, Nomi L; Cock, Peter J A; Chapman, Brad; Fields, Christopher J; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather

    2016-01-01

    Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science.

  5. Bioinformatics clouds for big data manipulation

    KAUST Repository

    Dai, Lin

    2012-11-28

    As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor. 2012 Dai et al.; licensee BioMed Central Ltd.

  6. Bioinformatics clouds for big data manipulation.

    Science.gov (United States)

    Dai, Lin; Gao, Xin; Guo, Yan; Xiao, Jingfa; Zhang, Zhang

    2012-11-28

    As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor.

  7. The improving processes in the human resources management

    OpenAIRE

    Darja Holátová

    2002-01-01

    The quality of human resources management, and of the firm's products, services and prosperity, depends among other things on quality management. Managers provide the leadership and commitment necessary for creating an environment for quality improvement. Managers are responsible for their own actions, and for the development and improvement of their own work processes.

  8. 9th International Conference on Practical Applications of Computational Biology and Bioinformatics

    CERN Document Server

    Rocha, Miguel; Fdez-Riverola, Florentino; Paz, Juan

    2015-01-01

    These proceedings present recent practical applications of Computational Biology and Bioinformatics. They contain the proceedings of the 9th International Conference on Practical Applications of Computational Biology & Bioinformatics held at the University of Salamanca, Spain, on June 3rd-5th, 2015. The International Conference on Practical Applications of Computational Biology & Bioinformatics (PACBB) is an annual international meeting dedicated to emerging and challenging applied research in Bioinformatics and Computational Biology. Biological and biomedical research are increasingly driven by experimental techniques that challenge our ability to analyse, process and extract meaningful knowledge from the underlying data. The impressive capabilities of next generation sequencing technologies, together with novel and ever evolving distinct types of omics data technologies, have posed an increasingly complex set of challenges for the growing fields of Bioinformatics and Computational Biology. The analysis o...

  9. Organizational agility key factors for dynamic business process management

    OpenAIRE

    Triaa , Wafa; Gzara , Lilia; Verjus , Hervé

    2016-01-01

    International audience; For several years, Business Process Management (BPM) has been recognized as a holistic management approach that promotes business effectiveness and efficiency. Increasingly, corporations find themselves operating in business environments filled with unpredictable, complex and continuous change. Driven by these dynamic competitive conditions, they look for a dynamic management of their business processes to maintain their process performance. To be competitive, companies hav...

  10. 49 CFR 659.31 - Hazard management process.

    Science.gov (United States)

    2010-10-01

    § 659.31 Hazard management process. (a) The oversight agency must require the rail transit agency..., operational changes, or other changes within the rail transit environment. (b) The hazard management process...

  11. 76 FR 57723 - Electricity Sector Cybersecurity Risk Management Process Guideline

    Science.gov (United States)

    2011-09-16

    DEPARTMENT OF ENERGY: Electricity Sector Cybersecurity Risk Management Process Guideline. This notice requests public comment on DOE's intent to publish the Electricity Sector Cybersecurity Risk Management Process Guideline. The guideline describes a risk management process that is targeted to the specific needs of...

  12. A web services choreography scenario for interoperating bioinformatics applications

    Directory of Open Access Journals (Sweden)

    Cheung David W

    2004-03-01

    Full Text Available Abstract Background Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: (1) the platforms on which the applications run are heterogeneous, (2) their web interface is not machine-friendly, (3) they use a non-standard format for data input and output, (4) they do not exploit standards to define application interface and message exchange, and (5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. Results To demonstrate the benefit of using web services over traditional web interfaces, we compare the two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH Keywords that correlates to the input and is grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate the capability of machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard coded Java application, Collaxa BPEL Server and Taverna Workbench. The Java program functions as a web services engine and interoperates
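
    As a rough sketch of the document-style messaging described above, the snippet below builds and serializes an XML payload of microarray spot IDs; the element names and the endpoint are hypothetical and do not reflect the actual HAPI web service interface, which is why the final post is left as a comment.

```python
import xml.etree.ElementTree as ET
# import urllib.request   # only needed to actually send the message

# Document-style request: element names are invented, not the real HAPI schema.
request = ET.Element("AnnotateSpotsRequest")
for spot_id in ["AA001234", "AA005678", "AA009012"]:
    ET.SubElement(request, "SpotID").text = spot_id

payload = ET.tostring(request, encoding="unicode")
print(payload)   # <AnnotateSpotsRequest><SpotID>AA001234</SpotID>...</AnnotateSpotsRequest>

# In a deployment, the XML document would be posted to the service endpoint, e.g.:
# urllib.request.urlopen(urllib.request.Request(
#     "https://example.org/hapi-service",          # hypothetical URL
#     data=payload.encode("utf-8"),
#     headers={"Content-Type": "text/xml"}))
```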

  13. BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.

    Science.gov (United States)

    Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel

    2015-06-02

    Bioinformaticians face a range of difficulties to get locally-installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools, using an easy interface for input and output. Web services, due to their universal nature and widely known interface, constitute a very good option to achieve this goal. Bioinformatics open web services (BOWS) is a system based on generic web services produced to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS intermediates the access to registered tools by providing front-end and back-end web services. Programmers can install applications in HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then to send the results to BOWS. Programs running on ordinary computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requests, and automatically creates a web page that lists the registered applications and clients. Bioinformatics open web services registered applications can be accessed from virtually any programming language through web services, or using standard Java clients. The back-end can run in HPC clusters, allowing bioinformaticians to remotely run high-processing-demand applications directly from their machines.
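
    The front-end/back-end division of labour described above can be mimicked in a few lines. The in-memory queue below stands in for BOWS's real web services, and all names are invented; it only illustrates the submit/poll contract, not the actual BOWS API.

```python
import queue

# In-memory stand-ins for the front-end (client job submission) and back-end
# (cluster-side job pickup) services; BOWS exposes these as real web services.
jobs, results = queue.Queue(), {}

def submit(tool, params):
    """What a client would do through the front-end service: register a new job."""
    job_id = f"job-{jobs.qsize() + 1}"
    jobs.put((job_id, tool, params))
    return job_id

def worker_poll_once():
    """What a cluster-side worker would do via the back-end service: fetch and run a job."""
    job_id, tool, params = jobs.get()
    results[job_id] = f"{tool} finished with {params}"   # run the tool, then post the result

job = submit("blast", {"query": "ATGCGT..."})
worker_poll_once()
print(results[job])
```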

  14. The ATLAS data management software engineering process

    International Nuclear Information System (INIS)

    Lassnig, M; Garonne, V; Stewart, G A; Barisits, M; Serfon, C; Goossens, L; Nairz, A; Beermann, T; Vigne, R; Molfetas, A

    2014-01-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also highlight the social aspects of an environment where every action is subject to detailed scrutiny.

  15. The ATLAS data management software engineering process

    Science.gov (United States)

    Lassnig, M.; Garonne, V.; Stewart, G. A.; Barisits, M.; Beermann, T.; Vigne, R.; Serfon, C.; Goossens, L.; Nairz, A.; Molfetas, A.; Atlas Collaboration

    2014-06-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also highlight the social aspects of an environment where every action is subject to detailed scrutiny.

  16. Mathematics and evolutionary biology make bioinformatics education comprehensible.

    Science.gov (United States)

    Jungck, John R; Weisstein, Anton E

    2013-09-01

    The patterns of variation within a molecular sequence data set result from the interplay between population genetic, molecular evolutionary and macroevolutionary processes-the standard purview of evolutionary biologists. Elucidating these patterns, particularly for large data sets, requires an understanding of the structure, assumptions and limitations of the algorithms used by bioinformatics software-the domain of mathematicians and computer scientists. As a result, bioinformatics often suffers a 'two-culture' problem because of the lack of broad overlapping expertise between these two groups. Collaboration among specialists in different fields has greatly mitigated this problem among active bioinformaticians. However, science education researchers report that much of bioinformatics education does little to bridge the cultural divide, the curriculum too focused on solving narrow problems (e.g. interpreting pre-built phylogenetic trees) rather than on exploring broader ones (e.g. exploring alternative phylogenetic strategies for different kinds of data sets). Herein, we present an introduction to the mathematics of tree enumeration, tree construction, split decomposition and sequence alignment. We also introduce off-line downloadable software tools developed by the BioQUEST Curriculum Consortium to help students learn how to interpret and critically evaluate the results of standard bioinformatics analyses.
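
    One of the tree-enumeration facts such a curriculum typically starts from is that the number of distinct unrooted binary trees on n labelled taxa is (2n-5)!! = 3·5···(2n-5). The short computation below (standard theory, not code from the article) makes tangible the combinatorial explosion that motivates heuristic tree search.

```python
def unrooted_binary_trees(n):
    """Number of unrooted binary trees on n labelled taxa: (2n-5)!! for n >= 3."""
    count = 1
    for k in range(3, 2 * n - 4, 2):   # multiply 3 * 5 * ... * (2n - 5)
        count *= k
    return count

for n in (4, 5, 10, 20):
    print(n, unrooted_binary_trees(n))
# 4 -> 3, 5 -> 15, 10 -> 2027025, 20 -> roughly 2.2e20
```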

  17. Situational Script Management of Business Processes with Changeable Structure

    OpenAIRE

    Chaliy, Sergey; Chala, Oksana

    2008-01-01

    This paper considers the problem of managing business processes with a changeable structure and offers a situation-based approach to solving it. The approach is based on a situational model of business process management, according to which a process is represented as a set of situations. A script defining the necessary actions is associated with each situation. The process is managed by means of rules that formalise the functional requirements placed on processes.

  18. Torrefaction Processing for Human Solid Waste Management

    Science.gov (United States)

    Serio, Michael A.; Cosgrove, Joseph E.; Wójtowicz, Marek A.; Stapleton, Thomas J.; Nalette, Tim A.; Ewert, Michael K.; Lee, Jeffrey; Fisher, John

    2016-01-01

    This study involved a torrefaction (mild pyrolysis) processing approach that could be used to sterilize feces and produce a stable, odor-free solid product that can be stored or recycled, and also to simultaneously recover moisture. It was demonstrated that mild heating (200-250 C) in nitrogen or air was adequate for torrefaction of a fecal simulant and an analog of human solid waste (canine feces). The net result was a nearly undetectable odor (for the canine feces), complete recovery of moisture, some additional water production, a modest reduction of the dry solid mass, and the production of small amounts of gas and liquid. The liquid product is mainly water, with a small Total Organic Carbon content. The amount of solid vs gas plus liquid products can be controlled by adjusting the torrefaction conditions (final temperature, holding time), and the current work has shown that the benefits of torrefaction could be achieved in a low temperature range (< 250 C). These temperatures are compatible with the PTFE bag materials historically used by NASA for fecal waste containment and will reduce the energy consumption of the process. The solid product was a dry material that did not support bacterial growth and was hydrophobic relative to the starting material. In the case of canine feces, the solid product was a mechanically friable material that could be easily compacted to a significantly smaller volume (approx. 50%). The proposed Torrefaction Processing Unit (TPU) would be designed to be compatible with the Universal Waste Management System (UWMS), now under development by NASA. A stand-alone TPU could be used to treat the canister from the UWMS, along with other types of wet solid wastes, with either conventional or microwave heating. Over time, a more complete integration of the TPU and the UWMS could be achieved, but will require design changes in both units.

  19. Overview of an energy management process

    International Nuclear Information System (INIS)

    Chantraine, P.

    2004-01-01

    Invista is a global and vertically integrated fiber, resin and intermediates business which belonged to Dupont but is now a subsidiary of Koch Industries. A background of Invista and its former relationship with Dupont was presented. This presentation was based on goals and work done as Dupont Canada Inc., up to the end of 2003. Details of Invista's approach to climate change in Canada were provided along with the company's relationship with Natural Resources Canada. The historical position of Dupont Canada was reviewed in detail, including their commitment to voluntary approach; participation in the national process; their goal of 85 per cent reduction in greenhouse gas (GHG) by 2000; 93 per cent reduction in nitrous oxide emissions; energy efficiency goals; and continuing growth of the company. An outline and mission of the energy management team established in 1974 was presented, with details of the 1974 oil shortage, stabilization in the 1980s through to rises in electricity prices in the 1990s and concerns over climate change in recent years. Details of the team's operational procedures were presented. Results were presented in graph form and include: total energy use from 1972 to 2003 as well as cumulative energy conservation projects and resulting energy savings. Examples of activities and projects were provided, including details of energy performance contracting. It was concluded that in order to conserve energy, top management support was necessary, as well as passion and dedication in both leaders and teams. A broad scope for creativity in finding solutions within evolving constraints was also important, as was the nurturing of capability, capacity and recognition for results achieved. tabs., figs

  20. High-throughput bioinformatics with the Cyrille2 pipeline system

    Directory of Open Access Journals (Sweden)

    de Groot Joost CW

    2008-02-01

    Full Text Available Abstract Background Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: (1) a web-based, graphical user interface (GUI) that enables a pipeline operator to manage the system; (2) the Scheduler, which forms the functional core of the system and which tracks what data enters the system and determines what jobs must be scheduled for execution; and (3) the Executor, which searches for scheduled jobs and executes these on a compute cluster. Conclusion The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high throughput, flexible bioinformatics pipelines.
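
    The Scheduler/Executor split described in the abstract can be sketched generically; the snippet below is not Cyrille2 code, and the step names and dependencies are invented, but it shows the pattern of scheduling jobs as their input data becomes available and feeding each job's output back into the system.

```python
# Generic Scheduler/Executor sketch in the Cyrille2 style; step names are invented.
pipeline = [("trim_reads", "raw_reads"),       # (job, the data it waits for)
            ("map_reads", "trim_reads"),
            ("call_variants", "map_reads")]

available, scheduled = set(), []

def scheduler(new_data):
    """Track what data has entered the system and decide which jobs must be scheduled."""
    available.add(new_data)
    for job, needed in pipeline:
        if job not in available and job not in scheduled and needed in available:
            scheduled.append(job)

def executor():
    """Search for scheduled jobs and execute them (a compute cluster in the real system)."""
    while scheduled:
        job = scheduled.pop(0)
        print("running", job)
        scheduler(job)             # the job's output re-enters the system as new data

scheduler("raw_reads")             # new data enters the system
executor()                         # runs trim_reads, map_reads, call_variants in order
```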

  1. Processes, Performance Drivers and ICT Tools in Human Resources Management

    OpenAIRE

    Oškrdal Václav; Pavlíček Antonín; Jelínková Petra

    2011-01-01

    This article presents an insight into processes, performance drivers and ICT tools in the human resources (HR) management area. On the basis of a modern approach to HR management, a set of business processes that are handled by today’s HR managers is defined. Consequently, the concept of ICT-supported performance drivers and their relevance in the area of HR management, as well as the relationship between HR business processes, performance drivers and ICT tools, are defined. The theoretical outcomes ...

  2. On Representing Instance Changes in Adaptive Process Management Systems.

    NARCIS (Netherlands)

    Rinderle, S.B.; Kreher, U; Lauer, M.; Dadam, P.; Reichert, M.U.

    2006-01-01

    By separating the process logic from the application code process management systems (PMS) offer promising perspectives for automation and management of business processes. However, the added value of PMS strongly depends on their ability to support business process changes which can affect the

  3. Understanding and Managing the Assessment Process

    Science.gov (United States)

    Gene Lessard; Scott Archer; John R. Probst; Sandra Clark

    1999-01-01

    Taking an ecological approach to management, or ecosystem management, is a developing approach for managing natural resources within the context of large geographic scales and over multiple time frames. Recently, the Council on Environmental Quality (CEQ) (IEMTF 1995) defined an ecosystem as "...an interconnected community of living things, including humans, and...

  4. Integrating the autonomous subsystems management process

    Science.gov (United States)

    Ashworth, Barry R.

    1992-01-01

    Ways in which the ranking of the Space Station Module Power Management and Distribution testbed may be achieved and an individual subsystem's internal priorities may be managed within the complete system are examined. The application of these results in the integration and performance leveling of the autonomously managed system is discussed.

  5. Bioinformatics for Precision Medicine in Oncology: principles and application to the SHIVA clinical trial

    Directory of Open Access Journals (Sweden)

    Nicolas eServant

    2014-05-01

    Full Text Available Precision medicine (PM) requires the delivery of individually adapted medical care based on the genetic characteristics of each patient and his/her tumor. The last decade witnessed the development of high-throughput technologies such as microarrays and next-generation sequencing which paved the way to PM in the field of oncology. While the cost of these technologies decreases, we are facing an exponential increase in the amount of data produced. Our ability to use this information in daily practice relies strongly on the availability of an efficient bioinformatics system that assists in the translation of knowledge from the bench towards molecular targeting and diagnosis. Clinical trials and routine diagnoses constitute different approaches, both requiring a strong bioinformatics environment capable of (i) warranting the integration and the traceability of data, (ii) ensuring the correct processing and analyses of genomic data and (iii) applying well-defined and reproducible procedures for workflow management and decision-making. To address these issues, a seamless information system was developed at Institut Curie which facilitates the data integration and tracks in real-time the processing of individual samples. Moreover, computational pipelines were developed to identify reliably genomic alterations and mutations from the molecular profiles of each patient. After a rigorous quality control, a meaningful report is delivered to the clinicians and biologists for the therapeutic decision. The complete bioinformatics environment and the key points of its implementation are presented in the context of the SHIVA clinical trial, a multicentric randomized phase II trial comparing targeted therapy based on tumor molecular profiling versus conventional therapy in patients with refractory cancer. The numerous challenges faced in practice during the setting up and the conduct of this trial are discussed as an illustration of PM application.
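
    A minimal sketch of the sample-level traceability described above: each sample carries a timestamped trail of the pipeline steps it has passed through. The step names and identifiers are invented and do not describe Institut Curie's actual system.

```python
from datetime import datetime, timezone

samples = {}   # sample identifier -> list of (step, timestamp); an invented, minimal stand-in

def track(sample_id, step):
    """Record, with a timestamp, which processing step a sample has reached."""
    samples.setdefault(sample_id, []).append(
        (step, datetime.now(timezone.utc).isoformat()))

for step in ("sequencing_received", "alignment_done", "variants_called",
             "qc_passed", "report_delivered"):
    track("SAMPLE-0001", step)

for step, when in samples["SAMPLE-0001"]:
    print(when, step)
```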

  6. Business process of reputation management of food industry enterprises

    Directory of Open Access Journals (Sweden)

    Derevianko Olena. H.

    2014-01-01

    Full Text Available The goal of the article is to develop a methodical basis for reputation management, aimed at formalising theoretical provisions and explaining how to organise reputation management at food industry enterprises. The article shows the potential of using the Business Process Management concept in reputation management. Using the diagram of the Reputation Management business process environment, the article shows its key participants (suppliers and clients of the business process) and identifies their place in the formation of the enterprise's reputation. It also shows that reputation management should be considered a business process of the highest level of management. Construction of the flow structure of the Reputation Management business process uncovers the logic of the interrelation of inputs and outputs within the specified main stages of the business process: assessment of the current state of reputation, collection of information about stakeholders, identification of PR strategy goals, planning of necessary resources, realisation of the PR strategy, and assessment of efficiency and process monitoring. The article offers the flow, functional and organisational structures of the Reputation Management business process for food industry enterprises. Moreover, justification of the functional and organisational structures of the Reputation Management business process makes it possible to distribute the functions of reputation management between specific executors and establish responsibility for each stage of the business process.

  7. Natural language processing and advanced information management

    Science.gov (United States)

    Hoard, James E.

    1989-01-01

    Integrating diverse information sources and application software in a principled and general manner will require a very capable advanced information management (AIM) system. In particular, such a system will need a comprehensive addressing scheme to locate the material in its docuverse. It will also need a natural language processing (NLP) system of great sophistication. It seems that the NLP system must serve three functions. First, it provides a natural language interface (NLI) for the users. Second, it serves as the core component that understands and makes use of the real-world interpretations (RWIs) contained in the docuverse. Third, it enables the reasoning specialists (RSs) to arrive at conclusions that can be transformed into procedures that will satisfy the users' requests. The best candidate for an intelligent agent that can satisfactorily make use of RSs and transform documents (TDs) appears to be an object-oriented database (OODB). OODBs have, apparently, an inherent capacity to use the large numbers of RSs and TDs that will be required by an AIM system and an inherent capacity to use them in an effective way.

  8. Evaluating the Effectiveness of a Practical Inquiry-Based Learning Bioinformatics Module on Undergraduate Student Engagement and Applied Skills

    Science.gov (United States)

    Brown, James A. L.

    2016-01-01

    A pedagogic intervention, in the form of an inquiry-based peer-assisted learning project (as a practical student-led bioinformatics module), was assessed for its ability to increase students' engagement, practical bioinformatic skills and process-specific knowledge. Elements assessed were process-specific knowledge following module completion,…

  9. Study on a Process-oriented Knowledge Management Model

    OpenAIRE

    Zhang, Lingling; Li, Jun; Zheng, Xiuyu; Li, Xingsen; Shi, Yong

    2007-01-01

    Knowledge has become the most important resource of enterprises. Process-oriented knowledge management (POKM) is a new and valuable research field, and may be the most practical method of dealing with difficulties in knowledge management. The paper analyzes the background, hypotheses and purpose of POKM, defines process knowledge, and gives a process-oriented knowledge management model. The model integrates knowledge, process, human, and technology. It can improve the decision support capabili...

  10. A Practical Decision-Analysis Process for Forest Ecosystem Management

    Science.gov (United States)

    H. Michael Rauscher; F. Thomas Lloyd; David L. Loftis; Mark J. Twery

    2000-01-01

    Many authors have pointed out the need to firm up the 'fuzzy' ecosystem management paradigm and develop operationally practical processes to allow forest managers to accommodate more effectively the continuing rapid change in societal perspectives and goals. There are three spatial scales where clear, precise, practical ecosystem management processes are...

  11. BIRCH: A user-oriented, locally-customizable, bioinformatics system

    Science.gov (United States)

    Fristensky, Brian

    2007-01-01

    Background Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. Results BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. Conclusion BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere. PMID:17291351

  12. BIRCH: A user-oriented, locally-customizable, bioinformatics system

    Directory of Open Access Journals (Sweden)

    Fristensky Brian

    2007-02-01

    Full Text Available Abstract Background Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. Results BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. Conclusion BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere.

  13. BUSINESS PROCESSES TRANSFORMATION IN THE METHODOLOGY OF MULTILEVEL FINANCIAL MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Andrey G. Mikheev

    2016-01-01

    Full Text Available The article discusses the application of a process approach to financial management and describes a multilevel financial management methodology. The methodology is based on delegating financial management functions to the downstream divisions of the organization, process automation, the transfer of financial resources between units at different levels of the credit institution's hierarchical structure, implementation of a process-based financial management mechanism, execution of business processes in a computing environment, strategic management of the organization by changing coefficients that parameterize the decentralized control mechanism, and construction of «fast» financial indicators that take into account the terms of financial resource transfer transactions and the effective transformation of business processes. The article focuses on managing a credit institution's business through the application of process transformation to funds-transfer business processes.

  14. Bioinformatic tools for PCR Primer design

    African Journals Online (AJOL)

    ES

    Bioinformatics is an emerging scientific discipline that uses information … complex biological questions. … and computer programs for various purposes of primer design … polymerase chain reaction: Human Immunodeficiency Virus 1 model studies.
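
    One of the calculations that primer-design programs automate is an estimate of a primer's melting temperature. The snippet below uses the simple Wallace rule, Tm ≈ 2·(A+T) + 4·(G+C) in °C for short oligos; it is a generic illustration of what such tools compute, not code from the article, and the primer sequence is invented.

```python
def wallace_tm(primer):
    """Approximate melting temperature (deg C) of a short oligo by the Wallace rule."""
    primer = primer.upper()
    at = primer.count("A") + primer.count("T")
    gc = primer.count("G") + primer.count("C")
    return 2 * at + 4 * gc                    # 2*(A+T) + 4*(G+C)

def gc_content(primer):
    """GC percentage, another quantity routinely reported by primer-design tools."""
    primer = primer.upper()
    return 100 * (primer.count("G") + primer.count("C")) / len(primer)

primer = "AGCTTAGGCTAGCGTAAGCT"               # invented example sequence
print(f"Tm  = {wallace_tm(primer)} C")        # 60 C for this sequence
print(f"GC% = {gc_content(primer):.0f}")      # 50
```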

  15. Challenge: A Multidisciplinary Degree Program in Bioinformatics

    Directory of Open Access Journals (Sweden)

    Mudasser Fraz Wyne

    2006-06-01

    Full Text Available Bioinformatics is a new field that is poorly served by any of the traditional science programs in Biology, Computer science or Biochemistry. Known to be a rapidly evolving discipline, Bioinformatics has emerged from experimental molecular biology and biochemistry as well as from the artificial intelligence, database, pattern recognition, and algorithms disciplines of computer science. While institutions are responding to this increased demand by establishing graduate programs in bioinformatics, entrance barriers for these programs are high, largely due to the significant prerequisite knowledge which is required, both in the fields of biochemistry and computer science. Although many schools currently have or are proposing graduate programs in bioinformatics, few are actually developing new undergraduate programs. In this paper I explore the blend of a multidisciplinary approach, discuss the response of academia and highlight challenges faced by this emerging field.

  16. Deciphering psoriasis. A bioinformatic approach.

    Science.gov (United States)

    Melero, Juan L; Andrades, Sergi; Arola, Lluís; Romeu, Antoni

    2018-02-01

    Psoriasis is an immune-mediated, inflammatory and hyperproliferative disease of the skin and joints. The cause of psoriasis is still unknown. The fundamental feature of the disease is the hyperproliferation of keratinocytes and the recruitment of cells from the immune system in the region of the affected skin, which leads to deregulation of many well-known gene expressions. Based on data mining and bioinformatic scripting, here we show a new dimension of the effect of psoriasis at the genomic level. Using our own pipeline of scripts in Perl and MySQL and based on the freely available NCBI Gene Expression Omnibus (GEO) database: DataSet Record GDS4602 (Series GSE13355), we explore the extent of the effect of psoriasis on gene expression in the affected tissue. We give greater insight into the effects of psoriasis on the up-regulation of some genes in the cell cycle (CCNB1, CCNA2, CCNE2, CDK1) or the dynamin system (GBPs, MXs, MFN1), as well as the down-regulation of typical antioxidant genes (catalase, CAT; superoxide dismutases, SOD1-3; and glutathione reductase, GSR). We also provide a complete list of the human genes and how they respond in a state of psoriasis. Our results show that psoriasis affects all chromosomes and many biological functions. If we further consider the stable and mitotically inheritable character of the psoriasis phenotype, and the influence of environmental factors, then it seems that psoriasis has an epigenetic origin. This fits well with the strong hereditary character of the disease as well as its complex genetic background. Copyright © 2017 Japanese Society for Investigative Dermatology. Published by Elsevier B.V. All rights reserved.
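
    A toy illustration of the comparison that underlies statements such as "CAT and SOD1-3 are down-regulated": compute log2 fold changes between lesional and normal expression values. The numbers below are invented for the example and are not taken from GDS4602.

```python
import math

# Invented expression values (lesional vs. normal skin); NOT data from GDS4602.
expression = {"CCNB1": (820.0, 190.0), "CDK1": (640.0, 150.0),
              "CAT":   (210.0, 760.0), "SOD2": (330.0, 900.0)}

for gene, (lesional, normal) in expression.items():
    log2_fc = math.log2(lesional / normal)
    direction = "up" if log2_fc > 0 else "down"
    print(f"{gene}: log2FC = {log2_fc:+.2f} ({direction}-regulated in lesional skin)")
```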

  17. Integration mockup and process material management system

    Science.gov (United States)

    Verble, Adas James, Jr.

    1992-01-01

    Work to define and develop a full scale Space Station Freedom (SSF) mockup with the flexibility to evolve into future designs, to validate techniques for maintenance and logistics and verify human task allocations and support trade studies is described. This work began in early 1985 and ended in August, 1991. The mockups are presently being used at MSFC in Building 4755 as a technology and design testbed, as well as for public display. Micro Craft also began work on the Process Material Management System (PMMS) under this contract. The PMMS simulator was a sealed enclosure for testing to identify liquid, gaseous, and particulate samples and specimens, including urine, waste water, condensate, hazardous gases, surrogate gases, liquids, and solids. The SSF would require many trade studies to validate techniques for maintenance and logistics and verify system task allocations; it was necessary to develop a full scale mockup which would be representative of the current SSF design with the ease of changing those designs as the SSF design evolved and changed. The tasks defined for Micro Craft were to provide the personnel, services, tools, and materials for the SSF mockup which would consist of four modules, nodes, interior components, and part task mockups of MSFC responsible engineering systems. This included the Environmental Control and Life Support System (ECLSS) testbed. For the initial study, the mockups were low fidelity, soft mockups of graphics art bottle, and other low cost materials, which evolved into higher fidelity mockups as the R&D design evolved, by modifying or rebuilding, an important cost saving factor in the design process. We designed, fabricated, and maintained the full size mockup shells and support stands. The shells consisted of cylinders, end cones, rings, longerons, docking ports, crew airlocks, and windows. The ECLSS required a heavier cylinder to support the ECLSS systems test program. Details of this activity will be covered. Support stands were

  18. Concepts and introduction to RNA bioinformatics

    DEFF Research Database (Denmark)

    Gorodkin, Jan; Hofacker, Ivo L.; Ruzzo, Walter L.

    2014-01-01

    RNA bioinformatics and computational RNA biology have emerged from implementing methods for predicting the secondary structure of single sequences. The field has evolved to exploit multiple sequences to take evolutionary information into account, such as compensating (and structure preserving) base … for interactions between RNA and proteins. Here, we introduce the basic concepts of predicting RNA secondary structure relevant to the further analyses of RNA sequences. We also provide pointers to methods addressing various aspects of RNA bioinformatics and computational RNA biology.

  19. Managing Change in Software Process Improvement

    DEFF Research Database (Denmark)

    Mathiassen, Lars; Ngwenyama, Ojelanki K.; Aaen, Ivan

    2005-01-01

    When software managers initiate SPI, most are ill prepared for the scale and complexity of the organizational change involved. Although they typically know how to deal with large software projects, few managers have sufficient experience with projects that transform organizations. To succeed with...

  20. Computerizing Maintenance Management Improves School Processes.

    Science.gov (United States)

    Conroy, Pat

    2002-01-01

    Describes how a Computerized Maintenance Management System (CMMS), a centralized maintenance operations database that facilitates work order procedures and staff directives, can help individual school campuses and school districts to manage maintenance. Presents the benefits of CMMS and things to consider in CMMS selection. (EV)

  1. Managing Automation: A Process, Not a Project.

    Science.gov (United States)

    Hoffmann, Ellen

    1988-01-01

    Discussion of issues in management of library automation includes: (1) hardware, including systems growth and contracts; (2) software changes, vendor relations, local systems, and microcomputer software; (3) item and authority databases; (4) automation and library staff, organizational structure, and managing change; and (5) environmental issues,…

  2. Investigating the success of operational business process management systems

    NARCIS (Netherlands)

    Poelmans, S.; Reijers, H.A.; Recker, J.

    2013-01-01

    Business process management systems (BPMS) belong to a class of enterprise information systems that are characterized by the dependence on explicitly modeled process logic. Through the process logic, it is relatively easy to manage explicitly the routing and allocation of work items along a business

  3. Survey process quality: a question of healthcare manager approach.

    Science.gov (United States)

    Nilsson, Petra; Blomqvist, Kerstin

    2017-08-14

    Purpose The purpose of this paper is to explore how healthcare first-line managers think about and act regarding workplace survey processes. Design/methodology/approach This interview study was performed at a hospital in south Sweden. First-line healthcare managers (n=24) volunteered. The analysis was inspired by phenomenography, which aims to describe the ways in which different people experience a phenomenon. The phenomenon was the workplace health promotion (WHP) survey process. Findings Four main WHP survey process approaches were identified among the managers: as a possibility, as a competition, as a work task among others and as an imposition. For each, three common subcategories emerged: how managers stated challenges and support from hospital management; described their own work group and collaboration with other managers; and expressed themselves and their situation in their roles as first-line managers. Practical implications Insights into how hospital management can understand their first-line managers' motivation for survey processes, and practical suggestions on how managers can work proactively at organizational, group and individual level, are presented. Originality/value Usually these studies focus on those who should respond to a survey, not those who should run the survey process. Focusing on managers and not co-workers can lead to more committed and empowered managers and thereby success in survey processes.

  4. Fundamentals of bioinformatics and computational biology methods and exercises in matlab

    CERN Document Server

    Singh, Gautam B

    2015-01-01

    This book offers comprehensive coverage of all the core topics of bioinformatics, and includes practical examples completed using the MATLAB bioinformatics toolbox™. It is primarily intended as a textbook for engineering and computer science students attending advanced undergraduate and graduate courses in bioinformatics and computational biology. The book develops bioinformatics concepts from the ground up, starting with an introductory chapter on molecular biology and genetics. This chapter will enable physical science students to fully understand and appreciate the ultimate goals of applying the principles of information technology to challenges in biological data management, sequence analysis, and systems biology. The first part of the book also includes a survey of existing biological databases, tools that have become essential in today’s biotechnology research. The second part of the book covers methodologies for retrieving biological information, including fundamental algorithms for sequence compar...

  5. ZBIT Bioinformatics Toolbox: A Web-Platform for Systems Biology and Expression Data Analysis.

    Science.gov (United States)

    Römer, Michael; Eichner, Johannes; Dräger, Andreas; Wrzodek, Clemens; Wrzodek, Finja; Zell, Andreas

    2016-01-01

    Bioinformatics analysis has become an integral part of research in biology. However, installation and use of scientific software can be difficult and often requires technical expert knowledge. Reasons are dependencies on certain operating systems or required third-party libraries, missing graphical user interfaces and documentation, or nonstandard input and output formats. In order to make bioinformatics software easily accessible to researchers, we here present a web-based platform. The Center for Bioinformatics Tuebingen (ZBIT) Bioinformatics Toolbox provides web-based access to a collection of bioinformatics tools developed for systems biology, protein sequence annotation, and expression data analysis. Currently, the collection encompasses software for conversion and processing of community standards SBML and BioPAX, transcription factor analysis, and analysis of microarray data from transcriptomics and proteomics studies. All tools are hosted on a customized Galaxy instance and run on a dedicated computation cluster. Users only need a web browser and an active internet connection in order to benefit from this service. The web platform is designed to facilitate the usage of the bioinformatics tools for researchers without advanced technical background. Users can combine tools for complex analyses or use predefined, customizable workflows. All results are stored persistently and reproducible. For each tool, we provide documentation, tutorials, and example data to maximize usability. The ZBIT Bioinformatics Toolbox is freely available at https://webservices.cs.uni-tuebingen.de/.

  6. Framework for Knowledge Management Processes in Supply Chain

    Directory of Open Access Journals (Sweden)

    Mohsen Shafiei Nikabadi

    2014-02-01

    The innovative aspect of this research is to provide a comprehensive framework for knowledge management processes in the supply chain of the automotive industry, with main indicators for each process. Several investigations of knowledge management have been made, but specific research on knowledge management processes in the supply chain has not been observed. Providing the framework, and indicators for each of its components, is thus the innovation of this research.

  7. Ergonomics Integration Improving Production Process Management in Enterprises of Latvia

    OpenAIRE

    Henrijs Kaļķis

    2013-01-01

    Doctoral thesis ERGONOMICS INTEGRATION IMPROVING PRODUCTION PROCESS MANAGEMENT IN ENTERPRISES OF LATVIA. Annotation: Ergonomics integration in process management has great significance for organisations' growth in productivity. It is a new approach to entrepreneurship and business strategy, where ergonomic aspects and values are taken into account to ensure effective process management and the profitability of enterprises. This study is aimed at solving the problem of e...

  8. Education management process implementation of reforms

    Directory of Open Access Journals (Sweden)

    A. V. Kondratyeva

    2013-12-01

    The dissertation research examines the problem under study. This article presents the research material and evaluates different points of view on the issue of a systematic approach to using educational management in the implementation of reforms.

  9. RISK MANAGEMENT MECHANISM OF GRAIN PROCESSING ENTERPRISES

    Directory of Open Access Journals (Sweden)

    I. P. Bogomolova

    2013-01-01

    The article defines the main characteristics of a risk management system (its purpose, properties, principles, and requirements) and also considers a possible risk management mechanism for flour-grinding enterprises.

  10. Total Ore Processing Integration and Management

    Energy Technology Data Exchange (ETDEWEB)

    Leslie Gertsch

    2006-05-15

    This report outlines the technical progress achieved for project DE-FC26-03NT41785 (Total Ore Processing Integration and Management) during the period 01 January through 31 March of 2006. (1) Work in Progress: Minntac Mine--Graphical analysis of drill monitor data moved from two-dimensional horizontal patterns to vertical variations in measured and calculated parameters. The rock quality index and the two dimensionless (π) indices developed by Kewen Yin of the University of Minnesota are used by Minntac Mine to design their blasts, but the drill monitor data from any given pattern is obviously not available for the design of that shot. Therefore, the blast results--which are difficult to quantify in a short time--must be back-analyzed for comparison with the drill monitor data to be useful for subsequent blast designs. π1 indicates the performance of the drill, while π2 is a measure of the rock resistance to drilling. As would be expected, since a drill tends to perform better in rock that offers little resistance, π1 and π2 are strongly inversely correlated; the relationship is a power function rather than simply linear. Low values of each π index tend to be quantized, indicating that these two parameters may be most useful above certain minimum magnitudes. (2) Work in Progress: Hibtac Mine--Statistical examination of a data set from Hibtac Mine (Table 1) shows that incorporating information on the size distribution of material feeding from the crusher to the autogenous mills improves the predictive capability of the model somewhat (43% vs. 44% correlation coefficient), but a more important component is production data from preceding days (26% vs. 44% correlation coefficient), determined using exponentially weighted moving average predictive variables. This lag effect likely reflects the long and varied residence times of the different size fragments in the grinding mills. The rock sizes are also correlated with the geologic
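    The Hibtac Mine analysis above relies on exponentially weighted moving average (EWMA) predictive variables built from production data of preceding days. The sketch below shows one way such a lagged predictor can be computed; the smoothing factor and throughput figures are made-up values for illustration, not data from the report.

```python
# Hedged sketch: exponentially weighted moving average (EWMA) of daily
# production figures, one way to build the lagged predictor variables the
# abstract mentions. The smoothing factor and tonnages are made-up values.

def ewma(values, alpha=0.3):
    """Return the EWMA series for a list of daily observations."""
    smoothed, current = [], None
    for v in values:
        current = v if current is None else alpha * v + (1 - alpha) * current
        smoothed.append(current)
    return smoothed

daily_throughput = [980, 1010, 995, 1040, 1025, 990]  # illustrative tonnages
print(ewma(daily_throughput))
```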

  11. Managing the high level waste nuclear regulatory commission licensing process

    International Nuclear Information System (INIS)

    Baskin, K.P.

    1992-01-01

    This paper reports that the process for obtaining Nuclear Regulatory Commission permits for the high level waste storage facility is basically the same process commercial nuclear power plants followed to obtain construction permits and operating licenses for their facilities. Therefore, the experience from licensing commercial reactors can be applied to the high level waste facility. Proper management of the licensing process will be the key to the successful project. The management of the licensing process was categorized into four areas as follows: responsibility, organization, communication and documentation. Drawing on experience from nuclear power plant licensing and basic management principles, the management requirement for successfully accomplishing the project goals are discussed

  12. Life Support Systems: Wastewater Processing and Water Management

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Exploration Systems (AES) Life Support Systems project Wastewater Processing and Water Management task: Within an integrated life support system, water...

  13. Human resources management in a translation process

    OpenAIRE

    Rogelj, Jure

    2015-01-01

    The purpose of the web application development is the modernization of the current data acquisition and management model for new and existing translators in the company Iolar d.o.o. Previously data on translators who signed up to work in the company were entered multiple times as they were entered through several entry points. The acquired data were then manually entered into an MS Excel sheet and the Projetex program. We analyzed the current data acquisition and management model as well ...

  14. Human resources management in a translation process

    OpenAIRE

    Rogelj, Jure

    2014-01-01

    The purpose of the web application development is the modernization of the current data acquisition and management model for new and existing translators in the company Iolar d.o.o. Previously data on translators who signed up to work in the company were entered multiple times as they were entered through several entry points. The acquired data were then manually entered into an MS Excel sheet and the Projetex program. We analyzed the current data acquisition and management model as well ...

  15. Organizational structure features supporting knowledge management processes

    OpenAIRE

    Claver-Cortés, Enrique; Zaragoza Sáez, Patrocinio del Carmen; Pertusa-Ortega, Eva

    2007-01-01

    Purpose – The idea that knowledge management can be a potential source of competitive advantage has gained strength in the last few years. However, a number of business actions are needed to generate an appropriate environment and infrastructure for knowledge creation, transfer and application. Among these actions there stands out the design of an organizational structure, the link of which with knowledge management is the main concern here. More specifically, the present paper has as its aim...

  16. Towards an Evaluation Framework for Business Process Integration and Management

    NARCIS (Netherlands)

    Mutschler, B.B.; Reichert, M.U.; Bumiller, J.

    2005-01-01

    Process-awareness in enterprise computing is a must in order to adequately support business processes. Particularly the interoperability of the (process-oriented) business information systems and the management of a company’s process map are difficult to handle. Process-oriented approaches (like

  17. GOBLET: The Global Organisation for Bioinformatics Learning, Education and Training

    Science.gov (United States)

    Atwood, Teresa K.; Bongcam-Rudloff, Erik; Brazas, Michelle E.; Corpas, Manuel; Gaudet, Pascale; Lewitter, Fran; Mulder, Nicola; Palagi, Patricia M.; Schneider, Maria Victoria; van Gelder, Celia W. G.

    2015-01-01

    In recent years, high-throughput technologies have brought big data to the life sciences. The march of progress has been rapid, leaving in its wake a demand for courses in data analysis, data stewardship, computing fundamentals, etc., a need that universities have not yet been able to satisfy—paradoxically, many are actually closing “niche” bioinformatics courses at a time of critical need. The impact of this is being felt across continents, as many students and early-stage researchers are being left without appropriate skills to manage, analyse, and interpret their data with confidence. This situation has galvanised a group of scientists to address the problems on an international scale. For the first time, bioinformatics educators and trainers across the globe have come together to address common needs, rising above institutional and international boundaries to cooperate in sharing bioinformatics training expertise, experience, and resources, aiming to put ad hoc training practices on a more professional footing for the benefit of all. PMID:25856076

  18. Best practices in bioinformatics training for life scientists.

    KAUST Repository

    Via, Allegra

    2013-06-25

    The mountains of data thrusting from the new landscape of modern high-throughput biology are irrevocably changing biomedical research and creating a near-insatiable demand for training in data management and manipulation and data mining and analysis. Among life scientists, from clinicians to environmental researchers, a common theme is the need not just to use, and gain familiarity with, bioinformatics tools and resources but also to understand their underlying fundamental theoretical and practical concepts. Providing bioinformatics training to empower life scientists to handle and analyse their data efficiently, and progress their research, is a challenge across the globe. Delivering good training goes beyond traditional lectures and resource-centric demos, using interactivity, problem-solving exercises and cooperative learning to substantially enhance training quality and learning outcomes. In this context, this article discusses various pragmatic criteria for identifying training needs and learning objectives, for selecting suitable trainees and trainers, for developing and maintaining training skills and evaluating training quality. Adherence to these criteria may help not only to guide course organizers and trainers on the path towards bioinformatics training excellence but, importantly, also to improve the training experience for life scientists.

  19. 2nd Colombian Congress on Computational Biology and Bioinformatics

    CERN Document Server

    Cristancho, Marco; Isaza, Gustavo; Pinzón, Andrés; Rodríguez, Juan

    2014-01-01

    This volume compiles accepted contributions for the 2nd Edition of the Colombian Computational Biology and Bioinformatics Congress CCBCOL, after a rigorous review process in which 54 papers were accepted for publication from 119 submitted contributions. Bioinformatics and Computational Biology are areas of knowledge that have emerged due to advances that have taken place in the Biological Sciences and their integration with the Information Sciences. The expansion of projects involving the study of genomes has led the way in the production of vast amounts of sequence data which needs to be organized, analyzed and stored to understand phenomena associated with living organisms related to their evolution, behavior in different ecosystems, and the development of applications that can be derived from this analysis.

  20. TWRS process engineering data management plan

    Energy Technology Data Exchange (ETDEWEB)

    Adams, M.R.

    1997-05-12

    The Tank Characterization Data Management (TCDM) system provides customers and users with data and information of known and acceptable quality when they are needed, in the form they are needed, and at a reasonable cost. The TCDM mission will be accomplished by the following: (1) maintaining and managing tank characterization data and information based on business needs and objectives including transfer of ownership to future contractors; (2) capturing data where it originates and entering it only once to control data consistency; electronic data and information management shall be emphasized to the extent practicable; (3) establishing data quality standards, and managing and certifying databases and data sources against these standards to maintain the proper level of data and information quality consistent with the importance of the data and information; data obtained at high cost with significant implications for decision making regarding tank safety and/or disposal will be maintained and managed at the highest necessary levels of quality; (4) establishing and enforcing data management standards for the Tank Characterization Database (TCD) and supporting data sources including providing mechanisms for discovering and correcting data errors before they propagate; (5) emphasizing electronic data sharing with all authorized users, customers, contractors, and stakeholders to the extent practicable; (6) safeguarding data and information from unauthorized alteration or destruction; (7) providing standards for electronic information deliverables to subcontractors and vendors to achieve uniformity in electronic data management; and (8) investing in new technology (hardware and/or software) as prudent and necessary to accomplish the mission in an efficient and effective manner.

  1. An Application of Business Process Management to Health Care Facilities.

    Science.gov (United States)

    Hassan, Mohsen M D

    The purpose of this article is to help health care facility managers and personnel identify significant elements of their facilities to address, and steps and actions to follow, when applying business process management to them. The ABPMP (Association of Business Process Management Professionals) life-cycle model of business process management is adopted, and steps from Lean, business process reengineering, and Six Sigma, and actions from operations management are presented to implement it. Managers of health care facilities can find in business process management a more comprehensive approach to improving their facilities than Lean, Six Sigma, business process reengineering, and ad hoc approaches, one that does not conflict with them because many of their elements can be included under its umbrella. Furthermore, the suggested application of business process management can guide them, relieving them of the need to select among these approaches, as well as provide them with specific steps and actions that they can follow. This article fills a gap in the literature by presenting a much-needed comprehensive application of business process management to health care facilities that has specific steps and actions for implementation.

  2. Defining Incident Management Processes for CSIRTs: A Work in Progress

    National Research Council Canada - National Science Library

    Alberts, Chris; Dorofee, Audrey; Killcrece, Georgia; Ruefle, Robin; Zajicek, Mark

    2004-01-01

    .... Workflow diagrams and descriptions are provided for each of these processes. One advantage of the model is that it enables examination of incident management processes that cross organizational boundaries, both internally and externally...

  3. Navigating the changing learning landscape: perspective from bioinformatics.ca

    OpenAIRE

    Brazas, Michelle D.; Ouellette, B. F. Francis

    2013-01-01

    With the advent of YouTube channels in bioinformatics, open platforms for problem solving in bioinformatics, active web forums in computing analyses and online resources for learning to code or use a bioinformatics tool, the more traditional continuing education bioinformatics training programs have had to adapt. Bioinformatics training programs that solely rely on traditional didactic methods are being superseded by these newer resources. Yet such face-to-face instruction is still invaluable...

  4. Service management process maps your route to service excellence

    CERN Document Server

    Associates, Computer

    2007-01-01

    ITIL® has become the de facto standard in Service Management best practice processes and the lens through which the value of IT services is viewed and measured. ITIL, with its emphasis on standardizing IT processes and building a common language, helps you better manage and optimize the cost and quality of IT services.As organizations look to implement quality Service Management processes, the same questions repeatedly arise: How do all these processes interface with each other? How do the processes work within a culture of change and evolution? How can we easily communicate

  5. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    Science.gov (United States)

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    Due to the coming deluge of genome data, the need to store and process large-scale genome data, provide easy access to biomedical analysis tools, and support efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.
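    The platform described above builds on Galaxy, which exposes a REST API keyed by a per-user API key. The sketch below shows one generic way to list the workflows available on a Galaxy server with the requests library; the server URL and API key are placeholders, the exact response fields can vary between Galaxy releases, and this is not the Globus-based tooling the paper itself adds.

```python
# Hedged sketch: listing the workflows available on a Galaxy server through
# its REST API. The server URL and API key are placeholders, and response
# fields may differ between Galaxy releases; this is not the Globus-based
# tooling the paper itself adds on top of Galaxy.
import requests

GALAXY_URL = "https://galaxy.example.org"  # placeholder server
API_KEY = "your-api-key"                   # placeholder per-user key

resp = requests.get(f"{GALAXY_URL}/api/workflows",
                    params={"key": API_KEY}, timeout=30)
resp.raise_for_status()
for workflow in resp.json():
    print(workflow.get("id"), workflow.get("name"))
```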

  6. ISO 9001 2000 : the quality management process

    CERN Document Server

    Tricker, Ray

    2006-01-01

    With the publication of ISO 9001:2000, there is now a single quality management 'requirements' standard that is applicable to all organisations, products and services. ISO 9001:2000 is the only standard that can be used for the certification of a QMS and its generic requirements can be used by any organisation. ISO 9001:2000 applies to all types of organisations. It is the quality standard which specifies the requirements of quality management systems for use where organisations need to demonstrate their capability to provide products and services which meet both customer needs and relevant reg

  7. Spanning organizational boundaries to manage creative processes:

    DEFF Research Database (Denmark)

    Andersen, Poul Houman; Kragh, Hanne; Lettl, Christopher

    2013-01-01

    In order to continue to be innovative in the current fast-paced and competitive environment, organizations are increasingly dependent on creative inputs developed outside their boundaries. The paper addresses the boundary spanning activities that managers undertake to a) select and mobilize...... creative talent, b) create shared identity, and c) combine and integrate knowledge in innovation projects involving external actors. We study boundary spanning activities in two creative projects in the LEGO group. One involves identifying and integrating deep, specialized knowledge, the other focuses...... actors, and how knowledge is integrated across organizational boundaries. We discuss implications of our findings for managers and researchers in a business-to-business context...

  8. [Case management process identified from experience of nurse case managers].

    Science.gov (United States)

    Park, Eun-Jun; Kim, Chunmi

    2008-12-01

    The purpose of this study was to develop a substantive theory of case management (CM) practice by investigating the experience of nurse case managers caring for Medical Aid enrollees in Korea. A total of 12 nurses were interviewed regarding their own experience in CM practice. Data were recorded and analyzed using grounded theory. Empowerment was the core category of CM for Medical Aid enrollees. The case managers engaged in five phases, as follows: inquiring in advance, building a relationship with the client, giving the client a critical mind, facilitating positive changes in the client's use of healthcare services, and maintaining relationship bonds. These phases moved gradually and were circular if necessary. Also, they were accelerated or slowed depending on factors including clients' characteristics, case managers' competency level, families' support level, and availability of community resources. This study helps understand what CM practice is and how nurses are performing this innovative CM role. It is recommended that nurse leaders and policy makers integrate empowerment as a core category and the five critical CM phases into future CM programs.

  9. ESG Integration and the Investment Management Process : Fundamental Investing Reinvented

    NARCIS (Netherlands)

    van Duuren, Emiel; Plantinga, Auke; Scholtens, Bert

    2016-01-01

    We investigate how conventional asset managers account for environmental, social and governance factors (ESG) in their investment process. We do so on the basis of an international survey among fund managers. We find that many conventional managers integrate responsible investing in their investment

  10. Contract Management Process Maturity: Empirical Analysis of Organizational Assessments

    Science.gov (United States)

    2009-08-27

    with the National Contract Management Association (NCMA), a Certified Purchasing Manager (CPM) with the Institute for Supply Management (ISM), and a...include advertising procurement opportunities, conducting industry and pre-proposal conferences, and amending solicitation documents as required. 4...with organizational core processes include advertising procurement opportunities, conducting solicitation and pre-proposal conferences, and amending

  11. Towards Business Process Management in networked ecosystems

    NARCIS (Netherlands)

    Johan Versendaal; dr. Martijn Zoet; Jeroen Grondelle

    2014-01-01

    Managing and supporting the collaboration between different actors is key in any organizational context, whether of a hierarchical or a networked nature. In the networked context of ecosystems of service providers and other stakeholders, BPM is faced with different challenges than in a conventional

  12. Knowledge management: processes and systems | Igbinovia ...

    African Journals Online (AJOL)

    Information Impact: Journal of Information and Knowledge Management, Vol 8, No 3 (2017).

  13. Management of the process of nuclear transport

    International Nuclear Information System (INIS)

    Requejo, P.

    2015-01-01

    Since 1996, ETSA has been the only Spanish logistics operator specialized in servicing the nuclear and radioactive industry. ETSA now has technological systems specifically designed for the management of nuclear transports. These tools are the result of the analysis of multiple factors involved in nuclear shipments, of ETSA's wide experience as a logistics operator, and of the search for continuous improvement. (Author)

  14. Health care management modelling: a process perspective

    NARCIS (Netherlands)

    Vissers, J.M.H.

    1998-01-01

    Modelling-based health care management ought to become just as popular as evidence based medicine. Making managerial decisions based on evidence by modelling efforts is certainly a step forward. Examples can be given of many successful applications in different areas of decision making: disease

  15. IDEA MANAGEMENT IN THE INNOVATION PROCESS

    Directory of Open Access Journals (Sweden)

    Cătălin George ALEXE

    2014-12-01

    The employees of a company often want to make themselves useful and to make life easier at work by providing potentially useful ideas aimed at eliminating problems or exploiting opportunities. Without the ability to obtain new ideas, an organization stagnates, declines and is eventually eliminated by competitors who have new ideas. To turn an idea into an innovative product, it is desirable that it correspond to the company's goals and be achievable with existing technology and resources in order to reduce investment. Thus the need arose for idea management, to bring order to the set of ideas and to create a transparent and effective way of attracting and managing them. Starting from a number of scientific approaches in the literature, this paper proposes to address idea management as a complex model and to identify the dedicated IT solutions that could help in moving through the various phases and sub-phases of such a model, which is particularly useful for the management of a company.

  16. A process-based approach to management of the enterprise

    Directory of Open Access Journals (Sweden)

    Ryzhakina Tatiana

    2016-01-01

    Establishing an efficient management system is an especially pressing issue for the machinery industry as a basic sector of a country's economy. The present paper considers establishing a management system oriented towards increasing enterprise value and customer satisfaction through the integration of process-based management and a Balanced Scorecard. An integrated management system enables structuring of organizational processes and redesigning them in response to external changes, as well as applying a balanced scorecard that connects functional units by defining strategic objectives and measurable indicators that detail and control these objectives, thus increasing the efficiency of processes and orienting the organization towards the customer.

  17. Accumulating Project Management Knowledge Using Process Theory

    NARCIS (Netherlands)

    Niederman, Fred; March, Salvatore T.; Mueller, Benjamin

    2016-01-01

    Process theory has become an important mechanism for the accumulation of knowledge in a number of disciplines. In contrast with variance theory, which focuses on co-variation of dependent and independent variables, process theory focuses on sequences of activities, their duration and the intervals

  18. Processes of Strategic Renewal, Competencies, and the Management of Speed

    OpenAIRE

    Volker Mahnke; John Harald Aadne

    1998-01-01

    We discuss strategic renewal from a competence perspective. We argue that the management of speed and timing in this process is viewed distinctively when perceived through a cognitive lens. Managers need more firmly grounded process-understanding. The key idea of this paper is to dynamically conceptualize key activities of strategic renewal, and possible sources of breakdown as they relate to the management of speed and timing. Based on a case from the media industry, we identify managerial t...

  19. Process management in healthcare. Sant Camil Hospital case study

    OpenAIRE

    Sánchez Ruiz, Lidia; Blanco Rojo, Beatriz; Simón, Rosa María

    2013-01-01

    Nowadays, due to the crisis, some government measures are aimed at reducing healthcare spending, affecting at some level or another the quality offered. Process management is said to be a useful tool for reducing healthcare costs by improving management without any additional economic investment; that is, doing more with the same resources and without reducing the quality offered. In this study an empirical case of a Catalan hospital is presented. Overall, the usefulness of process management i...

  20. Information systems for material flow management in construction processes

    Science.gov (United States)

    Mesároš, P.; Mandičák, T.

    2015-01-01

    The article describes the options for the management of material flows in the construction process. Management and resource planning is one of the key factors influencing the effectiveness of a construction project, and it is very difficult to set these flows correctly. The current period offers several options and tools to do this; information systems and their modules can be used precisely for the management of materials in the construction process.

  1. The management process, management information and control systems, and cybernetics

    Science.gov (United States)

    Zannetos, Z. S.; Wilcox, J. W.

    1972-01-01

    An attempt has been made to analyze the strengths and weaknesses of the cybernetics approach as applied to management. The conclusion is that cybernetics can serve not only as a conceptual philosophical aid, but also as an operational tool in both managerial planning and control. So far, however, most of its promise is as yet unrealized, especially in the planning sphere. Only in the area of control of operations has the impact of this promising field shown tangible results.

  2. p3d--Python module for structural bioinformatics.

    Science.gov (United States)

    Fufezan, Christian; Specht, Michael

    2009-08-21

    High-throughput bioinformatic analysis tools are needed to mine the large amount of structural data via knowledge-based approaches. The development of such tools requires a robust interface to access the structural data in an easy way. For this the Python scripting language is the optimal choice, since its philosophy is to write understandable source code. p3d is an object-oriented Python module that adds a simple yet powerful interface to the Python interpreter to process and analyse three-dimensional protein structure files (PDB files). p3d's strength arises from the combination of a) very fast spatial access to the structural data due to the implementation of a binary space partitioning (BSP) tree, b) set theory and c) functions that allow a) and b) to be combined and that use human-readable language in the search queries rather than complex computer language. All these factors combined facilitate the rapid development of bioinformatic tools that can perform quick and complex analyses of protein structures. p3d is the perfect tool to quickly develop tools for structural bioinformatics using the Python scripting language.

  3. p3d – Python module for structural bioinformatics

    Directory of Open Access Journals (Sweden)

    Fufezan Christian

    2009-08-01

    Background: High-throughput bioinformatic analysis tools are needed to mine the large amount of structural data via knowledge-based approaches. The development of such tools requires a robust interface to access the structural data in an easy way. For this the Python scripting language is the optimal choice, since its philosophy is to write understandable source code. Results: p3d is an object-oriented Python module that adds a simple yet powerful interface to the Python interpreter to process and analyse three-dimensional protein structure files (PDB files). p3d's strength arises from the combination of a) very fast spatial access to the structural data due to the implementation of a binary space partitioning (BSP) tree, b) set theory and c) functions that allow a) and b) to be combined and that use human-readable language in the search queries rather than complex computer language. All these factors combined facilitate the rapid development of bioinformatic tools that can perform quick and complex analyses of protein structures. Conclusion: p3d is the perfect tool to quickly develop tools for structural bioinformatics using the Python scripting language.
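    Both p3d records above emphasize fast spatial queries over PDB atoms combined with set operations. The sketch below is not p3d's own API (which the abstracts do not spell out); it illustrates the underlying idea with a plain-Python parse of ATOM records and a brute-force distance query, the kind of search that p3d accelerates with its BSP tree. The file name and query point are placeholders.

```python
# Illustrative sketch (not p3d's own API): parse ATOM/HETATM records from a
# PDB file and find atoms within a cutoff distance of a reference point.
# p3d accelerates this kind of query with a binary space partitioning tree;
# a brute-force scan is used here to keep the idea visible. The file name
# and query point are placeholders. Requires Python 3.8+ for math.dist.
import math

def parse_atoms(pdb_path):
    """Yield (atom_name, residue_name, x, y, z) from ATOM/HETATM records."""
    with open(pdb_path) as handle:
        for line in handle:
            if line.startswith(("ATOM", "HETATM")):
                yield (line[12:16].strip(), line[17:20].strip(),
                       float(line[30:38]), float(line[38:46]), float(line[46:54]))

def atoms_within(atoms, center, cutoff):
    """Brute-force spatial query: atoms within `cutoff` angstroms of `center`."""
    for name, res, x, y, z in atoms:
        if math.dist((x, y, z), center) <= cutoff:
            yield name, res

atoms = list(parse_atoms("example.pdb"))                  # placeholder file
print(list(atoms_within(atoms, (10.0, 12.5, 8.3), 5.0)))  # placeholder query
```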

  4. [Relationship Between Members Satisfaction with Service Club Management Processes and Perception of Club Management System.

    Science.gov (United States)

    Dawson, Frances Trigg

    A study was made to determine the relationships between (1) satisfaction of members with service club management processes and member's perception of management systems, (2) perception of service club management system to selected independent variables, and (3) satisfaction to perception of service club management systems with independent…

  5. Risk management of new product development process

    OpenAIRE

    Aleixo, Gonçalo Granja

    2009-01-01

    Dissertation submitted to Faculdade de Ciências e Tecnologia of Universidade Nova de Lisboa for the Integrated Master's degree in Industrial Management Engineering. Winners in today's changing global environment are those who continuously pursue innovations in order to guarantee their sustainability. If many enterprises make enormous mistakes in a certain environment, then in an uncertain environment such as the development of innovations these mistakes will be mult...

  6. Customer Relationship Management : Strategy Process Evaluation

    OpenAIRE

    Din, Fazal

    2014-01-01

    Before the advent of globalization, the mantra "Customer is King" was preached by large organizations. With the development of technology and the business environment, multinationals better understand the importance of customer retention and customer loyalty. Customer loyalty, attained through sustainable customer relationships, is the focus of most companies around the world. For any large organization customer relationship management, under the umbrella of business vision, mission and o...

  7. Improvement of radiology services based on the process management approach

    International Nuclear Information System (INIS)

    Amaral, Creusa Sayuri Tahara; Rozenfeld, Henrique; Costa, Janaina Mascarenhas Hornos; Magon, Maria de Fatima de Andrade; Mascarenhas, Yvone Maria

    2011-01-01

    The health sector requires continuous investments to ensure the improvement of products and services from a technological standpoint, the use of new materials, equipment and tools, and the application of process management methods. Methods associated with the process management approach, such as the development of reference models of business processes, can provide significant innovations in the health sector and respond to the current market trend for modern management in this sector (Gunderman et al., 2008). This article proposes a process model for diagnostic medical X-ray imaging, from which it derives a primary reference model and describes how this information leads to gains in quality and improvements.

  8. BUSINESS PROCESS MODELLING: A FOUNDATION FOR KNOWLEDGE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Vesna Bosilj-Vukšić

    2006-12-01

    Knowledge management (KM) is increasingly recognised as a strategic practice of knowledge-intensive companies, becoming an integral part of an organisation's strategy to improve business performance. This paper provides an overview of business process modelling applications and analyses the relationship between business process modelling and knowledge management projects. It presents a case study of leading Croatian banks and an insurance company, discussing their practical experience in conducting business process modelling projects and investigating the opportunity for integrating a business process repository and organisational knowledge as the foundation for knowledge management system development.

  9. Process innovation in tourism management: A review of the literature

    Directory of Open Access Journals (Sweden)

    Irma Elia Damian

    2015-06-01

    Purpose: Identifying gaps in the academic literature regarding three major management topics, Process Innovation, Tourism Management and Process Innovation in Tourism Management, in order to establish the conceptual framework and identify future lines of research. Design/methodology: For this research, a systematic review of the literature on Process Innovation, Tourism Management and Process Innovation in Tourism Management was conducted on items obtained from recognized databases such as EBSCO and OECD, among others. Findings: As a result of this review an opportunity for academic research was identified, due to a theoretical gap which exists on the subject of Process Innovation in Tourism Management. Research limitations/implications: For this research the databases used were EBSCO, Emerald, OECD, ProQuest and Scientific Research; the authors are aware that there may be other papers on the subject that were not considered in this article. Practical implications: This research opens the road to investigating other issues, such as what impact Process Innovations in Tourism Management have and how they are carried out in organizations belonging to the sector. Originality/value: Knowledge of how process innovation in tourism management occurs in hospitality organizations gives a better understanding of this subject, opening the possibility for other organizations to adopt and adapt these innovations in areas related to customer satisfaction, improvement of the organization's image, and the quality of the service offered.

  10. Management of Talent Development Process in Sport

    OpenAIRE

    SEVİMLİ, Dilek

    2015-01-01

    In the development of elite athletes, talent identification and education is a complex and multidimensional process. It is difficult to predict future performance given the increasing technical, tactical, conditioning and psychological demands of a sport. Factors such as children's developmental stages and levels, gender, athlete development programs, social support, the quality of coaches, and access to equipment and facilities can affect the talent development process. Phases of ...

  11. Development of advanced spent fuel management process

    International Nuclear Information System (INIS)

    Shin, Young Joon; Cho, S. H.; You, G. S.

    2001-04-01

    Currently, the economic advantage of any known approach to the back-end fuel cycle of a nuclear power reactor has not been well established. Thus the safe long-term storage of spent fuel is one of the important issues to be resolved in countries where nuclear power carries a relatively heavy weight in power production. At KAERI, as a solution to this particular issue of midterm storage of spent fuel, an alternative approach has been developed. This approach includes the decladding and pulverization of the spent PWR fuel rod, the reduction of the uranium oxide to a metallic uranium powder using Li metal in a LiCl salt, the continuous casting of the reduced metal, and the recovery of Li from mixed salts by electrolysis. We conducted laboratory-scale tests of each process to assess technical feasibility and determine the operational conditions for this approach. We also performed a theoretical safety analysis and conducted integral tests of the equipment integration in a mock-up facility with non-radioactive samples. There were no major issues in the approach; however, material incompatibility between the alkali metal and oxide in a salt at high temperature and the reactor that contains the salt became a show-stopper for the process. The difficulty of cleanly separating the salt from the metal reduced from the oxide also became a major issue.

  12. Understanding and Managing Process Interaction in IS Development Projects

    DEFF Research Database (Denmark)

    Bygstad, Bendik; Nielsen, Peter Axel

    2005-01-01

    Increasingly, information systems must be developed and implemented as a part of business change. This is a challenge for the IS project manager, since business change and information systems development usually are performed as separate processes. Thus, there is a need to understand and manage......-technical innovation in a situation where the organisational change process and the IS development process are parallel but incongruent. We also argue that iterative software engineering frameworks are well structured to support process interaction. Finally, we advocate that the IS project manager needs to manage...... the relationship between these two kinds of processes. To understand the interaction between information systems development and planned organisational change we introduce the concept of process interaction. We draw on a longitudinal case study of an IS development project that used an iterative and incremental...

  13. A big picture prospective for wet waste processing management

    International Nuclear Information System (INIS)

    Gibson, J.D.

    1996-01-01

    This paper provides an overview of general observations made relative to the technical and economic considerations being evaluated by many commercial nuclear power plants in their decision-making process for implementation of several new wet waste management technologies. The waste management processes reviewed include the use of Reverse Osmosis, Non-Precoat Filters, Resin Stripping & Recycling, Evaporation & Calcination (RVR™, ROVER™ & Thermax™), Compression Dewatering (PressPak™), Incineration (Resin Express™), Survey & Free Release (Green Is Clean) and Quantum Catalytic Extraction Processing (QCEP™). These waste management processes are reviewed relative to their general advantages and disadvantages associated with the processing of various wet waste streams including: reactor make-up water, floor drain sludges and other liquid waste streams such as boric acid concentrates and steam generator cleaning solutions. A summary of the conclusions generally being derived by most utilities associated with the use of these waste management processes is also provided.

  14. Process Architecture for Managing Digital Object Identifiers

    Science.gov (United States)

    Wanchoo, L.; James, N.; Stolte, E.

    2014-12-01

    In 2010, NASA's Earth Science Data and Information System (ESDIS) Project implemented a process for registering Digital Object Identifiers (DOIs) for data products distributed by the Earth Observing System Data and Information System (EOSDIS). For the first 3 years, ESDIS evolved the process, involving the data provider community in the development of processes for creating and assigning DOIs, and of guidelines for the landing page. To accomplish this, ESDIS established two DOI User Working Groups: one for reviewing the DOI process, whose recommendations were submitted to ESDIS in February 2014; and the other recently tasked to review and further develop DOI landing page guidelines for ESDIS approval by end of 2014. ESDIS has recently upgraded the DOI system from a manually-driven system to one that largely automates the DOI process. The new automated features include: a) reviewing the DOI metadata, b) assigning an opaque DOI name if the data provider chooses, and c) reserving, registering, and updating the DOIs. The flexibility of reserving the DOI allows data providers to embed and test the DOI in the data product metadata before formally registering with EZID. The DOI update process allows the changing of any DOI metadata except the DOI name unless the name has not been registered. Currently, ESDIS has processed a total of 557 DOIs, of which 379 DOIs are registered with EZID and 178 are reserved with ESDIS. The DOI incorporates several metadata elements that effectively identify the data product and the source of availability. Of these elements, the Uniform Resource Locator (URL) attribute has the very important function of identifying the landing page which describes the data product. ESDIS, in consultation with data providers in the Earth Science community, is currently developing landing page guidelines that specify the key data product descriptive elements to be included on each data product's landing page. This poster will describe in detail the unique automated process and
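    The record describes a reserve, register, and update life cycle for DOIs, including the rule that a DOI name can no longer be changed once registered. The sketch below models that life cycle as a small Python class; the states, field names, and validation are assumptions made for illustration and are not taken from the ESDIS or EZID systems.

```python
# Hedged sketch of the reserve -> register -> update life cycle the record
# describes. States, field names and validation are illustrative assumptions,
# not taken from the ESDIS or EZID systems.
from dataclasses import dataclass, field

@dataclass
class DoiRecord:
    name: str                    # e.g. "10.5067/EXAMPLE" (placeholder)
    landing_page: str
    metadata: dict = field(default_factory=dict)
    state: str = "reserved"      # reserved -> registered

    def register(self):
        """Promote a reserved DOI once a landing page URL is in place."""
        if not self.landing_page.startswith("http"):
            raise ValueError("a landing page URL is required before registration")
        self.state = "registered"

    def update(self, **changes):
        """Update metadata; the DOI name is frozen once registered."""
        if "name" in changes:
            if self.state == "registered":
                raise ValueError("a registered DOI name cannot be changed")
            self.name = changes.pop("name")
        self.metadata.update(changes)

doi = DoiRecord("10.5067/EXAMPLE", "https://example.org/dataset")  # placeholders
doi.register()
doi.update(title="Example data product")
print(doi.state, doi.metadata)
```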

  15. Managing the exploration process: conference papers

    International Nuclear Information System (INIS)

    1999-01-01

    The conference includes eight articles on the theme of the meeting including: I - creating an environment that fosters exploration and development ideas; II - integrating a global perspective when setting objectives for exploration planning; III - practical issues in setting exploration objectives; IV - portfolio analysis of exploration prospect of ideas; V - the effective presentation of exploration prospects; VI - the future of information management; VII - assessing exploration assets; and VIII - environmental and regulatory considerations when planning an exploratory well. Individual articles indexed/abstracted separately include: articles I, II, III, VII, and VIII.

  16. Crew Management Processes Revitalize Patient Care

    Science.gov (United States)

    2009-01-01

    In 2005, two physicians, former NASA astronauts, created LifeWings Partners LLC in Memphis, Tennessee and began using Crew Resource Management (CRM) techniques developed at Ames Research Center in the 1970s to help improve safety and efficiency at hospitals. According to the company, when hospitals follow LifeWings' training, they can see major improvements in a number of areas, including efficiency, employee satisfaction, operating room turnaround, patient advocacy, and overall patient outcomes. LifeWings has brought its CRM training to over 90 health care organizations and annual sales have remained close to $3 million since 2007.

  17. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud.

    Directory of Open Access Journals (Sweden)

    Enis Afgan

    Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. We discuss scope, design considerations and technical and logistical constraints

  18. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud.

    Science.gov (United States)

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. We discuss scope, design considerations and technical and logistical constraints, and explore the
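    The GVL abstracts above describe building arbitrarily sized compute clusters on demand from pre-configured machine images. As a generic illustration of that idea (and not the GVL's own cloud-management layer), the sketch below starts a single node from a machine image on AWS with boto3; the region, AMI ID, instance type, and key pair name are placeholders.

```python
# Hedged sketch: starting one compute node from a pre-built machine image on
# AWS with boto3, as a generic illustration of on-demand cluster building.
# The region, AMI ID, instance type and key pair are placeholders, and valid
# AWS credentials must already be configured; the GVL itself uses its own
# cloud-management layer rather than this exact call.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder machine image
    InstanceType="m5.large",          # placeholder instance type
    MinCount=1,
    MaxCount=1,
    KeyName="my-keypair",             # placeholder key pair
)
print(response["Instances"][0]["InstanceId"])
```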

  19. An integration architecture for knowledge management system and business process management system

    NARCIS (Netherlands)

    Jung, J.; Choi, I.; Song, M.S.

    2007-01-01

    Recently, interest in the notion of process-oriented knowledge management (PKM) from academia and industry has increased significantly. Comprehensive research and development requirements along with a cogent framework, however, have not been proposed for integrating knowledge management (KM

  20. Knowledge Integrated Business Process Management for Third Party Logistics Companies

    OpenAIRE

    Zhang, Hongyan

    2013-01-01

    The growing importance of logistics as well as the increasing dynamic complexity of markets, technologies, and customer needs has brought great challenges to logistics. In order to focus on their core competency in such a competitive environment, more and more companies have outsourced a part or the entirety of the logistics process to third party logistics (3PL) service providers. 3PL has played a crucial role in managing logistics processes within supply chain management. Logistics processe...

  1. Process innovations in the management of radioactive wastes

    International Nuclear Information System (INIS)

    Theyyunni, T.K.

    1995-01-01

    Innovative processes and techniques were investigated for their possible application in the management of low, intermediate and high-level radioactive wastes. High decontamination, high volume reduction, process simplicity and operational safety are some of the objectives of these investigations. Based on the favourable results, it is hoped that many of these process innovations can be introduced in the waste management schemes with beneficial results. (author)

  2. Assessing computational genomics skills: Our experience in the H3ABioNet African bioinformatics network.

    Directory of Open Access Journals (Sweden)

    C Victor Jongeneel

    2017-06-01

    The H3ABioNet pan-African bioinformatics network, which is funded to support the Human Heredity and Health in Africa (H3Africa) program, has developed node-assessment exercises to gauge the ability of its participating research and service groups to analyze typical genome-wide datasets being generated by H3Africa research groups. We describe a framework for the assessment of computational genomics analysis skills, which includes standard operating procedures, training and test datasets, and a process for administering the exercise. We present the experiences of 3 research groups that have taken the exercise and the impact on their ability to manage complex projects. Finally, we discuss the reasons why many H3ABioNet nodes have declined so far to participate and potential strategies to encourage them to do so.

  3. Assessing computational genomics skills: Our experience in the H3ABioNet African bioinformatics network.

    Science.gov (United States)

    Jongeneel, C Victor; Achinike-Oduaran, Ovokeraye; Adebiyi, Ezekiel; Adebiyi, Marion; Adeyemi, Seun; Akanle, Bola; Aron, Shaun; Ashano, Efejiro; Bendou, Hocine; Botha, Gerrit; Chimusa, Emile; Choudhury, Ananyo; Donthu, Ravikiran; Drnevich, Jenny; Falola, Oluwadamila; Fields, Christopher J; Hazelhurst, Scott; Hendry, Liesl; Isewon, Itunuoluwa; Khetani, Radhika S; Kumuthini, Judit; Kimuda, Magambo Phillip; Magosi, Lerato; Mainzer, Liudmila Sergeevna; Maslamoney, Suresh; Mbiyavanga, Mamana; Meintjes, Ayton; Mugutso, Danny; Mpangase, Phelelani; Munthali, Richard; Nembaware, Victoria; Ndhlovu, Andrew; Odia, Trust; Okafor, Adaobi; Oladipo, Olaleye; Panji, Sumir; Pillay, Venesa; Rendon, Gloria; Sengupta, Dhriti; Mulder, Nicola

    2017-06-01

    The H3ABioNet pan-African bioinformatics network, which is funded to support the Human Heredity and Health in Africa (H3Africa) program, has developed node-assessment exercises to gauge the ability of its participating research and service groups to analyze typical genome-wide datasets being generated by H3Africa research groups. We describe a framework for the assessment of computational genomics analysis skills, which includes standard operating procedures, training and test datasets, and a process for administering the exercise. We present the experiences of 3 research groups that have taken the exercise and the impact on their ability to manage complex projects. Finally, we discuss the reasons why many H3ABioNet nodes have declined so far to participate and potential strategies to encourage them to do so.

  4. Knowledge Management Enablers and Process in Hospital Organizations.

    Science.gov (United States)

    Lee, Hyun-Sook

    2017-02-01

    This research aimed to investigate the effects of knowledge management enablers, such as organizational structure, leadership, learning, information technology systems, trust, and collaboration, on the knowledge management process of creation, storage, sharing, and application. Using data from self-administered questionnaires in four Korean tertiary hospitals, this survey investigated the main organizational factors affecting the knowledge management process in these organizations. A total of 779 questionnaires were analyzed using SPSS 18.0 and AMOS 18.0. The results showed that organizational factors affect the knowledge management process differently in each hospital organization. From a managerial perspective, the implications of these factors for developing organizational strategies that encourage and foster the knowledge management process are discussed.

  5. A ten-step process to develop case management plans.

    Science.gov (United States)

    Tahan, Hussein A

    2002-01-01

    The use of case management plans has successfully contained cost and improved quality of care. However, the process of developing these plans remains a great challenge for healthcare executives. In this article, the author presents the answer to this challenge by discussing a 10-step formal process that administrators of patient care services and case managers can adapt to their institutions. It also can be used by interdisciplinary team members as a practical guide to develop a specific case management plan. This process is applicable to any care setting (acute, ambulatory, long term, and home care), diagnosis, or procedure. It is particularly important for those organizations that currently do not have a deliberate and systematic process to develop case management plans and are struggling with how to improve the efficiency and productivity of interdisciplinary teams charged with developing case management plans.

  6. Management and organizational indicators of process safety

    International Nuclear Information System (INIS)

    Van Hemel, S.B.; Connelly, E.M.; Haas, P.M.

    1991-01-01

    This study is part of a Nuclear Regulatory Commission research element on organizational factors in plant safety under the Human Factors research program. This paper reports that the study investigated the chemical industry, to find leading management or organizational tools which could be useful for the NRC. After collecting information from a variety of information sources, the authors concentrated their study on two types of indicators currently in use: the first is audit- or review-based, and concentrates on programmatic factors; the second, based on frequent behavioral observations, concentrates on the management of individual worker behaviors. The authors analyzed data on the relationships between the leading indicators and direct indicators such as accident and injury rates in three case studies, to determine whether sufficient evidence of validity and utility exists to justify consideration of these indicators as public safety indicators. This paper states that on the basis of statistical associations and other evidence, the authors concluded that the two indicator types have promise for use as plant safety performance indicators, and that further development and testing of the candidate indicators should be performed.

  7. REGIONALIZATION OF MANAGEMENT PROCESS BY INNOVATIVE ACTIVITY

    Directory of Open Access Journals (Sweden)

    E. V. Sibirskaia

    2014-01-01

    Full Text Available Summary. Under current market conditions, and with Russia's accession to international trade, scholars and experts from various fields of knowledge are paying particular attention to a large set of regional problems. The growing role of regional research reflects the need for effective mechanisms to realise the economic interests of regional actors, since economic development and improving the quality of human life are priority objectives of federal, regional and local authorities. Russian economic science today faces a global goal: to develop ways and means of transforming the Russian economy and setting it on a path of sustainable, innovative development that provides a new quality of life. Achieving this goal must be a central task of Russian economics and politics, both in the near future and in the long term. In the article the authors describe the determinants of the innovative development of a territory, mediated by the strengthening regionalization of the management of innovative activity: the state of resource and innovative potential; the established forms and nature of interaction between regional public authorities, the local community and business; the forms of integration applied by economic entities to realise their innovative potential by expanding opportunities to participate in promising directions of scientific, technical, economic and social development; the system of incentives, shaped by the external institutional environment, that creates favourable conditions for introducing and developing innovative technologies and for increasing entrepreneurial activity; and regional economic policy as an instrument for increasing the efficiency of innovative activity.

  8. Ecological Processes and Contemporary Coral Reef Management

    Directory of Open Access Journals (Sweden)

    Angela Dikou

    2010-05-01

    Full Text Available Top-down controls of complex foodwebs maintain the balance among the critical groups of corals, algae, and herbivores, thus allowing the persistence of coral reefs as three-dimensional, biogenic structures with high biodiversity, heterogeneity, resistance, resilience and connectivity, and the delivery of essential goods and services to societies. On contemporary reefs world-wide, however, top-down controls have been weakened due to reduction in herbivory levels (overfishing or disease outbreak) while bottom-up controls have increased due to water quality degradation (increase in sediment and nutrient load) and climate forcing (seawater warming and acidification), leading to algal-dominated alternate benthic states of coral reefs, which are indicative of a trajectory towards ecological extinction. Management to reverse common trajectories of degradation for coral reefs necessitates a shift from optimization in marine resource use and conservation towards building socio-economic resilience into coral reef systems while attending to the most manageable human impacts (fishing and water quality) and the global-scale causes (climate change).

  9. Risk Management in Public Procurement Process

    Directory of Open Access Journals (Sweden)

    Ioana Manea

    2010-12-01

    Full Text Available Public procurement represents an important part of current economic reality. Throughout the procurement process, due to the effect of the interaction among the components of the public procurement system, certain actions with significant negative effects on its optimal operation may occur. Risks may turn into certainty either because of a simple error in the development and administration of the procurement process, or because of a deliberate deviation from the existing legal provisions. Therefore, it is imperative to implement risk-avoiding measures, as well as measures aiming to reduce their negative effects in case of their occurrence.

  10. Process simulation and parametric modeling for strategic project management

    CERN Document Server

    Morales, Peter J

    2013-01-01

    Process Simulation and Parametric Modeling for Strategic Project Management will offer CIOs, CTOs, software development managers and IT graduate students an introduction to a set of technologies that will help them understand how to better plan software development projects, manage risk and have better insight into the complexities of the software development process. A novel methodology will be introduced that allows a software development manager to better plan and assess risks in the early planning of a project. By providing a better model for early software development estimation and softw

  11. Economically oriented process optimization in waste management.

    Science.gov (United States)

    Maroušek, Josef

    2014-06-01

    A brief report on the development of a novel apparatus is presented. It was verified at commercial scale that a new concept of anaerobic fermentation followed by continuous pyrolysis is technically and economically feasible for managing previously enzymatically hydrolyzed waste haylage in huge volumes. The design of the concept is thoroughly described, documented in figures, and biochemically analyzed in detail. Assessment of the concept shows that subsequent pyrolysis of the anaerobically fermented residue makes it possible to produce high-quality biochar in addition to biogas. This significantly improves the overall economics. In addition, it may be assumed that this applied research is consistent with previous theoretical assumptions stating that any kind of aerobic or anaerobic fermentation increases the microporosity of the biochar obtained.

  12. Emerging role of bioinformatics tools and software in evolution of clinical research

    Directory of Open Access Journals (Sweden)

    Supreet Kaur Gill

    2016-01-01

    Full Text Available Clinical research works to promote the health and wellbeing of the population. The number and severity of diseases such as cancer, hepatitis and HIV are rising rapidly, resulting in high morbidity and mortality. Clinical research encompasses drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process starting with target identification, validation and lead optimization, followed by preclinical studies, intensive clinical trials and, eventually, post-marketing vigilance for drug safety. Software and bioinformatics tools play a major role not only in drug discovery but also in drug development. They support the use of informatics to develop new knowledge about health and disease, to manage data during clinical trials, and to use clinical data for secondary research. In addition, technologies such as molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationship modelling make the drug discovery process faster and easier. During preclinical studies, software is used for randomization to remove bias and to plan the study design. In clinical trials, tools such as electronic data capture, remote data capture and the electronic case report form (eCRF) are used to store the data; systems such as eClinical and Oracle Clinical are used for clinical data management and for statistical analysis of the data. After a drug is marketed, its safety can be monitored with drug safety software such as Oracle Argus or ARISg. Software is therefore used from the earliest stages of drug design, through drug development and clinical trials, and during pharmacovigilance. This review describes different aspects of the application of computers and bioinformatics in drug design, discovery and development, formulation design and clinical research.

  13. Conjoint Management of Business Processes and Information Technologies

    DEFF Research Database (Denmark)

    Siurdyban, Artur

    and improve business processes. As a consequence, there is a growing need to address managerial aspects of the relationships between information technologies and business processes. The aim of this PhD study is to investigate how the practice of conjoint management of business processes and information...... technologies can be supported and improved. The study is organized into five research papers and this summary. Each paper addresses a different aspect of conjoint management of business processes and information technologies, i.e. problem development and managerial practices on software...... and information technologies in a project environment. It states that both elements are intrinsically related and should be designed and considered together. The second case examines the relationships between information technology management and business process management. It discusses the multi-faceted role...

  14. The process of Risk management for E-business

    Directory of Open Access Journals (Sweden)

    Erion Lekaj

    2017-07-01

    Full Text Available In the new Internet economy, risk management plays a critical role in protecting the organization and its ability to perform its business mission, not just its IT assets. Risk management is the process of identifying risk, assessing risk, and taking steps to reduce risk to an acceptable level. Risk management is an important component of an IT security program. Information and communications technology management and IT security are responsible for ensuring that technology risks are managed appropriately. These risks originate from the deployment and use of IT assets in various ways, such as configuring systems incorrectly or gaining access to restricted software.

  15. Interface management: Effective communication to improve process safety

    International Nuclear Information System (INIS)

    Kelly, Brian; Berger, Scott

    2006-01-01

    Failure to successfully communicate maintenance activities, abnormal conditions, emergency response procedures, process hazards, and hundreds of other items of critical information can lead to disaster, regardless of the thoroughness of the process safety management system. Therefore, a well-functioning process safety program depends on maintaining successful communication interfaces between each involved employee or stakeholder and the many other employees or stakeholders that person must interact with. The authors discuss a process to identify the critical 'Interfaces' between the many participants in a process safety management system, and then to establish a protocol for each critical interface

  16. Elements of knowledge management in the improvement of business processes

    OpenAIRE

    Brajer-Marczak Renata

    2016-01-01

    The key role in process management is played by the systematic analysis, measurement and improvement of processes. The imperative of continuous introduction of changes in processes is the answer to the changing conditions of competition and the great dynamics in the expectations and preferences of customers. Information related to business process should be collected and formalized in order to improve the execution of processes. In connection with the above, it may be stated that the improvem...

  17. Chemical process safety management within the Department of Energy

    International Nuclear Information System (INIS)

    Piatt, J.A.

    1995-07-01

    Although the Department of Energy (DOE) is not well known for its chemical processing activities, the DOE does have a variety of chemical processes covered under OSHA's Rule for Process Safety Management of Highly Hazardous Chemicals (the PSM Standard). DOE, like industry, is obligated to comply with the PSM Standard. The shift in the mission of DOE away from defense programs toward environmental restoration and waste management has affected these newly forming process safety management programs within DOE. This paper describes the progress made in implementing effective process safety management programs required by the PSM Standard and discusses some of the trends that have supported efforts to reduce chemical process risks within the DOE. In June of 1994, a survey of chemicals exceeding OSHA PSM or EPA Risk Management Program threshold quantities (TQs) at DOE sites found that there were 22 processes that utilized toxic or reactive chemicals over TQs; there were 13 processes involving flammable gases and liquids over TQs; and explosives manufacturing occurred at 4 sites. Examination of the survey results showed that 12 of the 22 processes involving toxic chemicals involved the use of chlorine for water treatment systems. The processes involving flammable gases and liquids were located at the Strategic Petroleum Reserve and Naval Petroleum Reserve sites.

  18. Processes of international collaboration in management research

    DEFF Research Database (Denmark)

    Jonsen, Karsten; Butler, Christina; Mäkelä, Kristiina

    2013-01-01

    Scientists and academics increasingly work on collaborative projects and write papers in international research teams. This trend is driven by greater publishing demands in terms of the quality and breadth of data and analysis methods, which tend to be difficult to achieve without collaborating...... across institutional and national boundaries. Yet, our understanding of the collaborative processes in an academic setting and the potential tensions associated with them remains limited. We use a reflexive, autoethnographic approach to explicitly investigate our own experiences of international...... collaborative research. We offer systematic insights into the social and intellectual processes of academic collaborative writing, identifying six lessons and two key tensions that influence the success of international research teams. Our findings may benefit the formation of future coauthor teams...

  19. Managing Cultural Variation in Software Process Improvement

    DEFF Research Database (Denmark)

    Müller, Sune Dueholm; Kræmmergaard, Pernille; Mathiassen, Lars

    The scale and complexity of change in software process improvement (SPI) are considerable and managerial attention to organizational culture during SPI can therefore potentially contribute to successful outcomes. However, we know little about the impact of variations in organizational subculture...... CMMI level 2 as planned, ASY struggled to implement even modest improvements. To explain these differences, we analyzed the underlying organizational culture within ISY and ASY using two different methods for subculture assessment. The study demonstrates how variations in culture across software...

  20. A combined disease management and process modeling approach for assessing and improving care processes: a fall management case-study.

    Science.gov (United States)

    Askari, Marjan; Westerhof, Richard; Eslami, Saied; Medlock, Stephanie; de Rooij, Sophia E; Abu-Hanna, Ameen

    2013-10-01

    To propose a combined disease management and process modeling approach for evaluating and improving care processes, and demonstrate its usability and usefulness in a real-world fall management case study. We identified essential disease management related concepts and mapped them into explicit questions meant to expose areas for improvement in the respective care processes. We applied the disease management oriented questions to a process model of a comprehensive real world fall prevention and treatment program covering primary and secondary care. We relied on interviews and observations to complete the process models, which were captured in UML activity diagrams. A preliminary evaluation of the usability of our approach by gauging the experience of the modeler and an external validator was conducted, and the usefulness of the method was evaluated by gathering feedback from stakeholders at an invitational conference of 75 attendees. The process model of the fall management program was organized around the clinical tasks of case finding, risk profiling, decision making, coordination and interventions. Applying the disease management questions to the process models exposed weaknesses in the process including: absence of program ownership, under-detection of falls in primary care, and lack of efficient communication among stakeholders due to missing awareness about other stakeholders' workflow. The modelers experienced the approach as usable and the attendees of the invitational conference found the analysis results to be valid. The proposed disease management view of process modeling was usable and useful for systematically identifying areas of improvement in a fall management program. Although specifically applied to fall management, we believe our case study is characteristic of various disease management settings, suggesting the wider applicability of the approach. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  1. When cognitive biases lead to business process management issues

    NARCIS (Netherlands)

    Razavian, M.; Turetken, O.; Vanderfeesten, I.T.P.; Dumas, M.; Fantinato, M.

    2017-01-01

    There is a broad consensus that design decision making is important for Business Process Management success. Despite many business process design approaches and practices that are available, the quality of business process analysis and design relies heavily on human factors. Some of these factors

  2. Sustaining Operational Resiliency: A Process Improvement Approach to Security Management

    National Research Council Canada - National Science Library

    Caralli, Richard A

    2006-01-01

    .... Coordinating these efforts to sustain operational resiliency requires a process-oriented approach that can be defined, measured, and actively managed. This report describes the fundamental elements and benefits of a process approach to security and operational resiliency and provides a notional view of a framework for process improvement.

  3. Management by process based systems and safety focus

    International Nuclear Information System (INIS)

    Rydnert, Bo; Groenlund, Bjoern

    2005-12-01

    An initiative from the Swedish Nuclear Power Inspectorate led to this study, carried out in the late autumn of 2005. The objective was to understand in more detail how an increasing use of process management affects organisations, on the one hand regarding risks and security, on the other hand regarding management by objectives and other management and operative effects. The main method was interviewing representatives of companies and independent experts; more than 20 interviews were carried out, and in addition a literature study was made. All participating companies use management systems based on processes. However, the methods chosen, and the results achieved, vary extensively, and there are surprisingly few examples of complete and effective management by processes. Yet there is no doubt that management by processes is effective and efficient: overall goals are reached, business results are achieved in more reliable ways and customers are more satisfied. The weaknesses found can be translated into a few comprehensive recommendations. A clear, structured and acknowledged model should be used and the processes should be described unambiguously. The changed management roles should be described very clearly and adhered to. New types of process objectives need to be formulated. In addition, one fact needs to be recognised and effectively addressed: changes are often met with mental opposition at management level as well as among co-workers, and this requires attention and leadership. Safety development is closely related to the design and operation of a business management system and its continual improvement. A deep understanding of what constitutes an efficient and effective management system affects the understanding of safety, safety culture and the ability to achieve safety goals. Concerning risk, the opinions were unambiguous: management by processes as such does not result in any further risks. On the contrary, processes give a clear view of production and

  4. Bioinformatic tools for PCR Primer design

    African Journals Online (AJOL)

    ES

    reaction (PCR), oligo hybridization and DNA sequencing. Proper primer design is one of the most important steps in successful DNA sequencing. Various bioinformatics programs are available for selecting primer pairs from a template sequence. The plethora of programs for PCR primer design reflects the ...
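
    As an illustration of the kind of check such programs perform (not any specific tool named here), the sketch below screens candidate primers on two common criteria, melting temperature and GC content, using Biopython; the sequences and acceptance thresholds are made up.

      # Sketch: screening candidate PCR primers on melting temperature (Tm) and
      # GC content with Biopython. Primers and thresholds are illustrative only.
      from Bio.SeqUtils import MeltingTemp as mt

      candidates = ["ATGCGTACGTTAGCCGTACA", "ATATATATATATATATATAT"]  # example primers

      for primer in candidates:
          tm = mt.Tm_NN(primer)                                  # nearest-neighbour Tm
          gc = 100 * sum(primer.count(b) for b in "GC") / len(primer)
          ok = 55 <= tm <= 65 and 40 <= gc <= 60                 # illustrative limits
          print(f"{primer}  Tm={tm:.1f}C  GC={gc:.0f}%  {'PASS' if ok else 'reject'}")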

  5. "Extreme Programming" in a Bioinformatics Class

    Science.gov (United States)

    Kelley, Scott; Alger, Christianna; Deutschman, Douglas

    2009-01-01

    The importance of Bioinformatics tools and methodology in modern biological research underscores the need for robust and effective courses at the college level. This paper describes such a course designed on the principles of cooperative learning based on a computer software industry production model called "Extreme Programming" (EP).…

  6. Protein raftophilicity. How bioinformatics can help membranologists

    DEFF Research Database (Denmark)

    Nielsen, Henrik; Sperotto, Maria Maddalena

    ...an artificial neural network (ANN)-based bioinformatics approach. The ANN was trained to recognize feature-based patterns in proteins that are considered to be associated with lipid rafts. The trained ANN was then used to predict protein raftophilicity. We found that, in the case of α-helical membrane proteins, their hydrophobic length does not affect...
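
    The authors' actual network architecture and feature set are not given in this record; the following is only a generic sketch, under assumed synthetic features and labels, of training a small feed-forward ANN classifier with scikit-learn.

      # Illustrative only -- not the authors' network or feature set. A small
      # feed-forward ANN (scikit-learn MLPClassifier) trained on made-up protein
      # feature vectors labelled as raft-associated (1) or not (0).
      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 4))                   # 200 proteins, 4 hypothetical features
      y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)   # synthetic raftophilicity label

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
      clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
      clf.fit(X_tr, y_tr)
      print("held-out accuracy:", clf.score(X_te, y_te))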

  7. Bioinformatics in Undergraduate Education: Practical Examples

    Science.gov (United States)

    Boyle, John A.

    2004-01-01

    Bioinformatics has emerged as an important research tool in recent years. The ability to mine large databases for relevant information has become increasingly central to many different aspects of biochemistry and molecular biology. It is important that undergraduates be introduced to the available information and methodologies. We present a…

  8. Implementing bioinformatic workflows within the bioextract server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  9. Privacy Preserving PCA on Distributed Bioinformatics Datasets

    Science.gov (United States)

    Li, Xin

    2011-01-01

    In recent years, new bioinformatics technologies, such as gene expression microarray, genome-wide association study, proteomics, and metabolomics, have been widely used to simultaneously identify a huge number of human genomic/genetic biomarkers, generate a tremendously large amount of data, and dramatically increase the knowledge on human…
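
    The privacy-preserving protocol of the cited work is not detailed here; the sketch below only illustrates the standard observation underlying many distributed PCA schemes, namely that the pooled covariance can be assembled from per-site summary statistics rather than from raw records.

      # Sketch of the aggregation idea behind many distributed PCA schemes: each
      # site shares only its summed outer products and feature sums, never raw
      # records. This is NOT the privacy-preserving protocol of the cited work.
      import numpy as np

      rng = np.random.default_rng(1)
      sites = [rng.normal(size=(n, 5)) for n in (120, 80, 200)]   # local datasets

      n_total = sum(len(s) for s in sites)
      sum_x = sum(s.sum(axis=0) for s in sites)         # aggregated feature sums
      sum_xxT = sum(s.T @ s for s in sites)             # aggregated outer products

      mean = sum_x / n_total
      cov = sum_xxT / n_total - np.outer(mean, mean)    # pooled covariance
      eigvals, eigvecs = np.linalg.eigh(cov)
      top2 = eigvecs[:, np.argsort(eigvals)[::-1][:2]]  # two leading components
      print("leading components shape:", top2.shape)

      # Cross-check against a centralized computation on the concatenated data
      cov_central = np.cov(np.vstack(sites), rowvar=False, bias=True)
      print("matches centralized covariance:", np.allclose(cov, cov_central))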

  10. Bioboxes: standardised containers for interchangeable bioinformatics software.

    Science.gov (United States)

    Belmann, Peter; Dröge, Johannes; Bremges, Andreas; McHardy, Alice C; Sczyrba, Alexander; Barton, Michael D

    2015-01-01

    Software is now both central and essential to modern biology, yet lack of availability, difficult installations, and complex user interfaces make software hard to obtain and use. Containerisation, as exemplified by the Docker platform, has the potential to solve the problems associated with sharing software. We propose bioboxes: containers with standardised interfaces to make bioinformatics software interchangeable.
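
    As a hedged illustration of invoking a containerized tool (not the bioboxes interface specification itself), the sketch below calls a hypothetical assembler image through the Docker command line from Python; the image name and mount paths are placeholders.

      # Generic sketch of calling a containerised command-line tool from Python.
      # The image name and mount paths are placeholders, not the bioboxes
      # interface specification.
      import subprocess
      from pathlib import Path

      indir = Path("data/reads").resolve()
      outdir = Path("results").resolve()
      outdir.mkdir(exist_ok=True)

      cmd = [
          "docker", "run", "--rm",
          "-v", f"{indir}:/input:ro",      # mount inputs read-only
          "-v", f"{outdir}:/output",       # mount a writable output directory
          "example/assembler:latest",      # hypothetical container image
          "--reads", "/input/sample.fastq",
          "--out", "/output/contigs.fasta",
      ]
      subprocess.run(cmd, check=True)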

  11. Development and implementation of a bioinformatics online ...

    African Journals Online (AJOL)

    Thus, there is the need for appropriate strategies of introducing the basic components of this emerging scientific field to part of the African populace through the development of an online distance education learning tool. This study involved the design of a bioinformatics online distance educative tool an implementation of ...

  12. SPECIES DATABASES AND THE BIOINFORMATICS REVOLUTION.

    Science.gov (United States)

    Biological databases are having a growth spurt. Much of this results from research in genetics and biodiversity, coupled with fast-paced developments in information technology. The revolution in bioinformatics, defined by Sugden and Pennisi (2000) as the "tools and techniques for...

  13. Waste management, waste resource facilities and waste conversion processes

    International Nuclear Information System (INIS)

    Demirbas, Ayhan

    2011-01-01

    In this study, waste management concept, waste management system, biomass and bio-waste resources, waste classification, and waste management methods have been reviewed. Waste management is the collection, transport, processing, recycling or disposal, and monitoring of waste materials. A typical waste management system comprises collection, transportation, pre-treatment, processing, and final abatement of residues. The waste management system consists of the whole set of activities related to handling, treating, disposing or recycling the waste materials. General classification of wastes is difficult. Some of the most common sources of wastes are as follows: domestic wastes, commercial wastes, ashes, animal wastes, biomedical wastes, construction wastes, industrial solid wastes, sewer, biodegradable wastes, non-biodegradable wastes, and hazardous wastes.

  14. P19-S Managing Proteomics Data from Data Generation and Data Warehousing to Central Data Repository and Journal Reviewing Processes

    Science.gov (United States)

    Thiele, H.; Glandorf, J.; Koerting, G.; Reidegeld, K.; Blüggel, M.; Meyer, H.; Stephan, C.

    2007-01-01

    In today’s proteomics research, a variety of techniques, instruments and bioinformatics tools are necessary to manage the large amount of heterogeneous data, with automatic quality control, in order to produce reliable and comparable results. Therefore a data-processing pipeline is mandatory for data validation and comparison in a data-warehousing system. The proteome bioinformatics platform ProteinScape has been proven to cover these needs. The reprocessing of HUPO BPP participants’ MS data was done within ProteinScape. The reprocessed information was transferred into the global data repository PRIDE. ProteinScape as a data-warehousing system covers two main aspects: archiving relevant data of the proteomics workflow and information extraction functionality (protein identification, quantification and generation of biological knowledge). As a strategy for automatic data validation, different protein search engines are integrated. Result analysis is performed using a decoy database search strategy, which allows the measurement of the false-positive identification rate. Peptide identifications across different workflows, different MS techniques, and different search engines are merged to obtain a quality-controlled protein list. The proteomics identifications database (PRIDE), as a public data repository, is an archiving system where data are finally stored and no longer changed by further processing steps. Data submission to PRIDE is open to proteomics laboratories generating protein and peptide identifications. An export tool has been developed for transferring all relevant HUPO BPP data from ProteinScape into PRIDE using the PRIDE.xml format. The EU-funded ProDac project will coordinate the development of software tools covering international standards for the representation of proteomics data. The implementation of data submission pipelines and systematic data collection in public standards–compliant repositories will cover all aspects, from the generation of MS data
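
    A minimal sketch of the decoy-based false-positive estimate mentioned above, assuming the common convention of approximating the false discovery rate as the ratio of decoy to target matches above a score threshold; ProteinScape's actual scoring pipeline is more elaborate.

      # Minimal sketch of the target-decoy idea used for result validation:
      # estimate the false discovery rate (FDR) at a score threshold as
      # decoys / targets above that threshold. Example scores are made up.
      def fdr_at_threshold(psms, threshold):
          """psms: list of (score, is_decoy) peptide-spectrum matches."""
          targets = sum(1 for score, is_decoy in psms if score >= threshold and not is_decoy)
          decoys = sum(1 for score, is_decoy in psms if score >= threshold and is_decoy)
          return decoys / targets if targets else 0.0

      example = [(72.1, False), (65.3, False), (40.2, True), (38.7, False), (35.0, True)]
      print(f"FDR at score 30: {fdr_at_threshold(example, 30):.2%}")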

  15. THE DEVELOPMENT OF THE PROCESS-BASED APPROACH TO MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Urij V. Lyandau

    2013-01-01

    Full Text Available This article considers the timeline of approaches to the management of industrial processes and of organizations in general. Adam Smith's idea of specialization, Henry Ford's conveyor and Frederick Taylor's scientific approach created functional corporations in which specialized departments consisted of specialized workers. Such an organizational chart was optimized for the tasks each department had to perform. As the industrial and then the informational age evolved, the external conditions of production changed, creating the need to change key factors of the management paradigm: the transfer from functional management to the process-based approach. Functional management was the basic type of management in many organizations during the 20th century; only at the end of the 1990s did companies start to integrate the process-based approach, a conversion driven by the special conditions that the informational age created.

  16. Processing Infrared Images For Fire Management Applications

    Science.gov (United States)

    Warren, John R.; Pratt, William K.

    1981-12-01

    The USDA Forest Service has used airborne infrared systems for forest fire detection and mapping for many years. The transfer of the images from plane to ground and the transposition of fire spots and perimeters to maps has been performed manually. A new system has been developed which uses digital image processing, transmission, and storage. Interactive graphics, high resolution color display, calculations, and computer model compatibility are featured in the system. Images are acquired by an IR line scanner and converted to 1024 x 1024 x 8 bit frames for transmission to the ground at a 1.544 M bit rate over a 14.7 GHZ carrier. Individual frames are received and stored, then transferred to a solid state memory to refresh the display at a conventional 30 frames per second rate. Line length and area calculations, false color assignment, X-Y scaling, and image enhancement are available. Fire spread can be calculated for display and fire perimeters plotted on maps. The performance requirements, basic system, and image processing will be described.
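
    As a hedged illustration (not the Forest Service system), the sketch below applies false-colour display and a simple hot-spot threshold to a synthetic 1024 x 1024, 8-bit frame. From the figures quoted above, one such frame of roughly 8.4 Mbit would take on the order of 5.4 seconds to transmit at the stated 1.544 Mbit/s link rate.

      # Illustration only: false-colour display and a simple hot-spot threshold
      # on a synthetic 1024 x 1024, 8-bit IR frame (not the Forest Service system).
      import numpy as np
      import matplotlib
      matplotlib.use("Agg")                  # render off-screen to a file
      import matplotlib.pyplot as plt

      frame = np.random.default_rng(2).integers(0, 256, (1024, 1024), dtype=np.uint8)
      hot = frame > 240                      # candidate fire pixels (8-bit threshold)
      print("hot pixels:", int(hot.sum()))

      plt.imshow(frame, cmap="inferno")      # false-colour display of the raw counts
      plt.contour(hot, levels=[0.5], colors="cyan")  # outline candidate fire perimeters
      plt.savefig("ir_frame_falsecolor.png", dpi=150)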

  17. Nonprofit, payload process improvement through lean management

    Science.gov (United States)

    Sampson, Melissa

    Organizations that are successful and competitive long-term have learned to efficiently utilize their resources, such as money, people, facilities, and time. Over the last half-century, there have been a variety of theories and techniques put forth on how to do this. One recent theory applied in the aerospace industry is Lean Management (LM), which emphasizes a customer focus and a rigorous elimination of activities that do not add value from the customer's perspective. LM has not, until now, been evaluated for small, nonprofit, one-off production organizations (NOPOs). Previous research on LM focused on for-profit companies and large-scale production organizations, producing relatively similar products repetitively (e.g. automobiles, commercial satellites, aircraft, and launch vehicles). One-off production organizations typically create one-of-a-kind products. The purpose of this research is to examine the applicability of LM to a NOPO. LM will improve resource utilization and thereby competitiveness, as well as exploring a new area of knowledge and research. The research methodology consists of conducting case studies, formal and informal interviews, observation and analysis in order to assess whether and how LM may be beneficial. The research focuses on one particular NOPO, BioServe Space Technologies (BST): a nonprofit, payload development organization. Additional NOPOs were interviewed in order to draw more generalized conclusions about LM benefits. The research demonstrates that LM is applicable to NOPOs, thus providing a tool to improve efficiency and competitiveness. Results from this research are guidelines for payload development organizations to implement LM, and highlighting potential LM weaknesses. A major conclusion is that LM needs some minor modifications to be applicable and useful to NOPOs, particularly in terms of value stream mapping. The LM implementation roadmap developed for NOPOs introduces customized metrics, as well as including standard

  18. Improvement of quality management in the processes of tobacco production

    OpenAIRE

    Miceski, Trajko

    2004-01-01

    Quality management, now more than ever, occupies an important place in tobacco production. It presents, above all, a continuous process aimed to satisfy the requirements of both the persons employed in tobacco industry and the layers.

  19. NHDOT : process for municipally managed state aid highway program projects

    Science.gov (United States)

    2006-05-23

    The design and construction of Municipally Managed State Aid Highway Program projects must comply with the requirements in this guideline in order to receive State Aid under the applicable provisions of RSA 235. Under this process, State Aid Construc...

  20. Advanced business process management in networked E-business scenarios

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Türetken, O.

    2017-01-01

    In the modern economy, we see a shift towards networked business scenarios. In many contemporary situations, the operation of multiple organizations is tightly coupled in collaborative business networks. To allow this tightly coupled collaboration, business process management (BPM) in these

  1. Socially Grounded Analysis of Knowledge Management Systems and Processes

    NARCIS (Netherlands)

    Guizzardi, R.S.S.; Perini, A.; Dignum, V.

    2008-01-01

    In the struggle to survive and compete in face of constant technological changes and unstable business environments, organizations recognize knowledge as its most valuable asset. Consequently, these organizations often invest on Knowledge Management (KM), seeking to enhance their internal processes

  2. Influence of Culture on the Process of Managing Decisions Adoption

    Directory of Open Access Journals (Sweden)

    Florin-Lucian Isac

    2015-01-01

    Full Text Available Different cultural environment requires a corresponding managerial environment. The process of managing decisions adoption is influenced by the values, attitudes, beliefs and behaviors of the employees.

  3. MOTORIZATION PROCESS AND MANAGEMENT IN BIG CITIES IN CHINA

    Directory of Open Access Journals (Sweden)

    Hong MA

    2007-01-01

    This paper describes and analyzes the mobility development process in big cities in China, taking Beijing as an example. The evolution of transportation policy is described. Both transportation demand management and infrastructure construction should be considered simultaneously in the policy.

  4. proposal for a lean commodity management process for the south

    African Journals Online (AJOL)

    Administrator

    The 'new Navy' will have to survive with less funds and fewer staff. ... Managing the most cost effective supply support for these product ... In order to implement a real customer-based service, process changes are ..... The local operations.

  5. Success Factors of Business Process Management Systems Implementation

    NARCIS (Netherlands)

    Johan Versendaal; J.P.P. Ravesteijn

    2007-01-01

    In this research (critical) success factors for Business Process Management Systems implementation are identified and qualitatively validated. Furthermore a list of critical success factors is constructed. Based on the identified factors a BPMS implementation approach is suggested. Future research

  6. Managing Cultural Variation in Software Process Improvement

    DEFF Research Database (Denmark)

    Kræmmergaard, Pernille; Müller, Sune Dueholm; Mathiassen, Lars

    The scale and complexity of change in software process improvement (SPI) are considerable and managerial attention to organizational culture during SPI can therefore potentially contribute to successful outcomes. However, we know little about the impact of variations in organizational subculture...... on SPI initiatives. On this backdrop, we report from a large scale SPI project in a Danish high-tech company, Terma. Two of its business units - Integrated Systems (ISY) and Airborne Systems (ASY) - followed similar approaches over a two year period, but with quite different outcomes. While ISY reached...... CMMI level 2 as planned, ASY struggled to implement even modest improvements. To explain these differences, we analyzed the underlying organizational culture within ISY and ASY using two different methods for subculture assessment. The study demonstrates how variations in culture across software...

  7. Managing Constraint Generators in Retail Design Processes

    DEFF Research Database (Denmark)

    Münster, Mia Borch; Haug, Anders

    Through six case studies of fashion store design projects, the present paper addresses this gap and sheds light on the types of constraints generated by the relevant constraint generators. The paper shows that in the cases studied......Retail design concepts are complex designs meeting functional and aesthetic demands. During a design process a retail designer has to consider various constraint generators such as stakeholder interests, physical limitations and restrictions. Obviously the architectural site, legislators...... and landlords need to be considered, as well as the interests of the client and brand owner. Furthermore the users need to be taken into account in order to develop interesting and functional shopping and working environments. Finally, suppliers and competitors may influence the design with regard...

  8. Managing Change: the people side of implementing CRM processes

    OpenAIRE

    Hann, David

    2006-01-01

    This report has been produced with the remit of analysing the people side of change management with regard to a Customer Relationship Management (CRM) process implementation at Jupiter Design (Jupiter). An increasing churn of clients and 12 years of growth have made Jupiter realise they must maximise revenues from existing clients. The adoption of a CRM approach has been suggested as a possible solution.

  9. Modelling of information processes management of educational complex

    Directory of Open Access Journals (Sweden)

    Оксана Николаевна Ромашкова

    2014-12-01

    Full Text Available This work concerns an information model of an educational complex comprising several schools. A classification of the educational complexes formed in Moscow is given. The existing organizational structure of the educational complex is considered, and a matrix management structure is suggested. The basic management information processes of the educational complex are conceptualized.

  10. Bioinformatic analysis of the nucleolus

    DEFF Research Database (Denmark)

    Leung, Anthony K L; Andersen, Jens S; Mann, Matthias

    2003-01-01

    The nucleolus is a plurifunctional, nuclear organelle, which is responsible for ribosome biogenesis and many other functions in eukaryotes, including RNA processing, viral replication and tumour suppression. Our knowledge of the human nucleolar proteome has been expanded dramatically by the two r...

  11. Blockchains for Business Process Management - Challenges and Opportunities

    OpenAIRE

    Mendling, Jan; Weber, Ingo; van der Aalst, Wil; Brocke, Jan vom; Cabanillas, Cristina; Daniel, Florian; Debois, Soren; Di Ciccio, Claudio; Dumas, Marlon; Dustdar, Schahram; Gal, Avigdor; Garcia-Banuelos, Luciano; Governatori, Guido; Hull, Richard; La Rosa, Marcello

    2017-01-01

    Blockchain technology offers a sizable promise to rethink the way inter-organizational business processes are managed because of its potential to realize execution without a central party serving as a single point of trust (and failure). To stimulate research on this promise and the limits thereof, in this paper we outline the challenges and opportunities of blockchain for Business Process Management (BPM). We first reflect how blockchains could be used in the context of the established BPM l...

  12. Automated input data management in manufacturing process simulation

    OpenAIRE

    Ettefaghian, Alireza

    2015-01-01

    Input Data Management (IDM) is a time consuming and costly process for Discrete Event Simulation (DES) projects. Input Data Management is considered as the basis of real-time process simulation (Bergmann, Stelzer and Strassburger, 2011). According to Bengtsson et al. (2009), data input phase constitutes on the average about 31% of the time of an entire simulation project. Moreover, the lack of interoperability between manufacturing applications and simulation software leads to a high cost to ...

  13. Three essential management processes of nuclear power plant operators

    International Nuclear Information System (INIS)

    Qi Tunfeng

    2010-01-01

    The paper takes the operation and management of Qinshan NPP Phase II as an example, focusing on the implementation of three essential processes: NPP production organization; training, examination and authorization of safety-related personnel; and financial budget management. A better understanding and implementation of these essential processes will enable nuclear power plants to control nuclear safety effectively at the most fundamental managerial level. (author)

  14. Design of distributed systems of hydrolithosphere processes management. A synthesis of distributed management systems

    Science.gov (United States)

    Pershin, I. M.; Pervukhin, D. A.; Ilyushin, Y. V.; Afanaseva, O. V.

    2017-10-01

    The paper considers the important problem of designing distributed systems for the management of hydrolithosphere processes. The control actions on the hydrolithosphere processes under consideration are implemented by a set of extraction wells. The article presents a method for defining the approximation links used to describe the dynamic characteristics of hydrolithosphere processes. The structure of the distributed regulators used in the management systems for the considered processes is presented. The paper analyses the results of synthesizing the distributed management system and of modelling the closed-loop control of the parameters of the hydrolithosphere process.
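
    The paper's distributed regulators are not specified in this record; purely as an illustration of checking a closed loop against an approximated process, the sketch below runs a discrete PI regulator on a first-order plant with made-up parameters.

      # Generic sketch: closed-loop check of a first-order process approximation
      # (gain K, time constant tau) under a discrete PI regulator. Parameter
      # values are illustrative, not taken from the paper.
      K, tau, dt = 2.0, 50.0, 1.0          # plant gain, time constant, time step
      kp, ki = 0.8, 0.02                   # PI gains
      setpoint, y, integral = 1.0, 0.0, 0.0

      trajectory = []
      for _ in range(600):
          error = setpoint - y
          integral += error * dt
          u = kp * error + ki * integral            # controller output (e.g. well flow)
          y += dt / tau * (K * u - y)               # first-order plant update
          trajectory.append(y)

      print("final level:", round(trajectory[-1], 3))  # settles near the setpoint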

  15. NPTool: Towards Scalability and Reliability of Business Process Management

    Science.gov (United States)

    Braghetto, Kelly Rosa; Ferreira, João Eduardo; Pu, Calton

    Currently one important challenge in business process management is provide at the same time scalability and reliability of business process executions. This difficulty becomes more accentuated when the execution control assumes complex countless business processes. This work presents NavigationPlanTool (NPTool), a tool to control the execution of business processes. NPTool is supported by Navigation Plan Definition Language (NPDL), a language for business processes specification that uses process algebra as formal foundation. NPTool implements the NPDL language as a SQL extension. The main contribution of this paper is a description of the NPTool showing how the process algebra features combined with a relational database model can be used to provide a scalable and reliable control in the execution of business processes. The next steps of NPTool include reuse of control-flow patterns and support to data flow management.

  16. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  17. Navigating the changing learning landscape: perspective from bioinformatics.ca.

    Science.gov (United States)

    Brazas, Michelle D; Ouellette, B F Francis

    2013-09-01

    With the advent of YouTube channels in bioinformatics, open platforms for problem solving in bioinformatics, active web forums in computing analyses and online resources for learning to code or use a bioinformatics tool, the more traditional continuing education bioinformatics training programs have had to adapt. Bioinformatics training programs that solely rely on traditional didactic methods are being superseded by these newer resources. Yet such face-to-face instruction is still invaluable in the learning continuum. Bioinformatics.ca, which hosts the Canadian Bioinformatics Workshops, has blended more traditional learning styles with current online and social learning styles. Here we share our growing experiences over the past 12 years and look toward what the future holds for bioinformatics training programs.

  18. Implementation of Knowledge Management as Process to Management System of ÚJD SR

    International Nuclear Information System (INIS)

    Szabó, V.

    2016-01-01

    Full text: This presentation provides basic information about the development of staff knowledge management at the Nuclear Regulatory Authority of the Slovak Republic. It is a case study of the implementation of knowledge management as a process within the integrated management system of the Slovak regulatory body. (author)

  19. Distributed resource management across process boundaries

    KAUST Repository

    Suresh, Lalith

    2017-09-27

    Multi-tenant distributed systems composed of small services, such as Service-oriented Architectures (SOAs) and Micro-services, raise new challenges in attaining high performance and efficient resource utilization. In these systems, a request execution spans tens to thousands of processes, and the execution paths and resource demands on different services are generally not known when a request first enters the system. In this paper, we highlight the fundamental challenges of regulating load and scheduling in SOAs while meeting end-to-end performance objectives on metrics of concern to both tenants and operators. We design Wisp, a framework for building SOAs that transparently adapts rate limiters and request schedulers system-wide according to operator policies to satisfy end-to-end goals while responding to changing system conditions. In evaluations against production as well as synthetic workloads, Wisp successfully enforces a range of end-to-end performance objectives, such as reducing average latencies, meeting deadlines, providing fairness and isolation, and avoiding system overload.
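
    Wisp's adaptive mechanisms are not detailed in this record; the sketch below shows only a minimal token-bucket rate limiter of the general kind such frameworks adjust per service, with illustrative rate and burst values.

      # Minimal token-bucket rate limiter of the kind a framework like Wisp would
      # adapt per service; an illustration, not Wisp's implementation.
      import time

      class TokenBucket:
          def __init__(self, rate, burst):
              self.rate = rate              # tokens added per second
              self.capacity = burst         # maximum burst size
              self.tokens = burst
              self.last = time.monotonic()

          def allow(self, cost=1.0):
              now = time.monotonic()
              self.tokens = min(self.capacity,
                                self.tokens + (now - self.last) * self.rate)
              self.last = now
              if self.tokens >= cost:
                  self.tokens -= cost
                  return True               # admit the request
              return False                  # shed or queue the request

      limiter = TokenBucket(rate=100.0, burst=20)
      admitted = sum(limiter.allow() for _ in range(1000))
      print("admitted:", admitted)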

  20. Distributed resource management across process boundaries

    KAUST Repository

    Suresh, Lalith; Bodik, Peter; Menache, Ishai; Canini, Marco; Ciucu, Florin

    2017-01-01

    Multi-tenant distributed systems composed of small services, such as Service-oriented Architectures (SOAs) and Micro-services, raise new challenges in attaining high performance and efficient resource utilization. In these systems, a request execution spans tens to thousands of processes, and the execution paths and resource demands on different services are generally not known when a request first enters the system. In this paper, we highlight the fundamental challenges of regulating load and scheduling in SOAs while meeting end-to-end performance objectives on metrics of concern to both tenants and operators. We design Wisp, a framework for building SOAs that transparently adapts rate limiters and request schedulers system-wide according to operator policies to satisfy end-to-end goals while responding to changing system conditions. In evaluations against production as well as synthetic workloads, Wisp successfully enforces a range of end-to-end performance objectives, such as reducing average latencies, meeting deadlines, providing fairness and isolation, and avoiding system overload.

  1. Management of Purex spent solvents by the alkaline hydrolysis process

    International Nuclear Information System (INIS)

    Srinivas, C.; Manohar, Smitha; Vincent, Tessy; Wattal, P.K.; Theyyunni, T.K.

    1995-01-01

    Various treatment processes were evaluated on a laboratory scale for the management of the spent solvent from the extraction of nuclear materials. Based on the lab scale evaluation it is proposed to adopt the alkaline hydrolysis process as the treatment mode for the spent solvent. The process has advantages over the other processes in terms of simplicity, low cost and ease of disposal of the secondary waste generated. (author)

  2. BUSINESS PROCESS MANAGEMENT IN INSURANCE CASE OF JADRANSKO INSURANCE COMPANY

    OpenAIRE

    Sanja Coric; Danijel Bara

    2014-01-01

    Selling insurance products amid today's modern technological solutions faces numerous challenges. Business processes in insurance, as well as their results, are the real interface to policyholders. Modeling and analysis of business processes in insurance enable organizations to focus on the customer and to increase the efficiency and quality of work. Managing critical business processes in every organization, and likewise in insurance, is a key factor ...

  3. Innovation management and performance evaluation: structured process of literature review

    Directory of Open Access Journals (Sweden)

    Julieta Scheidt Dienstmann

    2014-03-01

    Full Text Available This article aims to provide a process for building the knowledge researchers need at the initial stage of their work on innovation management. To meet this need, the ProKnow-C (Knowledge Development Process - Constructivist) was adopted, which proposes constructing the researcher's knowledge from their perceptions of the subject and from the scientific recognition of the articles analyzed. For this article, the knowledge generated in the researcher means knowing the main journals, articles, authors and keywords associated with 15 scientifically recognized articles aligned with the researcher's perception of innovation management, with a focus on results. Through this application, the ProKnow-C process is presented, demonstrating how researchers can use it to meet their initial needs for building knowledge about innovation management, and the article aims to encourage future work based on structured processes for selecting a theoretical framework in this field of knowledge.

  4. THE IMPORTANCE OF PERSONNEL MOTIVATION IN THE MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    NĂSTASIE MIHAELA – ANDREEA

    2015-07-01

    Full Text Available The general research area of this article is personnel motivation, an essential tool in the management process and a component derived from human resource management. In economic activity, personnel motivation should be regarded as an internal process, not as an imperative that can be imposed from outside the economic entity. Managers of economic entities must first understand personnel motivation strategies and how they positively or negatively influence the internal motivations of employees. Personnel motivation thus serves an end that is both profitable and moral, contributing to individual and social welfare.

  5. A decision model for the risk management of hazardous processes

    International Nuclear Information System (INIS)

    Holmberg, J.E.

    1997-03-01

    A decision model for risk management of hazardous processes as an optimisation problem of a point process is formulated in the study. In the approach, the decisions made by the management are divided into three categories: (1) planned process lifetime, (2) selection of the design, and (3) operational decisions. These three controlling methods play quite different roles in practical risk management, which is also reflected in our approach. The optimisation of the process lifetime is related to the licensing problem of the process. It provides a boundary condition for a feasible utility function that is used as the actual objective function, i.e., maximizing the process lifetime utility. By design modifications, the management can affect the inherent accident hazard rate of the process. This is usually a discrete optimisation task. The study particularly concentrates upon the optimisation of the operational strategies given a certain design and licensing time. This is done by a dynamic risk model (marked point process model) representing the stochastic process of events observable or unobservable to the decision maker. An optimal long-term control variable guiding the selection of operational alternatives in short-term problems is studied. The optimisation problem is solved by the stochastic quasi-gradient procedure. The approach is illustrated by a case study. (23 refs.)
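
    As a hedged sketch of the objective described above (the notation is assumed here, not taken from the report), the operational strategy u and the planned lifetime T could be chosen to maximise an expected discounted lifetime utility penalised by the losses of the accident point process:

    ```latex
    % Illustrative formulation only; all symbols are assumptions, not the report's notation.
    \max_{u,\;T}\quad
      \mathbb{E}\!\left[\int_{0}^{T} e^{-\rho t}\, b\bigl(x_t, u_t\bigr)\,dt
      \;-\; \sum_{i\,:\,T_i \le T} c\bigl(x_{T_i}\bigr)\right]
    % b(x_t,u_t) : utility rate of operating the process in state x_t under control u_t
    % T_i        : event times of the (marked) point process of incidents
    % c(x_{T_i}) : loss associated with an incident occurring in state x_{T_i}
    % \rho       : discount rate;  T : planned process lifetime (licensing horizon)
    ```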

  6. UTILITY OF ANNUAL FINANCIAL STATEMENTS IN THE MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    PUICAN LILIANA

    2015-07-01

    Full Text Available Knowledge of the financial situation of an economic entity presupposes the use of analysis and synthesis as indispensable tools of investigation. The financial management of the economic entity must play a basic role in the strategic financial decisions that address the effective management of the entity's optimal, balanced and proportionate growth. It therefore becomes necessary and imperative to objectively analyse the implications of current financial management in economic entities and to familiarize managers with the basic tools with which they operate: knowledge about planning and financial control, techniques for evaluating investment projects, how to conduct financial and economic diagnosis, and management control of the entity, the key issues in orienting it towards performance.

  7. QUALITY MANAGEMENT IN TERMS OF STRENGTHENING THE "THREE LINES OF DEFENCE" IN RISK MANAGEMENT - PROCESS APPROACH

    OpenAIRE

    Radoica Luburic; Milan Perovic; Rajko Sekulovic

    2015-01-01

    The authors of the paper analyze risk management processes contained in the model considering its longstanding application in many European organisations. The paper also analyses quality management system (QMS) standards and risk management (RM) standards. It particularly addresses the process approach to QMS which, when coupled with active involvement of employees and constant improvements, makes this approach interesting as the topic of this paper. The analyses herein resulted in integratin...

  8. Component-Based Approach for Educating Students in Bioinformatics

    Science.gov (United States)

    Poe, D.; Venkatraman, N.; Hansen, C.; Singh, G.

    2009-01-01

    There is an increasing need for an effective method of teaching bioinformatics. Increased progress and availability of computer-based tools for educating students have led to the implementation of a computer-based system for teaching bioinformatics as described in this paper. Bioinformatics is a recent, hybrid field of study combining elements of…

  9. Conceptual framework for the mapping of management process with information technology in a business process.

    Science.gov (United States)

    Rajarathinam, Vetrickarthick; Chellappa, Swarnalatha; Nagarajan, Asha

    2015-01-01

    This study on a component framework reveals the importance of mapping management processes with technology in a business environment. We define ERP as a software tool that has to provide a business solution, but not necessarily an integration of all departments. Any business process can be classified as a management process, an operational process or a supportive process. We went through the entire management process and were able to identify the influencing components to be mapped with a technology for a business solution. Governance, strategic management and decision making are thoroughly discussed, and the need to map these components with the ERP is clearly explained. We also suggest that implementing this framework might reduce ERP failures and, in particular, rectify ERP misfit.

  10. Keemei: cloud-based validation of tabular bioinformatics file formats in Google Sheets.

    Science.gov (United States)

    Rideout, Jai Ram; Chase, John H; Bolyen, Evan; Ackermann, Gail; González, Antonio; Knight, Rob; Caporaso, J Gregory

    2016-06-13

    Bioinformatics software often requires human-generated tabular text files as input and has specific requirements for how those data are formatted. Users frequently manage these data in spreadsheet programs, which is convenient for researchers who are compiling the requisite information because the spreadsheet programs can easily be used on different platforms including laptops and tablets, and because they provide a familiar interface. It is increasingly common for many different researchers to be involved in compiling these data, including study coordinators, clinicians, lab technicians and bioinformaticians. As a result, many research groups are shifting toward using cloud-based spreadsheet programs, such as Google Sheets, which support the concurrent editing of a single spreadsheet by different users working on different platforms. Most of the researchers who enter data are not familiar with the formatting requirements of the bioinformatics programs that will be used, so validating and correcting file formats is often a bottleneck prior to beginning bioinformatics analysis. We present Keemei, a Google Sheets Add-on, for validating tabular files used in bioinformatics analyses. Keemei is available free of charge from Google's Chrome Web Store. Keemei can be installed and run on any web browser supported by Google Sheets. Keemei currently supports the validation of two widely used tabular bioinformatics formats, the Quantitative Insights into Microbial Ecology (QIIME) sample metadata mapping file format and the Spatially Referenced Genetic Data (SRGD) format, but is designed to easily support the addition of others. Keemei will save researchers time and frustration by providing a convenient interface for tabular bioinformatics file format validation. By allowing everyone involved with data entry for a project to easily validate their data, it will reduce the validation and formatting bottlenecks that are commonly encountered when human-generated data files are
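
    Keemei itself runs inside Google Sheets; as a rough stand-in for the kind of checks such a validator performs (the column names and rules below are illustrative, not Keemei's actual rule set), a tab-delimited sample metadata file might be screened like this:

    ```python
    import csv
    from typing import List

    REQUIRED_COLUMNS = ["#SampleID", "BarcodeSequence", "LinkerPrimerSequence", "Description"]
    VALID_BASES = set("ACGT")

    def validate_mapping_file(path: str) -> List[str]:
        """Return human-readable problems found in a tab-delimited mapping file.

        Hypothetical checks for illustration: required headers present, sample IDs
        unique, and barcodes restricted to A/C/G/T.
        """
        problems = []
        with open(path, newline="") as handle:
            rows = list(csv.DictReader(handle, delimiter="\t"))
        if not rows:
            return ["file contains no data rows"]
        header = rows[0].keys()
        for column in REQUIRED_COLUMNS:
            if column not in header:
                problems.append(f"missing required column: {column}")
        seen_ids = set()
        for line_no, row in enumerate(rows, start=2):   # header is line 1
            sample_id = row.get("#SampleID", "")
            if sample_id in seen_ids:
                problems.append(f"line {line_no}: duplicate sample ID {sample_id!r}")
            seen_ids.add(sample_id)
            barcode = row.get("BarcodeSequence", "")
            if barcode and not set(barcode) <= VALID_BASES:
                problems.append(f"line {line_no}: barcode {barcode!r} contains non-ACGT characters")
        return problems
    ```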

  11. Business process management demystified : a tutorial on models, systems and standards for workflow management

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Desel, J.; Reisig, W.; Rozenberg, G.

    2004-01-01

    Over the last decade there has been a shift from data-aware information systems to process-aware information systems. To support business processes an enterprise information system needs to be aware of these processes and their organizational context. Business Process Management (BPM) includes

  12. Contingency Management and deliberative decision-making processes

    Directory of Open Access Journals (Sweden)

    Paul S. Regier

    2015-06-01

    Full Text Available Contingency Management is an effective treatment for drug addiction. The current explanation for its success is rooted in alternative reinforcement theory. We suggest that alternative reinforcement theory is inadequate to explain the success of Contingency Management and produce a model based on demand curves that show how little the monetary rewards offered in this treatment would affect drug use. Instead, we offer an explanation of its success based on the concept that it accesses deliberative decision-making processes. We suggest that Contingency Management is effective because it offers a concrete and immediate alternative to using drugs, which engages deliberative processes, improves the ability of those deliberative processes to attend to non-drug options, and offsets more automatic action-selection systems. This theory makes explicit predictions that can be tested, suggests which users will be most helped by Contingency Management, and suggests improvements in its implementation.

  13. Processes, Performance Drivers and ICT Tools in Human Resources Management

    Directory of Open Access Journals (Sweden)

    Oškrdal Václav

    2011-06-01

    Full Text Available This article presents an insight into processes, performance drivers and ICT tools in the human resources (HR) management area. On the basis of a modern approach to HR management, a set of business processes that are handled by today’s HR managers is defined. Consequently, the concept of ICT-supported performance drivers and their relevance in the area of HR management, as well as the relationship between HR business processes, performance drivers and ICT tools, are defined. The theoretical outcomes are further enhanced with results obtained from a survey among Czech companies. This article was written with the kind courtesy of funding provided by the VŠE IGA grant „IGA – 32/2010“.

  14. Contingency Management and Deliberative Decision-Making Processes.

    Science.gov (United States)

    Regier, Paul S; Redish, A David

    2015-01-01

    Contingency management is an effective treatment for drug addiction. The current explanation for its success is rooted in alternative reinforcement theory. We suggest that alternative reinforcement theory is inadequate to explain the success of contingency management and produce a model based on demand curves that show how little the monetary rewards offered in this treatment would affect drug use. Instead, we offer an explanation of its success based on the concept that it accesses deliberative decision-making processes. We suggest that contingency management is effective because it offers a concrete and immediate alternative to using drugs, which engages deliberative processes, improves the ability of those deliberative processes to attend to non-drug options, and offsets more automatic action-selection systems. This theory makes explicit predictions that can be tested, suggests which users will be most helped by contingency management, and suggests improvements in its implementation.

  15. REALIZING BUSINESS PROCESS MANAGEMENT BY HELP OF A PROCESS MAPPING DATABASE TOOL

    CERN Document Server

    Vergili, Ceren

    2016-01-01

    In a typical business sector, processes are the building blocks of achievement, and a considerable percentage of them are business processes; business sectors therefore need a management discipline for them. Business Process Management (BPM) is a discipline that combines modelling, automation, execution, control, measurement and optimization of processes, taking into account enterprise goals and spanning systems, employees, customers and partners. CERN’s EN – HE – HM section desires to apply the BPM discipline appropriately to improve its necessary technical, administrative and managerial actions, to supply appropriate CERN industrial transport, handling and lifting equipment and to maintain it. For this reason, a Process Mapping Database Tool has been created to develop a common understanding of how the section members can visualize their processes, agree on quality standards and decide how to improve. It provides management support by establishing Process Charts...

  16. Improving IC process efficiency with critical materials management

    Science.gov (United States)

    Hanson, Kathy L.; Andrews, Robert E.

    2003-06-01

    The management of critical materials in a high technology manufacturing facility is crucial to obtaining consistently high production yield. This is especially true in an industry like semiconductors, where the success of the product is so dependent on the integrity of the critical production materials. Bar code systems, the traditional management tools, are voluntary, defeatable, and do not continuously monitor materials when in use. The significant costs associated with mis-management of chemicals can be captured with a customized model, resulting in highly favorable ROIs for the NOWTrak RFID chemical management system. This system transmits reliable chemical data about each individual container and generates information that can be used to increase wafer production efficiency and yield. The future of the RFID system will expand beyond the benefits of chemical management and into dynamic IC process management.

  17. Next-generation Process Management with ADEPT2 (Demo Paper)

    NARCIS (Netherlands)

    Göser, Kevin; Jurisch, Martin; Acker, Hilmar; Kreher, Ulrich; Lauer, Markus; Rinderle, S.B.; Reichert, M.U.; Dadam, Peter

    Short time-to-market, easy adaptation to changes in business environment, and robustness of processes are key requirements in today’s business world. In the IT area of Business Process Management (BPM), solutions claim to satisfy these new demands, but are still not sufficient. In this paper we

  18. Software process improvement: controlling developers, managers or users?

    DEFF Research Database (Denmark)

    Nørbjerg, Jacob

    1999-01-01

    The paper discusses how the latest trend in the management of software development, software process improvement (SPI), may affect user-developer relations. At the outset, SPI concerns the "internal workings" of software organisations, but it may also be interpreted as one way to give the developer organisation more control over the development process and the relations with the user organization.

  19. A Survey on Evaluation Factors for Business Process Management Technology

    NARCIS (Netherlands)

    Mutschler, B.B.; Reichert, M.U.

    2006-01-01

    Estimating the value of business process management (BPM) technology is a difficult task to accomplish. Computerized business processes have a strong impact on an organization, and BPM projects have a long-term cost amortization. To systematically analyze BPM technology from an economic-driven

  20. Business process compliance management : an integrated proactive approach

    NARCIS (Netherlands)

    Elgammal, A.; Sebahi, S.; Turetken, O.; Hacid, M.-S.; Papazoglou, M.; van den Heuvel, W.; Soliman, K.S.

    2014-01-01

    Today’s enterprises demand a high degree of compliance of business processes to meet regulations, such as Sarbanes-Oxley and Basel I-III. To ensure continuous guaranteed compliance, compliance management should be considered during all phases of the business process lifecycle; from the analysis and

  1. Innovation process and innovativeness of facility management organizations

    NARCIS (Netherlands)

    Mudrak, T.; Wagenberg, van A.F.; Wubben, E.F.M.

    2005-01-01

    Purpose - The innovation patterns and processes in facility management (FM) organizations are crucial for the development of FM as a discipline, but they are not yet fully explored and understood. This paper aims to clarify FM innovation from the perspective of innovation processes and the

  2. Enhancing the Teaching-Learning Process: A Knowledge Management Approach

    Science.gov (United States)

    Bhusry, Mamta; Ranjan, Jayanthi

    2012-01-01

    Purpose: The purpose of this paper is to emphasize the need for knowledge management (KM) in the teaching-learning process in technical educational institutions (TEIs) in India, and to assert the impact of information technology (IT) based KM intervention in the teaching-learning process. Design/methodology/approach: The approach of the paper is…

  3. Process models as tools in forestry research and management

    Science.gov (United States)

    Kurt Johnsen; Lisa Samuelson; Robert Teskey; Steve McNulty; Tom Fox

    2001-01-01

    Forest process models are mathematical representations of biological systems that incorporate our understanding of physiological and ecological mechanisms into predictive algorithms. These models were originally designed and used for research purposes, but are being developed for use in practical forest management. Process models designed for research...

  4. An adaptive management process for forest soil conservation.

    Science.gov (United States)

    Michael P. Curran; Douglas G. Maynard; Ronald L. Heninger; Thomas A. Terry; Steven W. Howes; Douglas M. Stone; Thomas Niemann; Richard E. Miller; Robert F. Powers

    2005-01-01

    Soil disturbance guidelines should be based on comparable disturbance categories adapted to specific local soil conditions, validated by monitoring and research. Guidelines, standards, and practices should be continually improved based on an adaptive management process, which is presented in this paper. Core components of this process include: reliable monitoring...

  5. Bioinformatics and systems biology research update from the 15th International Conference on Bioinformatics (InCoB2016).

    Science.gov (United States)

    Schönbach, Christian; Verma, Chandra; Bond, Peter J; Ranganathan, Shoba

    2016-12-22

    The International Conference on Bioinformatics (InCoB) has been publishing peer-reviewed conference papers in BMC Bioinformatics since 2006. Of the 44 articles accepted for publication in supplement issues of BMC Bioinformatics, BMC Genomics, BMC Medical Genomics and BMC Systems Biology, 24 articles with a bioinformatics or systems biology focus are reviewed in this editorial. InCoB2017 is scheduled to be held in Shenzhen, China, September 20-22, 2017.

  6. Project and Innovation Management in New Product Development Processes

    DEFF Research Database (Denmark)

    Henriksen, Leif; Gayretli, Ahmet

    2010-01-01

    Although the process of innovation is one of the most important drivers behind the growth and prosperity of today’s global economy, it is one of the least understood. This paper aims to address specific problems in carrying out new product development processes. There are crucial issues related to product design processes, such as inefficient project management, increasing product complexity, conflict management, the shortfall of existing methods and tools, and high failure rates in new product introduction. A new approach has been proposed for a system-based platform, which consists of a product platform...

  7. Editorial: Learning, teaching and disseminating knowledge in business process management

    Directory of Open Access Journals (Sweden)

    Jürgen Moormann

    2012-12-01

    Full Text Available Process-oriented thinking has become the major paradigm for managing companies and other organizations. The push for better processes has been even more intense due to rapidly evolving client needs, borderless global markets and innovations swiftly penetrating the market. Thus, education is decisive for successfully introducing and implementing Business Process Management (BPM initiatives. However, BPM education has been an area of challenge. This special issue aims to provide current research on various aspects of BPM education. It is an initial effort for consolidating better practices, experiences and pedagogical outcomes founded with empirical evidence to contribute towards the three pillars of education: learning, teaching, and disseminating knowledge in BPM.

  8. A systematic process for developing and assessing accident management plans

    International Nuclear Information System (INIS)

    Hanson, D.J.; Blackman, H.S.; Meyer, O.R.; Ward, L.W.

    1991-04-01

    This document describes a four-phase approach for developing criteria recommended for use in assessing the adequacy of nuclear power plant accident management plans. Two phases of the approach have been completed and provide a prototype process that could be used to develop an accident management plan. Based on this process, a preliminary set of assessment criteria are derived. These preliminary criteria will be refined and improved when the remaining steps of the approach are completed, that is, after the prototype process is validated through application. 9 refs., 10 figs., 7 tabs

  9. ENHANCING LEAN SUPPLY CHAIN MATURITY WITH BUSINESS PROCESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Jurij Jaklic

    2006-12-01

    Full Text Available In today’s global market the main focus of competition is not only between different companies but also between supply chains. Technological changes and organizational improvements are important for effective supply chain management (SCM); however, the main cause of SCM improvements is not the implementation of an information system (IS) itself, but rather a change and an integration of business processes. The paper summarizes the most important concepts of SCM and specifically concentrates on the importance of business process management (BPM) in supply chains, because the full advantages can be realized when business processes in the supply chain are well defined, integrated and managed. The main purpose of this paper is to show that successful SCM calls for the maturity of supply chain processes in all involved companies and at the supply chain level, which can be achieved by using effective BPM methods. A necessary condition for the growth of SCM in terms of supply chain process maturity levels is the development of an inter-organizational information system and process renovation. Yet BPM should not be considered a one-time project of IS implementation and process change, but a permanent practice of process performance measurement, analysis and continuous improvement of the supply chain processes. The concepts are illustrated with a case study of a fuel supply process.

  10. Developing cloud-based Business Process Management (BPM): a survey

    Science.gov (United States)

    Mercia; Gunawan, W.; Fajar, A. N.; Alianto, H.; Inayatulloh

    2018-03-01

    In today’s highly competitive business environment, modern enterprises are dealing with difficulties in cutting unnecessary costs, eliminating waste and delivering huge benefits to the organization. Companies are increasingly turning to a more flexible IT environment to help them realize this goal. For this reason, the article applies cloud-based Business Process Management (BPM), which enables a focus on modeling, monitoring and process management. Cloud-based BPM consists of business processes, business information and IT resources, which help build real-time intelligence systems based on business management and cloud technology. Cloud computing is a paradigm that involves procuring dynamically measurable resources over the internet as an IT resource service. A cloud-based BPM service addresses common problems faced by traditional BPM, especially in promoting flexible, event-driven business processes that exploit opportunities in the marketplace.

  11. Improvement of radiology services based on the process management approach.

    Science.gov (United States)

    Amaral, Creusa Sayuri Tahara; Rozenfeld, Henrique; Costa, Janaina Mascarenhas Hornos; Magon, Maria de Fátima de Andrade; Mascarenhas, Yvone Maria

    2011-06-01

    The health sector requires continuous investments to ensure the improvement of products and services from a technological standpoint, the use of new materials, equipment and tools, and the application of process management methods. Methods associated with the process management approach, such as the development of reference models of business processes, can provide significant innovations in the health sector and respond to the current market trend for modern management in this sector (Gunderman et al. (2008) [4]). This article proposes a process model for diagnostic medical X-ray imaging, from which it derives a primary reference model and describes how this information leads to gains in quality and improvements. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  12. Bioinformatics in New Generation Flavivirus Vaccines

    Directory of Open Access Journals (Sweden)

    Penelope Koraka

    2010-01-01

    Full Text Available Flavivirus infections are the most prevalent arthropod-borne infections world wide, often causing severe disease especially among children, the elderly, and the immunocompromised. In the absence of effective antiviral treatment, prevention through vaccination would greatly reduce morbidity and mortality associated with flavivirus infections. Despite the success of the empirically developed vaccines against yellow fever virus, Japanese encephalitis virus and tick-borne encephalitis virus, there is an increasing need for a more rational design and development of safe and effective vaccines. Several bioinformatic tools are available to support such rational vaccine design. In doing so, several parameters have to be taken into account, such as safety for the target population, overall immunogenicity of the candidate vaccine, and efficacy and longevity of the immune responses triggered. Examples of how bio-informatics is applied to assist in the rational design and improvements of vaccines, particularly flavivirus vaccines, are presented and discussed.

  13. Dissociated neural processing for decisions in managers and non-managers.

    Directory of Open Access Journals (Sweden)

    Svenja Caspers

    Full Text Available Functional neuroimaging studies of decision-making so far mainly focused on decisions under uncertainty or negotiation with other persons. Dual process theory assumes that, in such situations, decision making relies on either a rapid intuitive, automated or a slower rational processing system. However, it still remains elusive how personality factors or professional requirements might modulate the decision process and the underlying neural mechanisms. Since decision making is a key task of managers, we hypothesized that managers, facing higher pressure for frequent and rapid decisions than non-managers, prefer the heuristic, automated decision strategy in contrast to non-managers. Such different strategies may, in turn, rely on different neural systems. We tested managers and non-managers in a functional magnetic resonance imaging study using a forced-choice paradigm on word-pairs. Managers showed subcortical activation in the head of the caudate nucleus, and reduced hemodynamic response within the cortex. In contrast, non-managers revealed the opposite pattern. With the head of the caudate nucleus being an initiating component for process automation, these results supported the initial hypothesis, hinting at automation during decisions in managers. More generally, the findings reveal how different professional requirements might modulate cognitive decision processing.

  14. Dissociated neural processing for decisions in managers and non-managers.

    Science.gov (United States)

    Caspers, Svenja; Heim, Stefan; Lucas, Marc G; Stephan, Egon; Fischer, Lorenz; Amunts, Katrin; Zilles, Karl

    2012-01-01

    Functional neuroimaging studies of decision-making so far mainly focused on decisions under uncertainty or negotiation with other persons. Dual process theory assumes that, in such situations, decision making relies on either a rapid intuitive, automated or a slower rational processing system. However, it still remains elusive how personality factors or professional requirements might modulate the decision process and the underlying neural mechanisms. Since decision making is a key task of managers, we hypothesized that managers, facing higher pressure for frequent and rapid decisions than non-managers, prefer the heuristic, automated decision strategy in contrast to non-managers. Such different strategies may, in turn, rely on different neural systems. We tested managers and non-managers in a functional magnetic resonance imaging study using a forced-choice paradigm on word-pairs. Managers showed subcortical activation in the head of the caudate nucleus, and reduced hemodynamic response within the cortex. In contrast, non-managers revealed the opposite pattern. With the head of the caudate nucleus being an initiating component for process automation, these results supported the initial hypothesis, hinting at automation during decisions in managers. More generally, the findings reveal how different professional requirements might modulate cognitive decision processing.

  15. Lean process management implementation through enhanced problem solving capabilities

    Directory of Open Access Journals (Sweden)

    Perumal Puvanasvaran

    2010-12-01

    Full Text Available Original Equipment Manufacturer (OEM) organizations in the aerospace, automotive and electronics industries have had to upgrade their functions. These organizations, including suppliers and solution providers, are bound to improve their functions through strategic initiatives. One such initiative is Lean Process Management. Lean Process Management has been proven to aid organizations in developing manufacturing and administrative management solutions and to make the organization leaner and, at the same time, ‘fitter’, achieving world-class standards in terms of production, quality, marketing, and so on. The problem is that, although a number of authors, experts and researchers have discussed lean process management as an organization-centric issue, they have failed to provide an effective lean process management system. Besides the need to formulate an effective lean process, another important aspect is employee development: how to unlock the potential of the workforce. This development is essentially the problem-solving capability of employees while implementing Lean through clear protocols or processes of Lean Process Management. Employees need to be developed and equipped to contribute optimally to the process. The main objective of this study is therefore to develop an employee development system, which the author has termed and trademarked the People Development System (PDS), to enhance problem-solving capability among employees while implementing lean process management. Although the PDS can be implemented throughout the organization, implementing it in a particular department makes it feasible to study and analyze its effectiveness in depth. This study therefore documents and analyzes the implementation of the Lean process in the Kitting Department of the aerospace company, ABC Company

  16. Investigating Individuals' Intention to be Involved in Knowledge Management Process

    OpenAIRE

    M. J.M. Razi; N. S.A. Karim

    2011-01-01

    Problem statement: Implementation of the Knowledge Management (KM) process in organizations is considered essential for remaining competitive in the present competitive world. Though modern KM practices depend highly on technology, individuals’ (organizational members’) intention to be involved in the KM process plays a major role in its success. Hence, the evaluation of individuals’ intention is deemed significant before the actual implementation of the KM process in organizations. Nevertheless, inadequa...

  17. Bioinformatics of cardiovascular miRNA biology.

    Science.gov (United States)

    Kunz, Meik; Xiao, Ke; Liang, Chunguang; Viereck, Janika; Pachel, Christina; Frantz, Stefan; Thum, Thomas; Dandekar, Thomas

    2015-12-01

    MicroRNAs (miRNAs) are small ~22 nucleotide non-coding RNAs and are highly conserved among species. Moreover, miRNAs regulate gene expression of a large number of genes associated with important biological functions and signaling pathways. Recently, several miRNAs have been found to be associated with cardiovascular diseases. Thus, investigating the complex regulatory effect of miRNAs may lead to a better understanding of their functional role in the heart. To achieve this, bioinformatics approaches have to be coupled with validation and screening experiments to understand the complex interactions of miRNAs with the genome. This will boost the subsequent development of diagnostic markers and our understanding of the physiological and therapeutic role of miRNAs in cardiac remodeling. In this review, we focus on and explain different bioinformatics strategies and algorithms for the identification and analysis of miRNAs and their regulatory elements to better understand cardiac miRNA biology. Starting with the biogenesis of miRNAs, we present approaches such as LocARNA and miRBase for combining sequence and structure analysis including phylogenetic comparisons as well as detailed analysis of RNA folding patterns, functional target prediction, signaling pathway as well as functional analysis. We also show how far bioinformatics helps to tackle the unprecedented level of complexity and systemic effects by miRNA, underlining the strong therapeutic potential of miRNA and miRNA target structures in cardiovascular disease. In addition, we discuss drawbacks and limitations of bioinformatics algorithms and the necessity of experimental approaches for miRNA target identification. This article is part of a Special Issue entitled 'Non-coding RNAs'. Copyright © 2014 Elsevier Ltd. All rights reserved.
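
    Much of the functional target prediction mentioned above ultimately rests on matching the miRNA seed region (bases 2 to 8) against candidate 3'UTR sequences. The toy sketch below shows only that core step and deliberately ignores the conservation, site-context and free-energy scoring that real predictors apply; the example sequences are hypothetical.

    ```python
    def reverse_complement(seq: str) -> str:
        """Return the reverse complement of a DNA sequence."""
        complement = {"A": "T", "C": "G", "G": "C", "T": "A"}
        return "".join(complement[base] for base in reversed(seq))

    def seed_match_sites(mirna: str, utr: str) -> list:
        """Find positions in a 3'UTR (DNA alphabet) matching the miRNA 7mer seed (bases 2-8).

        Toy example only: real predictors also score conservation, site accessibility
        and pairing energy.
        """
        seed = mirna.replace("U", "T")[1:8]          # positions 2-8 of the miRNA
        site = reverse_complement(seed)              # sequence a target site must contain
        return [i for i in range(len(utr) - len(site) + 1) if utr[i:i + len(site)] == site]

    # Example: a hypothetical 3'UTR fragment scanned against a miR-21-like sequence
    print(seed_match_sites("UAGCUUAUCAGACUGAUGUUGA", "GGGATAAGCTATTTATAAGCTA"))  # -> [3, 14]
    ```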

  18. Comprehensive decision tree models in bioinformatics.

    Directory of Open Access Journals (Sweden)

    Gregor Stiglic

    Full Text Available PURPOSE: Classification is an important and widely used machine learning technique in bioinformatics. Researchers and other end-users of machine learning software often prefer to work with comprehensible models where knowledge extraction and explanation of reasoning behind the classification model are possible. METHODS: This paper presents an extension to an existing machine learning environment and a study on visual tuning of decision tree classifiers. The motivation for this research comes from the need to build effective and easily interpretable decision tree models by a so-called one-button data mining approach where no parameter tuning is needed. To avoid bias in classification, no classification performance measure is used during the tuning of the model, which is constrained exclusively by the dimensions of the produced decision tree. RESULTS: The proposed visual tuning of decision trees was evaluated on 40 datasets containing classical machine learning problems and 31 datasets from the field of bioinformatics. Although we did not expect significant differences in classification performance, the results demonstrate a significant increase in accuracy in less complex visually tuned decision trees. In contrast to classical machine learning benchmarking datasets, we observe higher accuracy gains in bioinformatics datasets. Additionally, a user study was carried out to confirm the assumption that the tree tuning times are significantly lower for the proposed method in comparison to manual tuning of the decision tree. CONCLUSIONS: The empirical results demonstrate that by building simple models constrained by predefined visual boundaries, one not only achieves good comprehensibility, but also very good classification performance that does not differ from usually more complex models built using default settings of the classical decision tree algorithm. In addition, our study demonstrates the suitability of visually tuned decision trees for datasets
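
    A rough analogue of size-constrained tree building in code (using scikit-learn as a stand-in; the authors' work extends an existing visual machine learning environment rather than this library, and the size limits below are illustrative):

    ```python
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)

    # Default (unconstrained) tree versus a tree constrained purely by its dimensions,
    # mimicking the idea of tuning size without consulting a performance measure.
    default_tree = DecisionTreeClassifier(random_state=0)
    small_tree = DecisionTreeClassifier(max_depth=4, max_leaf_nodes=8, random_state=0)

    for name, model in [("default", default_tree), ("size-constrained", small_tree)]:
        score = cross_val_score(model, X, y, cv=10).mean()
        print(f"{name}: mean CV accuracy = {score:.3f}")
    ```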

  19. Comprehensive decision tree models in bioinformatics.

    Science.gov (United States)

    Stiglic, Gregor; Kocbek, Simon; Pernek, Igor; Kokol, Peter

    2012-01-01

    Classification is an important and widely used machine learning technique in bioinformatics. Researchers and other end-users of machine learning software often prefer to work with comprehensible models where knowledge extraction and explanation of reasoning behind the classification model are possible. This paper presents an extension to an existing machine learning environment and a study on visual tuning of decision tree classifiers. The motivation for this research comes from the need to build effective and easily interpretable decision tree models by a so-called one-button data mining approach where no parameter tuning is needed. To avoid bias in classification, no classification performance measure is used during the tuning of the model, which is constrained exclusively by the dimensions of the produced decision tree. The proposed visual tuning of decision trees was evaluated on 40 datasets containing classical machine learning problems and 31 datasets from the field of bioinformatics. Although we did not expect significant differences in classification performance, the results demonstrate a significant increase in accuracy in less complex visually tuned decision trees. In contrast to classical machine learning benchmarking datasets, we observe higher accuracy gains in bioinformatics datasets. Additionally, a user study was carried out to confirm the assumption that the tree tuning times are significantly lower for the proposed method in comparison to manual tuning of the decision tree. The empirical results demonstrate that by building simple models constrained by predefined visual boundaries, one not only achieves good comprehensibility, but also very good classification performance that does not differ from usually more complex models built using default settings of the classical decision tree algorithm. In addition, our study demonstrates the suitability of visually tuned decision trees for datasets with binary class attributes and a high number of possibly

  20. Penalized feature selection and classification in bioinformatics

    OpenAIRE

    Ma, Shuangge; Huang, Jian

    2008-01-01

    In bioinformatics studies, supervised classification with high-dimensional input variables is frequently encountered. Examples routinely arise in genomic, epigenetic and proteomic studies. Feature selection can be employed along with classifier construction to avoid over-fitting, to generate more reliable classifier and to provide more insights into the underlying causal relationships. In this article, we provide a review of several recently developed penalized feature selection and classific...
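
    As a hedged, generic illustration of penalized feature selection in the high-dimensional supervised setting the review addresses (an L1-penalised logistic regression on synthetic data; the review itself covers a broader family of penalties):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.preprocessing import StandardScaler

    # Synthetic "genomic-like" data: many features, few of them informative.
    rng = np.random.default_rng(0)
    n_samples, n_features = 100, 1000
    X = rng.standard_normal((n_samples, n_features))
    true_coef = np.zeros(n_features)
    true_coef[:10] = 2.0                                  # only 10 informative features
    y = (X @ true_coef + rng.standard_normal(n_samples) > 0).astype(int)

    # The L1 penalty drives most coefficients exactly to zero, performing feature
    # selection while fitting the classifier.
    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    model.fit(StandardScaler().fit_transform(X), y)

    selected = np.flatnonzero(model.coef_[0])
    print(f"{selected.size} of {n_features} features kept:", selected[:20])
    ```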

  1. The Key Principles of Process Manager Motivation in Production and Administration Processes in an Industrial Enterprise

    Directory of Open Access Journals (Sweden)

    Chromjakova Felicita

    2016-03-01

    Full Text Available The basic premise of sustainable development is that companies should completely re-evaluate their enterprise work logic and process organization. Most of the necessary changes concern employee stimulation and motivation. If we are truly interested in improving business results and the effectiveness of business processes (there would be no progress otherwise), we have to strive to break down the barriers between company management (leadership) and employees in order to establish effective relationships between firms and customers. This paper presents research results on process manager activities in modern industrial enterprises, together with a proposed methodology for systematically oriented motivation of employees by the process manager, in line with the increased competitiveness of production and administration processes. It also presents an effective methodology for increasing the positive effects of well-defined employee motivation from the process manager's perspective. The core benefit of this methodology lies in the design of a systematic approach to the motivation process on the process manager's side, allowing for radical performance improvement of production and administrative processes and increased competitiveness of enterprise processes.

  2. Improving the medical records department processes by lean management.

    Science.gov (United States)

    Ajami, Sima; Ketabi, Saeedeh; Sadeghian, Akram; Saghaeinnejad-Isfahani, Sakine

    2015-01-01

    Lean management is a process improvement technique used to identify waste actions and processes in order to eliminate them. The benefits of Lean for healthcare organizations are, first, that the quality of outcomes in terms of mistakes and errors improves and, second, that the time taken through the whole process improves significantly. The purpose of this paper is to improve the Medical Records Department (MRD) processes at Ayatolah-Kashani Hospital in Isfahan, Iran by utilizing Lean management. This research was an applied, interventional study. The data were collected by brainstorming, observation, interview, and workflow review. The study population included MRD staff and other expert staff within the hospital who were stakeholders and users of the MRD. The MRD staff were initially taught the concepts of Lean management and then formed into the MRD Lean team. The team then identified and reviewed the current processes; subsequently, it identified wastes and values and proposed solutions. The findings showed that the MRD units (Archive, Coding, Statistics, and Admission) had 17 current processes; 28 wastes and 11 values were identified. In addition, the team offered 27 suggestions for eliminating the wastes. The MRD is a critical department for the hospital information system and, therefore, the continuous improvement of its services and processes, through scientific methods such as Lean management, is essential. The study represents one of the few attempts to eliminate wastes in the MRD.

  3. Bioinformatics Training: A Review of Challenges, Actions and Support Requirements

    DEFF Research Database (Denmark)

    Schneider, M.V.; Watson, J.; Attwood, T.

    2010-01-01

    As bioinformatics becomes increasingly central to research in the molecular life sciences, the need to train non-bioinformaticians to make the most of bioinformatics resources is growing. Here, we review the key challenges and pitfalls to providing effective training for users of bioinformatics services, and discuss successful training strategies shared by a diverse set of bioinformatics trainers. We also identify steps that trainers in bioinformatics could take together to advance the state of the art in current training practices. The ideas presented in this article derive from the first

  4. Adapting bioinformatics curricula for big data.

    Science.gov (United States)

    Greene, Anna C; Giffin, Kristine A; Greene, Casey S; Moore, Jason H

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. © The Author 2015. Published by Oxford University Press.

  5. Application of Bioinformatics in Chronobiology Research

    Directory of Open Access Journals (Sweden)

    Robson da Silva Lopes

    2013-01-01

    Full Text Available Bioinformatics and other well-established sciences, such as molecular biology, genetics, and biochemistry, provide a scientific approach for the analysis of data generated through “omics” projects that may be used in studies of chronobiology. The results of studies that apply these techniques demonstrate how they significantly aided the understanding of chronobiology. However, bioinformatics tools alone cannot eliminate the need for an understanding of the field of research or the data to be considered, nor can such tools replace analysts and researchers. It is often necessary to conduct an evaluation of the results of a data mining effort to determine the degree of reliability. To this end, familiarity with the field of investigation is necessary. It is evident that the knowledge that has been accumulated through chronobiology and the use of tools derived from bioinformatics has contributed to the recognition and understanding of the patterns and biological rhythms found in living organisms. The current work aims to develop new and important applications in the near future through chronobiology research.

  6. Bringing Web 2.0 to bioinformatics.

    Science.gov (United States)

    Zhang, Zhang; Cheung, Kei-Hoi; Townsend, Jeffrey P

    2009-01-01

    Enabling deft data integration from numerous, voluminous and heterogeneous data sources is a major bioinformatic challenge. Several approaches have been proposed to address this challenge, including data warehousing and federated databasing. Yet despite the rise of these approaches, integration of data from multiple sources remains problematic and toilsome. These two approaches follow a user-to-computer communication model for data exchange, and do not facilitate a broader concept of data sharing or collaboration among users. In this report, we discuss the potential of Web 2.0 technologies to transcend this model and enhance bioinformatics research. We propose a Web 2.0-based Scientific Social Community (SSC) model for the implementation of these technologies. By establishing a social, collective and collaborative platform for data creation, sharing and integration, we promote a web services-based pipeline featuring web services for computer-to-computer data exchange as users add value. This pipeline aims to simplify data integration and creation, to realize automatic analysis, and to facilitate reuse and sharing of data. SSC can foster collaboration and harness collective intelligence to create and discover new knowledge. In addition to its research potential, we also describe its potential role as an e-learning platform in education. We discuss lessons from information technology, predict the next generation of Web (Web 3.0), and describe its potential impact on the future of bioinformatics studies.

  7. Adapting bioinformatics curricula for big data

    Science.gov (United States)

    Greene, Anna C.; Giffin, Kristine A.; Greene, Casey S.

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. PMID:25829469

  8. An Adaptive Hybrid Multiprocessor technique for bioinformatics sequence alignment

    KAUST Repository

    Bonny, Talal

    2012-07-28

    Sequence alignment algorithms such as the Smith-Waterman algorithm are among the most important applications in the development of bioinformatics. Sequence alignment algorithms must process large amounts of data which may take a long time. Here, we introduce our Adaptive Hybrid Multiprocessor technique to accelerate the implementation of the Smith-Waterman algorithm. Our technique utilizes both the graphics processing unit (GPU) and the central processing unit (CPU). It adapts to the implementation according to the number of CPUs given as input by efficiently distributing the workload between the processing units. Using existing resources (GPU and CPU) in an efficient way is a novel approach. The peak performance achieved for the platforms GPU + CPU, GPU + 2CPUs, and GPU + 3CPUs is 10.4 GCUPS, 13.7 GCUPS, and 18.6 GCUPS, respectively (with the query length of 511 amino acid). © 2010 IEEE.
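
    The GCUPS figures quoted above count giga cell updates per second, i.e. how many dynamic-programming matrix cells are filled per second. The sketch below shows the serial Smith-Waterman recurrence and how such a throughput number is derived; it is a slow pure-Python reference with an assumed simple scoring scheme, not the paper's GPU/CPU implementation.

    ```python
    import time

    def smith_waterman_score(query: str, target: str, match: int = 2,
                             mismatch: int = -1, gap: int = -2) -> int:
        """Reference Smith-Waterman local-alignment score (linear gap penalty).

        Illustrative only: the paper's technique distributes this dynamic programme
        across GPU and CPUs; this version fills the matrix serially, row by row.
        """
        rows, cols = len(query) + 1, len(target) + 1
        prev = [0] * cols          # previous row of the DP matrix
        best = 0
        for i in range(1, rows):
            curr = [0] * cols
            for j in range(1, cols):
                s = match if query[i - 1] == target[j - 1] else mismatch
                curr[j] = max(0, prev[j - 1] + s, prev[j] + gap, curr[j - 1] + gap)
                best = max(best, curr[j])
            prev = curr
        return best

    query, target = "HEAGAWGHEE" * 20, "PAWHEAE" * 50
    start = time.perf_counter()
    score = smith_waterman_score(query, target)
    elapsed = time.perf_counter() - start
    cells = len(query) * len(target)
    print(f"score={score}, GCUPS={cells / elapsed / 1e9:.6f}")  # cell updates per second / 1e9
    ```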

  9. Tissue damage in organic rainbow trout muscle investigated by proteomics and bioinformatics

    DEFF Research Database (Denmark)

    Wulff, Tune; Silva, T.; Nielsen, Michael Engelbrecht

    2013-01-01

    and magnitude of the cellular response, in the context of a regenerative process. Using a bioinformatics approach, the main biological functions of these proteins were assigned, showing the regulation of proteins involved in processes like apoptosis, iron homeostasis and regulation of muscular structure...

  10. Ramping up to the Biology Workbench: A Multi-Stage Approach to Bioinformatics Education

    Science.gov (United States)

    Greene, Kathleen; Donovan, Sam

    2005-01-01

    In the process of designing and field-testing bioinformatics curriculum materials, we have adopted a three-stage, progressive model that emphasizes collaborative scientific inquiry. The elements of the model include: (1) context setting, (2) introduction to concepts, processes, and tools, and (3) development of competent use of technologically…

  11. The clinical nurse specialist as resuscitation process manager.

    Science.gov (United States)

    Schneiderhahn, Mary Elizabeth; Fish, Anne Folta

    2014-01-01

    The purpose of this article was to describe the history and leadership dimensions of the role of resuscitation process manager and provide specific examples of how this role is implemented at a Midwest medical center. In 1992, a medical center in the Midwest needed a nurse to manage resuscitation care. This role designation meant that this nurse became central to all quality improvement efforts in resuscitation care. The role expanded as clinical resuscitation guidelines were updated and as the medical center grew. The role became known as the critical care clinical nurse specialist as resuscitation process manager. This clinical care nurse specialist was called a manager, but she had no direct line authority, so she accomplished her objectives by forming a multitude of collaborative networks. Based on a framework by Finkelman, the manager role incorporated specific leadership abilities in quality improvement: (1) coordination of medical center-wide resuscitation, (2) use of interprofessional teams, (3) integration of evidence into practice, and (4) staff coaching to develop leadership. The manager coordinates resuscitation care with the goals of prevention of arrests if possible, efficient and effective implementation of resuscitation protocols, high quality of patient and family support during and after the resuscitation event, and creation or revision of resuscitation policies for in-hospital and for ambulatory care areas. The manager designs a comprehensive set of meaningful and measurable process and outcome indicators with input from interprofessional teams. The manager engages staff in learning, reflecting on care given, and using the evidence base for resuscitation care. Finally, the manager role is a balance between leading quality improvement efforts and coaching staff to implement and sustain these quality improvement initiatives. Revisions to clinical guidelines for resuscitation care since the 1990s have resulted in medical centers developing improved

  12. Management by process based systems and safety focus; Verksamhetsstyrning med process-baserade ledningssystem och saekerhetsfokus

    Energy Technology Data Exchange (ETDEWEB)

    Rydnert, Bo; Groenlund, Bjoern [SIS Forum AB, Stockholm (Sweden)

    2005-12-15

    An initiative from The Swedish Nuclear Power Inspectorate led to this study carried out in the late autumn of 2005. The objective was to understand in more detail how an increasing use of process management affects organisations, on the one hand regarding risks and security, on the other hand regarding management by objectives and other management and operative effects. The main method was interviewing representatives of companies and independent experts. More than 20 interviews were carried out. In addition, a literature study was made. All participating companies are using management systems based on processes. However, the methods chosen, and the results achieved, vary extensively. Thus, there are surprisingly few examples of complete and effective management by processes. Yet there is no doubt that management by processes is effective and efficient. Overall goals are reached, business results are achieved in more reliable ways and customers are more satisfied. The weaknesses found can be translated into a few comprehensive recommendations. A clear, structured and acknowledged model should be used and the processes should be described unambiguously. The changed management roles should be described very clearly and adhered to. New types of process objectives need to be formulated. In addition, one fact needs to be observed and effectively counteracted: changes are often met by mental opposition at the management level, as well as among co-workers. This fact needs attention and leadership. Safety development is closely related to the design and operation of a business management system and its continual improvement. A deep understanding of what constitutes an efficient and effective management system affects the understanding of safety, safety culture and the ability to achieve safety goals. Concerning risk, the opinions were unambiguous. Management by processes as such does not result in any further risks. On the contrary. Processes give a clear view of production and

  13. Video Bioinformatics Analysis of Human Embryonic Stem Cell Colony Growth

    Science.gov (United States)

    Lin, Sabrina; Fonteno, Shawn; Satish, Shruthi; Bhanu, Bir; Talbot, Prue

    2010-01-01

    Because video data are complex and are composed of many images, mining information from video material is difficult to do without the aid of computer software. Video bioinformatics is a powerful quantitative approach for extracting spatio-temporal data from video images using computer software to perform data mining and analysis. In this article, we introduce a video bioinformatics method for quantifying the growth of human embryonic stem cells (hESC) by analyzing time-lapse videos collected in a Nikon BioStation CT incubator equipped with a camera for video imaging. In our experiments, hESC colonies that were attached to Matrigel were filmed for 48 hours in the BioStation CT. To determine the rate of growth of these colonies, recipes were developed using CL-Quant software which enables users to extract various types of data from video images. To accurately evaluate colony growth, three recipes were created. The first segmented the image into the colony and background, the second enhanced the image to define colonies throughout the video sequence accurately, and the third measured the number of pixels in the colony over time. The three recipes were run in sequence on video data collected in a BioStation CT to analyze the rate of growth of individual hESC colonies over 48 hours. To verify the truthfulness of the CL-Quant recipes, the same data were analyzed manually using Adobe Photoshop software. When the data obtained using the CL-Quant recipes and Photoshop were compared, results were virtually identical, indicating the CL-Quant recipes were truthful. The method described here could be applied to any video data to measure growth rates of hESC or other cells that grow in colonies. In addition, other video bioinformatics recipes can be developed in the future for other cell processes such as migration, apoptosis, and cell adhesion. PMID:20495527
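
    Conceptually, the three recipes segment each frame, clean up the segmentation and then count colony pixels per frame. A minimal stand-in for that measurement step is sketched below (hypothetical code, not the CL-Quant recipes, assuming the video has already been loaded as a grayscale frame stack):

    ```python
    import numpy as np

    def colony_area_over_time(frames: np.ndarray, threshold: float) -> np.ndarray:
        """Count colony pixels per frame from a (time, height, width) grayscale stack.

        Stand-in for the segmentation/enhancement/measurement recipes described above:
        pixels brighter than a fixed threshold are treated as colony.
        """
        return (frames > threshold).sum(axis=(1, 2))

    # Synthetic example: a bright circular "colony" whose radius grows each frame.
    t, h, w = 10, 128, 128
    yy, xx = np.mgrid[:h, :w]
    frames = np.stack([
        ((yy - h / 2) ** 2 + (xx - w / 2) ** 2 < (5 + 3 * i) ** 2).astype(float)
        for i in range(t)
    ])
    areas = colony_area_over_time(frames, threshold=0.5)
    print(areas)   # monotonically increasing pixel counts, i.e. the growth curve
    ```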

  14. A review of bioinformatic methods for forensic DNA analyses.

    Science.gov (United States)

    Liu, Yao-Yuan; Harbison, SallyAnn

    2018-03-01

    Short tandem repeats, single nucleotide polymorphisms, and whole mitochondrial analyses are three classes of markers which will play an important role in the future of forensic DNA typing. The arrival of massively parallel sequencing platforms in forensic science reveals new information such as insights into the complexity and variability of the markers that were previously unseen, along with amounts of data too immense for analyses by manual means. Along with the sequencing chemistries employed, bioinformatic methods are required to process and interpret this new and extensive data. As more is learnt about the use of these new technologies for forensic applications, development and standardization of efficient, favourable tools for each stage of data processing is being carried out, and faster, more accurate methods that improve on the original approaches have been developed. As forensic laboratories search for the optimal pipeline of tools, sequencer manufacturers have incorporated pipelines into sequencer software to make analyses convenient. This review explores the current state of bioinformatic methods and tools used for the analyses of forensic markers sequenced on the massively parallel sequencing (MPS) platforms currently most widely used. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Successful process management for public utilities; Erfolgreiches Prozessmanagement fuer Stadtwerke

    Energy Technology Data Exchange (ETDEWEB)

    Knipprath, Daniel [projekt:unternehmensberatungsgesellschaft mbH, Muenchen (Germany); Schaefer, Anke [Dr. Schaefer PR- und Strategieberatung, Rostock (Germany)

    2011-06-15

    As a result of regulatory cuts in their revenue structure, public utilities are increasingly compelled to improve their cost efficiency. Furthermore, they have to deal with altered framework conditions of energy procurement as well as the necessity of sustainable customer loyalty management. The example of a regional supplier is used here to show how goal-oriented process management can contribute to securing a sustainable, promising position in the market.

  16. Spiral model of procedural cycle of educational process management

    Directory of Open Access Journals (Sweden)

    Bezrukov Valery I.

    2016-01-01

    Full Text Available The article analyzes the nature and characteristics of the spiral model of the procedural cycle of educational systems management. The authors identify patterns between the development of information and communication technologies and the transformation of the education management process, and characterize the concepts of "information literacy" and "media education". They consider the design function and determine its potential in changing the traditional educational paradigm to the new, information-based one.

  17. Advanced information processing system: Input/output network management software

    Science.gov (United States)

    Nagle, Gail; Alger, Linda; Kemp, Alexander

    1988-01-01

    The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.

  18. Diagnosing resistance to change in the change management process

    OpenAIRE

    Tetiana Kuzhda

    2016-01-01

    This article explains the change management process and resistance to organizational change through examining causes of resistance to change, diagnosing them, and finding the ways to deal with resistance to change. In business environment, the one thing any company can be assured of is change. If an organization experiences change it may also need to implement new business strategies, which can create resistance among employees. Managers need to know in which phase they have to expect unusual...

  19. Tasks for the future process and works management

    International Nuclear Information System (INIS)

    Kinn, T.

    1994-01-01

    The actions which have been taken according to the company's targets formulated so far have led to complex large-scale technical plants. Process and works management becomes more and more difficult due to increasing costs and the strict margins set by environmental technology. An integrated approach to the development of process and works control systems can help considerably to solve these problems, owing to the dependency on information and the partly similar tasks. Before the integrated approach is adopted, the structure and nature of the tasks at all company levels must be analysed and put into concrete terms. The resulting demand for data processing must be adjusted within the frame of a data processing development plan and realised step by step. In this work the structures, the future tasks and the information demand of process and works management are described. (orig.) [de

  20. Certification of quality management of TECNATOM according to ISO 9001: 2000. The management of processes

    International Nuclear Information System (INIS)

    Alava, R.; Corbi, M.

    2003-01-01

    One of the main new features of the ISO 9000:2000 Quality Management standards series is the incorporation of the Process Approach and the System Approach to Management. However, implementing these concepts is also one of the main difficulties, since in addition to the technical complexity of implementing this management strategy it is necessary to achieve a change in the attitude of personnel. The article describes the process followed in Tecnatom for certification according to the ISO 9001:2000 standard and the experience gained in implementing the Process Approach. (Author)

  1. Temporal and Location Based RFID Event Data Management and Processing

    Science.gov (United States)

    Wang, Fusheng; Liu, Peiya

    Advance of sensor and RFID technology provides significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight, thus it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management. RFID data are temporal and history oriented, multi-dimensional, and carrying implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location oriented data model for modeling and querying RFID data, and a declarative event and rule based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, thus significantly reduces the cost of RFID data integration.
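
    As a hedged illustration of what temporal, location-oriented RFID records of this kind might look like, the sketch below (field names and the query helper are invented for the example, not the authors' schema) stores each observation as a stay interval and answers a simple "where was this object at time t" query.

        from dataclasses import dataclass
        from datetime import datetime
        from typing import List, Optional

        @dataclass
        class StayRecord:
            """One temporal RFID observation: an object seen at a location over a time interval."""
            object_id: str
            location: str
            start: datetime
            end: Optional[datetime]  # None means the object is still at this location

        def location_at(records: List[StayRecord], object_id: str, t: datetime) -> Optional[str]:
            """Return the location of the object at time t, if any stay interval covers t."""
            for r in records:
                if r.object_id == object_id and r.start <= t and (r.end is None or t <= r.end):
                    return r.location
            return None

        # Example: one pallet moving from a dock to a warehouse
        records = [
            StayRecord("pallet-42", "dock-A", datetime(2024, 1, 1, 8, 0), datetime(2024, 1, 1, 9, 30)),
            StayRecord("pallet-42", "warehouse-3", datetime(2024, 1, 1, 9, 45), None),
        ]
        print(location_at(records, "pallet-42", datetime(2024, 1, 1, 9, 0)))   # dock-A
        print(location_at(records, "pallet-42", datetime(2024, 1, 1, 12, 0)))  # warehouse-3

    Rule-based complex event processing, as described in the record, would then be layered on top of such interval data, for example raising an event when an object leaves one location without appearing at the next within a time window.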

  2. Ecosystem management via interacting models of political and ecological processes

    Directory of Open Access Journals (Sweden)

    Haas, T. C.

    2004-01-01

    Full Text Available The decision to implement environmental protection options is a political one. Political realities may cause a country to not heed the most persuasive scientific analysis of an ecosystem's future health. A predictive understanding of the political processes that result in ecosystem management decisions may help guide ecosystem management policymaking. To this end, this article develops a stochastic, temporal model of how political processes influence and are influenced by ecosystem processes. This model is realized in a system of interacting influence diagrams that model the decision making of a country's political bodies. These decisions interact with a model of the ecosystem enclosed by the country. As an example, a model for Cheetah (Acinonyx jubatus) management in Kenya is constructed and fitted to decision and ecological data.

  3. The integrity management cycle as a business process

    Energy Technology Data Exchange (ETDEWEB)

    Ackhurst, Trent B.; Peverelli, Romina P. [PIMS - Pipeline Integrity Management Specialists of London Ltd. (United Kingdom).

    2009-07-01

    It is a best-practice Oil and Gas pipeline integrity and reliability technique to apply integrity management cycles. This conforms to the business principles of continuous improvement. This paper examines the integrity management cycle - both its goals and objectives and the subsequent component steps - from a business perspective. Traits that businesses require to glean maximum benefit from such a cycle are highlighted. A case study focuses upon an integrity and reliability process developed to apply to pipeline operators' installations. This is compared and contrasted to the pipeline integrity management cycle to underline both cycles' consistency with the principles of continuous improvement. (author)

  4. Modeling of processes of an adaptive business management

    Directory of Open Access Journals (Sweden)

    Karev Dmitry Vladimirovich

    2011-04-01

    Full Text Available Based on an analysis of adaptive business management systems, the author proposes an original version of a real adaptive management system, built on a dynamic recursive model of cash flow forecasts and real data. Definitions and the simulation of scales and intervals of model time in the control system are proposed, as well as the thresholds of observations and the conditions for changing (correcting) administrative decisions. The process of adaptive management is illustrated using a business development scenario proposed by the author.

  5. Continuous improvement processes using Lean Management tools. A case study

    Directory of Open Access Journals (Sweden)

    Pârv Luminița

    2017-01-01

    Full Text Available The paper describes how Lean Management may be applied in the university setting to improve the management processes. The correlation of didactic, educational and research activities with the stakeholders' needs is one of the main objectives of the university. In this respect, an indicator used to analyse a university, for the purposes of fulfilling its mission and of streamlining its didactic and scientific activity, is related to the number of graduates on the labour market acting in their area of specialization. This work presents a best practice of Lean Management at Transilvania University of Brasov, Romania.

  6. Impact of HMO ownership on management processes and utilization outcomes.

    Science.gov (United States)

    Ahern, M; Molinari, C

    2001-05-01

    To examine the effects of health maintenance organization (HMO) ownership characteristics on selected utilization outcomes and management processes affecting utilization. We used 1995 HMO data from the American Association of Health Plans. Using regression analysis, we examined the relation between HMO utilization (hospital discharges, days, and average length of stay; cardiac catheterization procedures; and average cost of outpatient prescriptions) and the structural characteristics of HMOs: ownership type (insurance company, hospital, physician, independent, and national managed care company), HMO size, for-profit status, model type, geographic region, and payer mix. HMO ownership type is significantly associated with medical management processes, including risk sharing by providers, risk sharing by consumers, and other management strategies. Relative to hospital-owned HMOs, insurance company-owned HMOs have fewer hospital discharges, fewer hospital days, and longer lengths of stay. National managed care organization-owned HMOs have fewer cardiac catheterizations and lower average outpatient prescription costs. Independently owned HMOs have more cardiac catheterizations. For-profit HMOs have lower prescription costs. Relative to hospital-owned HMOs, insurance company-owned HMOs are more likely to use hospital risk sharing and provider capitation and less likely to use out-of-pocket payments for hospital use and a closed formulary. National managed care organization-owned HMOs are less likely to use provider capitation, out-of-pocket payments for hospital use, catastrophic case management, and hospital risk sharing. Physician-hospital-owned HMOs are less likely to use catastrophic case management. For-profit HMOs are more likely to use hospital risk sharing and catastrophic case management. HMO ownership type affects utilization outcomes and management strategies.

  7. Information Technology Process Improvement Decision-Making: An Exploratory Study from the Perspective of Process Owners and Process Managers

    Science.gov (United States)

    Lamp, Sandra A.

    2012-01-01

    There is information available in the literature that discusses information technology (IT) governance and investment decision making from an executive-level perspective, yet there is little information available that offers the perspective of process owners and process managers pertaining to their role in IT process improvement and investment…

  8. A project management focused framework for assuring quality work processes

    Energy Technology Data Exchange (ETDEWEB)

    Gamsby, S.O.; Mize, J.D. [Allied Signal, Inc., Albuquerque, NM (United States). Federal Mfg. and Technologies; Reid, R.A. [New Mexico Univ., Albuquerque, NM (United States)

    1996-10-01

    Federal Manufacturing & Technologies/New Mexico (FM&T/NM) of AlliedSignal is an organization of approximately 300 associates providing operations support, engineering, and other technical services for DOE, New Mexico's National Laboratories, etc. Work performed is primarily project-oriented and ranges from executing a major long-term contract for retrofitting and maintaining a large fleet of escort vehicles to creating a single, small, prototype electronic device for measuring radiation in a unique environment. FM&T/NM is functionally organized and operates in a classic matrix format with functional departments providing personnel with technical expertise, necessary physical resources, and administrative support to several project-based groups. Like most matrix-based organizations that provide support to diverse customers, FM&T/NM has encountered problems that occur when a group of project managers is expected to work together in using and scheduling a shared set of limited resources for the good of the organization as a whole. The framework for managing projects that we present focuses on developing, understanding, and managing the relationships between the functional organization structure, the system of work processes, and the management of projects. FM&T/NM retains its functional structure which primarily assigns personnel to work processes. The evolving role of the process leader focuses primarily on designing, managing, and improving the process, and the interactions among the subprocesses. The project manager is responsible for (1) translating customer requirements into product specifications, (2) determining the sequence of activities needed to meet project goals, (3) scheduling the required work processes, (4) monitoring project progress, (5) providing liaison between the customer and process leaders, and (6) having the desired product and/or service delivered to a satisfied customer in a timely manner.

  9. Managing environmental knowledge through learning processes in Spanish hospitality companies.

    Science.gov (United States)

    Cegarra-Navarro, Juan Gabriel; Martinez Martinez, Aurora

    2010-11-01

    The major focus of this research is to investigate whether environmental knowledge has any impact on organizational outcomes through an empirical investigation of 127 Spanish hospitality companies, using structural equation models. Our results show that environmental knowledge is an important determinant of organizational outcomes. However, this relationship is completed with just two related constructs: Firstly, the company's acquisition process plays a key role in managing the tension between the knowledge necessary to develop the appropriate environmental initiatives and current knowledge. Secondly, the company's distribution process also sheds light on tangible means for managers to enhance their company's outcomes through environmental knowledge.

  10. Blockchains for Business Process Management - Challenges and Opportunities

    DEFF Research Database (Denmark)

    Mendling, Jan; Weber, Ingo; Van Der Aalst, Wil

    2018-01-01

    Blockchain technology offers a sizable promise to rethink the way inter-organizational business processes are managed because of its potential to realize execution without a central party serving as a single point of trust (and failure). To stimulate research on this promise and the limits thereof, in this paper we outline the challenges and opportunities of blockchain for Business Process Management (BPM). We first reflect how blockchains could be used in the context of the established BPM lifecycle and second how they might become relevant beyond. We conclude our discourse with a summary of seven research directions for investigating the application of blockchain technology in the context of BPM.

  11. Improvement of hospital processes through business process management in Qaem Teaching Hospital: A work in progress.

    Science.gov (United States)

    Yarmohammadian, Mohammad H; Ebrahimipour, Hossein; Doosty, Farzaneh

    2014-01-01

    In a world of continuously changing business environments, organizations have no option but to deal with a high level of transformation in order to adjust to the consequential demands. Therefore, many companies need to continually improve and review their processes to maintain their competitive advantages in an uncertain environment. Meeting these challenges requires implementing the most efficient possible business processes, geared to the needs of the industry and market segments that the organization serves globally. In the last 10 years, total quality management, business process reengineering, and business process management (BPM) have been some of the management tools applied by organizations to increase business competitiveness. This paper is an original article that presents the implementation of the "BPM" approach in the healthcare domain, which allows an organization to improve and review its critical business processes. This project was performed in "Qaem Teaching Hospital" in Mashhad city, Iran and consists of four distinct steps; (1) identify business processes, (2) document the process, (3) analyze and measure the process, and (4) improve the process. Implementing BPM in Qaem Teaching Hospital changed the nature of management by allowing the organization to avoid the complexity of disparate, siloed systems. BPM instead enabled the organization to focus on business processes at a higher level.

  12. Research on process management of nuclear power technological innovation

    International Nuclear Information System (INIS)

    Yang Hua; Zhou Yu

    2005-01-01

    Unlike other technological innovation processes, the technological innovation process of a nuclear power engineering project is deeply influenced by a wide range of environmental factors, so it must make an effort to reduce environmental uncertainty. This paper describes the mechanism connecting the technological innovation process of a nuclear power engineering project with environmental factors, and presents a feasible method, based on a bargaining model, to integrate the management of the technological innovation process of a nuclear power engineering project with environmental factors. This method has practical value for guiding the technological innovation of nuclear power engineering projects. (authors)

  13. The application of mean control chart in managing industrial processes

    Directory of Open Access Journals (Sweden)

    Papić-Blagojević Nataša

    2013-01-01

    Full Text Available Along with the advent of mass production comes the problem of monitoring and maintaining the quality of the product, which stressed the need for the application of selected statistical and mathematical methods in the control process. The main objective of applying the methods of statistical control is continuous quality improvement through permanent monitoring of the process in order to discover the causes of errors. Shewhart charts are the most popular method of statistical process control, which performs separation of controlled and uncontrolled variations along with detection of increased variations. This paper presents an example of a Shewhart mean control chart with an application in managing an industrial process.
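
    As a hedged sketch of the mean (X-bar) chart discussed here, the code below computes the centre line and control limits from rational subgroups using the standard range-based factors (the A2 values are the usual tabulated constants; the data and the injected shift are invented for the example) and flags subgroups whose mean falls outside the limits.

        import numpy as np

        # Standard A2 factors for X-bar charts built from the average range (subgroup sizes 2-6)
        A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577, 6: 0.483}

        def xbar_chart(subgroups):
            """Compute X-bar chart limits and flag out-of-control subgroup means.

            subgroups: 2-D array-like, one row per rational subgroup of equal size n (2 <= n <= 6).
            Returns (centre_line, lcl, ucl, indices of out-of-control subgroups).
            """
            data = np.asarray(subgroups, dtype=float)
            n = data.shape[1]
            means = data.mean(axis=1)                             # subgroup means
            rbar = (data.max(axis=1) - data.min(axis=1)).mean()   # average subgroup range
            cl = means.mean()
            ucl = cl + A2[n] * rbar
            lcl = cl - A2[n] * rbar
            out = [i for i, m in enumerate(means) if m < lcl or m > ucl]
            return cl, lcl, ucl, out

        # Example: 20 subgroups of 5 measurements from a nominally stable process, with one shifted subgroup
        rng = np.random.default_rng(0)
        samples = rng.normal(loc=10.0, scale=0.2, size=(20, 5))
        samples[12] += 0.8   # inject a mean shift to demonstrate detection
        print(xbar_chart(samples))

    Subgroup means outside the limits signal the uncontrolled variation that the abstract refers to and prompt a search for assignable causes.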

  14. Internal control in the management system of meat processing enterprises

    Directory of Open Access Journals (Sweden)

    Volodymyr Kushnir

    2018-03-01

    Full Text Available The article describes the theoretical basis of internal control and its practical aspects in the work of meat processing enterprises (a case from the meat processing industry in Ukraine). The purpose of the research is to establish the theoretical foundations of internal control and its improvement in the activity of meat processing plants of various forms of management. It is proposed to use precisely the term internal control among the other names for domestic control. The definition of internal control, its subject and its purpose are improved. The subjects and objects of internal control are determined; the principles of its implementation are supplemented. Specific control tasks in meat processing plants according to the needs of this industry are outlined. Specific examples of control subjects are presented and the role of the revision commission is emphasized. The state of internal control in meat processing plants in Ukraine is investigated and it is established that it is in poor condition, with an unsystematic approach to its implementation by managers of meat processing enterprises. To improve the situation we recommend that each meat processing enterprise have on its staff a revision commission or an auditor. It is established that internal control is more effective in joint-stock companies than in limited liability companies. The necessity of internal control as an important element in the enterprise management system is accented.

  15. The Bioinformatics of Integrative Medical Insights: Proposals for an International PsychoSocial and Cultural Bioinformatics Project

    Directory of Open Access Journals (Sweden)

    Ernest Rossi

    2006-01-01

    Full Text Available We propose the formation of an International PsychoSocial and Cultural Bioinformatics Project (IPCBP) to explore the research foundations of Integrative Medical Insights (IMI) on all levels from the molecular-genomic to the psychological, cultural, social, and spiritual. Just as The Human Genome Project identified the molecular foundations of modern medicine with the new technology of sequencing DNA during the past decade, the IPCBP would extend and integrate this neuroscience knowledge base with the technology of gene expression via DNA/proteomic microarray research and brain imaging in development, stress, healing, rehabilitation, and the psychotherapeutic facilitation of existential wellness. We anticipate that the IPCBP will require a unique international collaboration of academic institutions, researchers, and clinical practitioners for the creation of a new neuroscience of mind-body communication, brain plasticity, memory, learning, and creative processing during optimal experiential states of art, beauty, and truth. We illustrate this emerging integration of bioinformatics with medicine with a videotape of the classical 4-stage creative process in a neuroscience approach to psychotherapy.

  17. The Model of the Production Process for the Quality Management

    Directory of Open Access Journals (Sweden)

    Alot Zbigniew

    2017-02-01

    Full Text Available This article is a result of research on models of production processes for quality management and their identification. It discusses the classical model and the indicators for evaluating process capability, taking as its starting point the assumption of a normal distribution of the process characteristics. The division of process types proposed by the ISO 21747:2006 standard, which introduces models for non-stationary processes, is presented. A general process model that allows, in any real case, a precise description of the statistical characteristics of the process is proposed. It gives the opportunity for a more detailed description, in comparison to the model proposed by the ISO 21747:2006 standard, of the process characteristics and for determining its capability. This model contains the type of process, its statistical distribution, and the method for determining the capability and performance (long-term capability) of the process. As one of the model elements, an own classification and a resulting set of process types are proposed. The classification follows the recommendations of ISO 21747:2006 introducing models for non-stationary processes. However, the set of process types allows, beyond a more precise description of the process characteristics, its usage to monitor the process.
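
    For the classical normal-distribution case that the model takes as its starting point, the capability indices can be computed directly from the specification limits and the estimated process mean and standard deviation. The sketch below uses the textbook formulae Cp = (USL - LSL) / 6σ and Cpk = min((USL - μ) / 3σ, (μ - LSL) / 3σ); the specification limits and data are invented for the example.

        import numpy as np

        def capability_indices(data, lsl, usl):
            """Return (Cp, Cpk) for a process assumed stable and normally distributed."""
            mu = np.mean(data)
            sigma = np.std(data, ddof=1)          # sample standard deviation
            cp = (usl - lsl) / (6 * sigma)        # potential capability, ignores centring
            cpk = min((usl - mu) / (3 * sigma),   # actual capability, penalises off-centre processes
                      (mu - lsl) / (3 * sigma))
            return cp, cpk

        # Example: 200 measurements against hypothetical specification limits 9.4 .. 10.6
        rng = np.random.default_rng(1)
        measurements = rng.normal(loc=10.05, scale=0.15, size=200)
        print(capability_indices(measurements, lsl=9.4, usl=10.6))

    For the non-stationary process types covered by ISO 21747:2006, the abstract's distinction between capability and performance (long-term capability) corresponds to computing the same ratios with a standard deviation that reflects long-term rather than short-term variation.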

  18. MANAGING THE BUILDING DESIGN PROCESS FOR SUSTAINABILTY AND IMPROVED QUALITY

    Directory of Open Access Journals (Sweden)

    Sunday Bobadoye

    2006-01-01

    Full Text Available The essence of the building design process and its management for building sustainability in the creation and maintenance of a qualitative architectural product is investigated in this paper. The design process, the concept of building sustainability and particularly the quality of the built environment are discussed. Akure, a state capital in Nigeria, was used as a case study. The principles and indicators for sustainability of buildings and their implications on the quality of the environment are examined in detail. Survey findings include the views of the professionals on the clients, perception of the design process as well as management of projects, and the implications on the quality of the ensuing products and the city's environment. The data were factor analyzed using the varimax rotation criterion (with Kaiser Normalization). The results revealed that five factors were effective, with one of them exhibiting the greatest variability and individual differences. The variables that loaded on this factor were really the aspects of the process and management relating to the clients. The findings also revealed the professionals' wrong attitude towards the design process, as shown by a very high degree of variability in the study. The paper concludes by recommending the enactment and enforcement of relevant policies with adequate education of the people and the involvement of all the stakeholders in the management of building projects and environmental programmes for the realization of a qualitative architectural product.

  19. A proposal of business processes management (BPM structure and use

    Directory of Open Access Journals (Sweden)

    Rafael Araujo Kluska

    2015-09-01

    Full Text Available Organizational processes, also known as business processes, have become a fundamental structure for the management of modern organizations. Knowing the work flow of the organization is a necessary condition for the development of continuous improvement processes. The benefits and advantages provided by the use of an approach based on BPM (Business Process Management) are evident. The benefits include improvements in efficiency, quality and flexibility, besides other aspects generating sustainable competitive advantages. There is a wide range of studies on BPM, which display several definitions and elements characterizing the various applications. This work aims to propose a conceptual framework for the interconnection between BPM elements, thus providing a better understanding of organizational processes and performance in the organizational management environment. As a result, a group of BPM elements is identified and classified into: methodologies, techniques and tools which are a part of, or can be efficiently connected to, the BPM conceptual structure. A framework for the conceptual interconnection between those elements is also provided. The results of BPM application are not limited to the search for operational efficiency, but might also be considered as an element to support organizational management.

  20. H3ABioNet, a sustainable pan-African bioinformatics network for human heredity and health in Africa

    Science.gov (United States)

    Mulder, Nicola J.; Adebiyi, Ezekiel; Alami, Raouf; Benkahla, Alia; Brandful, James; Doumbia, Seydou; Everett, Dean; Fadlelmola, Faisal M.; Gaboun, Fatima; Gaseitsiwe, Simani; Ghazal, Hassan; Hazelhurst, Scott; Hide, Winston; Ibrahimi, Azeddine; Jaufeerally Fakim, Yasmina; Jongeneel, C. Victor; Joubert, Fourie; Kassim, Samar; Kayondo, Jonathan; Kumuthini, Judit; Lyantagaye, Sylvester; Makani, Julie; Mansour Alzohairy, Ahmed; Masiga, Daniel; Moussa, Ahmed; Nash, Oyekanmi; Ouwe Missi Oukem-Boyer, Odile; Owusu-Dabo, Ellis; Panji, Sumir; Patterton, Hugh; Radouani, Fouzia; Sadki, Khalid; Seghrouchni, Fouad; Tastan Bishop, Özlem; Tiffin, Nicki; Ulenga, Nzovu

    2016-01-01

    The application of genomics technologies to medicine and biomedical research is increasing in popularity, made possible by new high-throughput genotyping and sequencing technologies and improved data analysis capabilities. Some of the greatest genetic diversity among humans, animals, plants, and microbiota occurs in Africa, yet genomic research outputs from the continent are limited. The Human Heredity and Health in Africa (H3Africa) initiative was established to drive the development of genomic research for human health in Africa, and through recognition of the critical role of bioinformatics in this process, spurred the establishment of H3ABioNet, a pan-African bioinformatics network for H3Africa. The limitations in bioinformatics capacity on the continent have been a major contributory factor to the lack of notable outputs in high-throughput biology research. Although pockets of high-quality bioinformatics teams have existed previously, the majority of research institutions lack experienced faculty who can train and supervise bioinformatics students. H3ABioNet aims to address this dire need, specifically in the area of human genetics and genomics, but knock-on effects are ensuring this extends to other areas of bioinformatics. Here, we describe the emergence of genomics research and the development of bioinformatics in Africa through H3ABioNet. PMID:26627985

  1. Lakes and ponds recreation management: a state-wide application of the visitor impact management process

    Science.gov (United States)

    Jerry J. Vaske; Rodney R. Zwick; Maureen P. Donnelly

    1992-01-01

    The Visitor Impact Management (VIM) process is designed to identify unacceptable changes occurring as a result of visitor use and to develop management strategies to keep visitor impacts within acceptable levels. All previous attempts to apply the VIM planning framework have concentrated on specific resources. This paper expands this focus to an entire state. Based on...

  2. Transition management as a model for managing processes of co-evolution towards sustainable development

    NARCIS (Netherlands)

    R. Kemp (René); D.A. Loorbach (Derk); J. Rotmans (Jan)

    2007-01-01

    Sustainable development requires changes in socio-technical systems and wider societal change - in beliefs, values and governance that co-evolve with technology changes. In this article we present a practical model for managing processes of co-evolution: transition management. Transition

  3. Implementing SCRUM using Business Process Management and Pattern Analysis Methodologies

    Directory of Open Access Journals (Sweden)

    Ron S. Kenett

    2013-11-01

    Full Text Available The National Institute of Standards and Technology in the US has estimated that software defects and problems annually cost the U.S. economy $59.5 billion (http://www.abeacha.com/NIST_press_release_bugs_cost.htm). The study is only one of many that demonstrate the need for significant improvements in software development processes and practices. US Federal agencies, which depend on IT to support their missions and spent at least $76 billion on IT in fiscal year 2011, experienced numerous examples of lengthy IT projects that incurred cost overruns and schedule delays while contributing little to mission-related outcomes (www.gao.gov/products/GAO-12-681). To reduce the risk of such problems, the US Office of Management and Budget recommended deploying agile software delivery, which calls for producing software in small, short increments (GAO, 2012). Consistent with this recommendation, this paper is about the application of Business Process Management to the improvement of software and system development through SCRUM or agile techniques. It focuses on how organizational behavior and process management techniques can be integrated with knowledge management approaches to deploy agile development. The context of this work is a global company developing software solutions for service operators such as cellular phone operators. For a related paper with a comprehensive overview of agile methods in project management see Stare (2013). Through this comprehensive case study we demonstrate how such an integration can be achieved. SCRUM is a paradigm shift in many organizations in that it results in a new balance between focus on results and focus on processes. In order to describe this new paradigm of business processes this work refers to Enterprise Knowledge Development (EKD), a comprehensive approach to map and document organizational patterns. In that context, the paper emphasizes the concept of patterns, reviews the main elements of SCRUM and shows how

  4. An overview of topic modeling and its current applications in bioinformatics.

    Science.gov (United States)

    Liu, Lin; Tang, Lin; Dong, Wen; Yao, Shaowen; Zhou, Wei

    2016-01-01

    With the rapid accumulation of biological datasets, machine learning methods designed to automate data analysis are urgently needed. In recent years, so-called topic models that originated from the field of natural language processing have been receiving much attention in bioinformatics because of their interpretability. Our aim was to review the application and development of topic models for bioinformatics. This paper starts with the description of a topic model, with a focus on the understanding of topic modeling. A general outline is provided on how to build an application in a topic model and how to develop a topic model. Meanwhile, the literature on application of topic models to biological data was searched and analyzed in depth. According to the types of models and the analogy between the concept of document-topic-word and a biological object (as well as the tasks of a topic model), we categorized the related studies and provided an outlook on the use of topic models for the development of bioinformatics applications. Topic modeling is a useful method (in contrast to the traditional means of data reduction in bioinformatics) and enhances researchers' ability to interpret biological information. Nevertheless, due to the lack of topic models optimized for specific biological data, the studies on topic modeling in biological data still have a long and challenging road ahead. We believe that topic models are a promising method for various applications in bioinformatics research.
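
    As a hedged sketch of how such a model can be applied when biological objects are treated as documents (for example, a gene as a document whose words are its annotation terms, one way of drawing the document-topic-word analogy mentioned above), the code below fits a two-topic LDA model with scikit-learn on a toy corpus; the terms, the number of topics and the gene-as-document framing are illustrative assumptions, not taken from the reviewed studies.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        # Toy corpus: each "document" lists the annotation terms of one (invented) gene
        documents = [
            "dna repair damage response checkpoint",
            "dna replication repair helicase checkpoint",
            "membrane transport ion channel signaling",
            "signaling receptor membrane kinase transport",
        ]

        # Build the document-term matrix
        vectorizer = CountVectorizer()
        dtm = vectorizer.fit_transform(documents)

        # Fit a two-topic latent Dirichlet allocation model
        lda = LatentDirichletAllocation(n_components=2, random_state=0)
        doc_topics = lda.fit_transform(dtm)

        # Show the highest-weight terms per topic and the per-document topic proportions
        terms = vectorizer.get_feature_names_out()
        for k, weights in enumerate(lda.components_):
            top = [terms[i] for i in weights.argsort()[::-1][:4]]
            print(f"topic {k}: {top}")
        print(doc_topics.round(2))

    The interpretability that the review emphasises comes from exactly these two outputs: topics as weighted word lists and documents as mixtures of topics.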

  5. A Federated Digital Identity Management Approach for Business Processes

    Science.gov (United States)

    Bertino, Elisa; Ferrini, Rodolfo; Musci, Andrea; Paci, Federica; Steuer, Kevin J.

    Business processes have gained a lot of attention because of the pressing need for integrating existing resources and services to better fulfill customer needs. A key feature of business processes is that they are built from composable services, referred to as component services, that may belong to different domains. In such a context, flexible multi-domain identity management solutions are crucial for increased security and user-convenience. In particular, it is important that during the execution of a business process the component services be able to verify the identity of the client to check that it has the required permissions for accessing the services. To address the problem of multi-domain identity management, we propose a multi-factor identity attribute verification protocol for business processes that assures clients privacy and handles naming heterogeneity.

  6. Algorithms for the process management of sealed source brachytherapy

    International Nuclear Information System (INIS)

    Engler, M.J.; Ulin, K.; Sternick, E.S.

    1996-01-01

    Incidents and misadministrations suggest that brachytherapy may benefit from clarification of the quality management program and other mandates of the US Nuclear Regulatory Commission. To that end, flowcharts of step-by-step subprocesses were developed and formatted with dedicated software. The overall process was similarly organized in a complex flowchart termed a general process map. Procedural and structural indicators associated with each flowchart and map were critiqued and pre-existing documentation was revised. "Step-regulation tables" were created to refer steps and subprocesses to Nuclear Regulatory Commission rules and recommendations in their sequences of applicability. Brachytherapy algorithms were specified as programmable, recursive processes, including therapeutic dose determination and monitoring doses to the public. These algorithms are embodied in flowcharts and step-regulation tables. A general algorithm is suggested as a template from which other facilities may derive tools to facilitate process management of sealed source brachytherapy. 11 refs., 9 figs., 2 tabs

  7. Improvement of the Model of Enterprise Management Process on the Basis of General Management Functions

    Directory of Open Access Journals (Sweden)

    Ruslan Skrynkovskyy

    2017-12-01

    Full Text Available The purpose of the article is to improve the model of the enterprise (institution, organization) management process on the basis of general management functions. The graphic model of the process of management according to process-structured management is presented. It has been established that in today's business environment, the model of the management process should include such general management functions as: 1) controlling the achievement of results; 2) planning based on the main goal; 3) coordination and corrective actions (in the system of organization of work and production); 4) action as a form of act (conscious, volitional, directed); 5) the accounting system (accounting, statistical, operational-technical and managerial); 6) diagnosis (economic, legal), with such subfunctions as: identification of the state and capabilities; analysis (economic, legal, systemic) with argumentation; assessment of the state, trends and prospects of development. The prospects for further research in this direction are: 1) the formation of a system of interrelation of functions and management methods, taking into account the presented research results; 2) the development of a model of an effective and efficient communication business process of the enterprise.

  8. INNOCUOUSNESS + KNOWLEDGE MANAGEMENT A CONTRIBUTION TO PROCESS IMPROVEMENT

    OpenAIRE

    García Pulido, Yadrián Arnaldo*1, Castillo Zúñiga, Victor Javier2, Medina León, Alberto3, Medina Nogueira, Daylín4, Mayorga Villamar, Carmen Manuela5

    2017-01-01

    Process improvement is inherent to business management. In the current competitive market, businesses' adaptation capacity is fundamental. Continuous improvement becomes the raison d'être of companies: adapting to clients, becoming more efficient and gaining flexibility in the face of an extremely unstable economy. Several tools have been developed with the objective of improving processes, however the integration of elements of other knowledge areas h...

  9. Impact of Customer Relationship Management on Product Innovation Process

    OpenAIRE

    Li, Yelin; Thi, Thu Sang Nguyen

    2012-01-01

    In marketing, the common view is that customer relationships enhance innovativeness. Regularly it involves doing something new or different in response to market conditions. However, previous studies have not addressed how customer relationship management (CRM) plays its role in product innovation process. This thesis proposes and tests how key CRM activities influence and relate to each stage in product innovation process. The objective of this study is to test how customer relations managem...

  10. Meeting Organizational Performance with Shared Knowledge Management Processes

    OpenAIRE

    Franco, Massimo; Mariano, Stefania

    2010-01-01

    Using empirical research data, this study investigated how knowledge is stored and retrieved in an American company and contributed to the growing body of literature on the use of knowledge, technology, and memory systems to improve organizational performance. It demonstrated the importance of individual motivation and efforts, managerial capabilities, and shared organizational technologies in the management of organizational processes and revealed factors influencing the processes of knowled...

  11. Managing projects for life cycle success : perfecting the process

    Energy Technology Data Exchange (ETDEWEB)

    Jenkins, A. [TransCanada PipeLines Ltd., Calgary, AB (Canada); Babuk, T. [Empress International Inc., Westwood, NJ (United States); Mohitpour, M. [Tempsys Pipeline Solutions Inc., Vancouver, BC (Canada)

    2004-07-01

    This paper presented a historical summary of traditional project management along with a discussion on the project management and development philosophy that can be used in a large infrastructure company that develops and operates its own projects and facilities. Two case studies from the experiences of TransCanada Pipelines Limited were also presented. It was suggested that companies seeking a first-rate reputation must maintain a long-term focus with emphasis on the improvement of the total process and harmony with the environment and community. This paper explained how project managers can create balance between the proponents, stakeholders, participants and the people and the environment while ensuring a cost effective quality product over time. Successful project managers were shown to understand and manage the components of scope, time, cost, quality, human resources, communication, risk, purchasing, safety and harmony with the community. Project development from the perspective of an owner-operator was presented with reference to consistency in approach and the decision making process. It was concluded that although project managers should focus on controlling and minimizing capital expenditures during project engineering and construction, the many elements that contribute to a project's value should also be recognized. 10 refs., 6 figs.

  12. Theoretical concepts "land management process", "land management procedure" and their relationships

    Directory of Open Access Journals (Sweden)

    Tretiak A.M.

    2017-08-01

    Full Text Available The state significance of land management activities is manifested in the legal consequences that arise after the issuance of land management documentation and are conditioned by the need to secure unsupported land rights and the use and protection of land in a state-guaranteed manner. The procedural activity of land surveyors and other persons authorized by the state to carry out land management operations must be conducted in a certain order established by the state and obey the rights and obligations of the persons specified by the legislation at each stage of the development of such relations. The main goal of applying to land management organizations and land surveyors is land-use documentation, which is made in accordance with the requirements of the law and with which the relevant legal properties of the land management procedure are associated. First of all, let us dwell on such basic concepts as "land management process" and "land management procedure". Consideration of the term "land management process" implies a preliminary analysis of the category "process". At the same time, it must be admitted that the development of the procedural form of this category has not received attention. Considering the concept of "land management process", its place and role in the system of social relations, emphasis will be placed on the concept of a broad understanding of the legal process, a problem which has existed for decades. Thus, the legal process is a procedure, regulated by procedural rules, for the activities of competent state bodies, consisting of the preparation, adoption and documentary consolidation of legal decisions of a general and individual nature. In land law, the category "process" is specific and serves to designate relationships that provide regulatory and security land-property relationships. Particularly difficult today is the question of the delimitation of the concepts of "process" and "procedure" in general. Regarding

  13. Shared Bioinformatics Databases within the Unipro UGENE Platform

    Directory of Open Access Journals (Sweden)

    Protsyuk Ivan V.

    2015-03-01

    Full Text Available Unipro UGENE is an open-source bioinformatics toolkit that integrates popular tools along with original instruments for molecular biologists within a unified user interface. Nowadays, most bioinformatics desktop applications, including UGENE, make use of a local data model while processing different types of data. Such an approach causes an inconvenience for scientists working cooperatively and relying on the same data. This refers to the need of making multiple copies of certain files for every workplace and maintaining synchronization between them in case of modifications. Therefore, we focused on delivering a collaborative work into the UGENE user experience. Currently, several UGENE installations can be connected to a designated shared database and users can interact with it simultaneously. Such databases can be created by UGENE users and be used at their discretion. Objects of each data type, supported by UGENE such as sequences, annotations, multiple alignments, etc., can now be easily imported from or exported to a remote storage. One of the main advantages of this system, compared to existing ones, is the almost simultaneous access of client applications to shared data regardless of their volume. Moreover, the system is capable of storing millions of objects. The storage itself is a regular database server so even an inexpert user is able to deploy it. Thus, UGENE may provide access to shared data for users located, for example, in the same laboratory or institution. UGENE is available at: http://ugene.net/download.html.

  14. Agonist Binding to Chemosensory Receptors: A Systematic Bioinformatics Analysis

    Directory of Open Access Journals (Sweden)

    Fabrizio Fierro

    2017-09-01

    Full Text Available Human G-protein coupled receptors (hGPCRs) constitute a large and highly pharmaceutically relevant membrane receptor superfamily. About half of the hGPCRs' family members are chemosensory receptors, involved in bitter taste and olfaction, along with a variety of other physiological processes. Hence these receptors constitute promising targets for pharmaceutical intervention. Molecular modeling has been so far the most important tool to get insights on agonist binding and receptor activation. Here we investigate both aspects by bioinformatics-based predictions across all bitter taste and odorant receptors for which site-directed mutagenesis data are available. First, we observe that state-of-the-art homology modeling combined with previously used docking procedures turned out to reproduce only a limited fraction of ligand/receptor interactions inferred by experiments. This is most probably caused by the low sequence identity with available structural templates, which limits the accuracy of the protein model and in particular of the side-chains' orientations. Methods which transcend the limited sampling of the conformational space of docking may improve the predictions. As an example corroborating this, we review here multi-scale simulations from our lab and show that, for the three complexes studied so far, they significantly enhance the predictive power of the computational approach. Second, our bioinformatics analysis provides support to previous claims that several residues, including those at positions 1.50, 2.50, and 7.52, are involved in receptor activation.

  15. MEASUREMENT OF QUALITY MANAGEMENT SYSTEM PERFORMANCE IN MEAT PROCESSING

    Directory of Open Access Journals (Sweden)

    Elena S. Voloshina

    2017-01-01

    Full Text Available Modern methods aimed at ensuring food quality require the implementation and certification of quality management systems in processing plants. In this case, measuring the effectiveness of an existing QMS is often a very difficult task for the leadership due to the fragmentation of the measured metrics, or even their absence. This points to the relevance of the conducted research. The criteria for effectiveness assessment of the production process of meat processing plants, with the use of scaling methods and Shewhart control charts, are presented in the article. The authors developed and presented the formulae for the calculation of single indicators used for the further comprehensive assessment. The algorithm of statistical evaluation of process controllability, which allows the statistical control of production processes to be estimated in an accessible form and statistical quality control to be organized in the development of quality management systems, is presented. The proposed procedure is based on a process approach, the essence of which is the application of the Deming cycle: “Plan — Do — Check — Act”, which makes it easy to integrate it into any existing quality management system.

  16. Understanding and Managing Process Interaction in IS Development Projects

    DEFF Research Database (Denmark)

    Bygstad, Bendik; Nielsen, Peter Axel

    2012-01-01

    Software-based information systems must be developed and implemented as a part of business change. This is a major challenge, since business change and the development of software-based information systems usually are performed in separate processes. Thus, there is a need to understand and manage ... critical events in the case, what led to the events, and what the consequences are. We discuss the implications for information systems research and in particular we discuss the contribution to project management of iterative and incremental software development.

  17. Sustainable cost reduction by lean management in metallurgical processes

    Directory of Open Access Journals (Sweden)

    A. V. Todorut

    2016-10-01

    Full Text Available This paper focuses on the need for sustainable cost reduction in the metallurgical industry by applying Lean Management (LM) tools and concepts in metallurgical production processes, leading to increased competitiveness of corporations in a global market. The paper highlights that Lean Management is a novel way of thinking, adapting to change, reducing waste and continuous improvement, leading to sustainable development of companies in the metallurgical industry. The authors outline the main Lean Management instruments based on recent scientific research and include a comparative analysis of other tools, such as Sort, Straighten, Shine, Standardize, Sustain (5S), Visual Management (VM), Kaizen, Total Productive Maintenance (TPM) and Single-Minute Exchange of Dies (SMED), leading to a critical appraisal of their application in the metallurgical industry.

  18. Intelligent workflow driven processing for electronic mail management

    African Journals Online (AJOL)

    Email has been one of the most efficient means of electronic communication for many years, and email management has become a critical issue due to congestion. Different clients/individuals encounter problems while processing their emails due to the large volume of email received and the many requests to be replied to.

  19. Assessing the Army’s Software Patch Management Process

    Science.gov (United States)

    2016-03-04

    software maker or to antivirus vendors (Zetter, 2014). Fixing such a vulnerability within the zero-day period requires teamwork across multiple...

  20. Framework, process and tool for managing technology-based assets

    CSIR Research Space (South Africa)

    Kfir, R

    2000-10-01

    Full Text Available ... and the intellectual property (IP) of the organisation. The study describes a framework linking the core processes supporting the management of technology-based assets and offerings with other organisational elements such as leadership, strategy, and culture. Specific...

  1. proposal for a lean commodity management process for the south

    African Journals Online (AJOL)

    Administrator

    Commodity Managers (CMs) within the South African Navy (SAN) need new and ... The SAN supply chain process, as graphically represented in Figure 1.1, starts with the .... Tactical level: These recommendations only affect the Fleet Logistics supply ..... The sourcing of the item will be as wide as possible (global), and may.

  2. A Situational Implementation Method for Business Process Management Systems

    NARCIS (Netherlands)

    R.L. Jansen; J.P.P. Ravensteyn

    For the integrated implementation of Business Process Management and supporting information systems many methods are available. Most of these methods, however, apply a one-size fits all approach and do not take into account the specific situation of the organization in which an information system is

  3. Enhancing Managers' Skills through Conflict Process Review | EPIE ...

    African Journals Online (AJOL)

    A study was carried out among Nigerian managers who were asked to remember a conflict episode in which they had been involved in the past and to evaluate, in retrospect, the way they had handled the conflict. Results revealed that a majority of respondents showed evidence of positive learning as a result of the process ...

  4. Using Knowledge Management to Revise Software-Testing Processes

    Science.gov (United States)

    Nogeste, Kersti; Walker, Derek H. T.

    2006-01-01

    Purpose: This paper aims to use a knowledge management (KM) approach to effectively revise a utility retailer's software testing process. This paper presents a case study of how the utility organisation's customer services IT production support group improved their test planning skills through applying the American Productivity and Quality Center…

  5. Management control of credit risk in the bank lending process

    NARCIS (Netherlands)

    Scheffer, S.B.

    2004-01-01

    Management control of credit risk in the bank lending process. A case study to explore improvements from a managerial perspective. At the start of this project, back in 1998, new technologies and ideas were emerging among a new generation of financial engineering professionals who have been applying

  6. Blockchains for business process management - Challenges and opportunities

    NARCIS (Netherlands)

    Mendling, Jan; Weber, Ingo; Van Der Aalst, Wil; Brocke, Jan Vom; Cabanillas, Cristina; Daniel, Florian; Debois, Søren; Di Ciccio, Claudio; Dumas, Marlon; Dustdar, Schahram; Gal, Avigdor; García-Bañuelos, Luciano; Governatori, Guido; Hull, Richard; La Rosa, Marcello; Leopold, Henrik; Leymann, Frank; Recker, Jan; Reichert, Manfred; Reijers, Hajo A.; Rinderle-Ma, Stefanie; Solti, Andreas; Rosemann, Michael; Schulte, Stefan; Singh, Munindar P.; Slaats, Tijs; Staples, Mark; Weber, Barbara; Weidlich, Matthias; Weske, Mathias; Xu, Xiwei; Zhu, Liming

    Blockchain technology offers a sizable promise to rethink the way interorganizational business processes are managed because of its potential to realize execution without a central party serving as a single point of trust (and failure). To stimulate research on this promise and the limits thereof, in

  7. Blockchains for business process management - Challenges and opportunities

    NARCIS (Netherlands)

    Mendling, J.; Weber, I.; van der Aalst, W.M.P.; vom Brocke, J.; Cabanillas, C.; Daniel, F.; Debois, S.; Di Ciccio, C.; Dumas, M.; Dustdar, S.; Gal, A.; García-Bañuelos, L.; Governatori, G.; Hull, R.; La Rosa, Marcello; Leopold, Henrik; Leymann, Frank; Recker, Jan; Reichert, Manfred; Reijers, H.A.; Rinderle-Ma, Stefanie; Solti, Andreas; Rosemann, Michael; Schulte, Stefan; Singh, Munindar P.; Slaats, T.; Staples, Mark; Weber, Barbara; Weidlich, Matthias; Weske, Mathias; Xu, Xiwei; Zhu, Liming

    2018-01-01

    Blockchain technology offers a sizable promise to rethink the way interorganizational business processes are managed because of its potential to realize execution without a central party serving as a single point of trust (and failure). To stimulate research on this promise and the limits thereof, in

  8. Managing complexity in process digitalisation with dynamic condition response graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Debois, Søren; Slaats, Tijs

    2017-01-01

    ... Sadly, it is also witnessed by a number of expensive failed digitalisation projects. In this paper we point to two key problems in state-of-the-art BPM technologies: 1) the use of rigid flow diagrams as the "source code" of process digitalisation is not suitable for managing the complexity of knowledge...
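
    The abstract contrasts rigid flow diagrams with Dynamic Condition Response (DCR) graphs, a declarative notation in which process rules are stated as relations between events rather than as one fixed control flow. As a rough illustration only (a simplified subset of DCR semantics with a hypothetical example process, not code from the paper), the sketch below models condition, response, include and exclude relations and checks which events are currently enabled.

```python
from dataclasses import dataclass, field

@dataclass
class DCRGraph:
    """Simplified Dynamic Condition Response graph (illustrative subset only).

    Marking: executed events, pending responses, included events.
    An included event is enabled when all of its included condition
    predecessors have already been executed.
    """
    events: set
    conditions: dict = field(default_factory=dict)  # event -> required predecessors
    responses: dict = field(default_factory=dict)   # event -> events made pending
    excludes: dict = field(default_factory=dict)    # event -> events removed from the graph
    includes: dict = field(default_factory=dict)    # event -> events (re)added to the graph

    def __post_init__(self):
        self.executed = set()
        self.pending = set()
        self.included = set(self.events)

    def enabled(self, e):
        required = self.conditions.get(e, set()) & self.included
        return e in self.included and required <= self.executed

    def execute(self, e):
        if not self.enabled(e):
            raise ValueError(f"event {e!r} is not enabled")
        self.executed.add(e)
        self.pending.discard(e)
        self.pending |= self.responses.get(e, set())
        self.included -= self.excludes.get(e, set())
        self.included |= self.includes.get(e, set())

    def accepting(self):
        """No included event is still awaiting a response."""
        return not (self.pending & self.included)


# Hypothetical example: 'assess' requires 'collect' to have happened first,
# and executing 'assess' obliges a later 'notify'.
g = DCRGraph(
    events={"collect", "assess", "notify"},
    conditions={"assess": {"collect"}},
    responses={"assess": {"notify"}},
)
print(g.enabled("assess"))   # False: condition not yet fulfilled
g.execute("collect")
g.execute("assess")
print(g.accepting())         # False: 'notify' is still pending
g.execute("notify")
print(g.accepting())         # True
```

    Executing an activity out of order simply raises an error here; full DCR semantics also include milestone relations and nesting, which this sketch omits.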

  9. Business Process Management Systems: Hype or New Paradigm.

    NARCIS (Netherlands)

    J.P.P. Ravesteijn

    2007-01-01

    Business Process Management Systems (BPMSs) are increasingly implemented in and across organizations. There is much talk on BPMSs, and software vendors and IT-consultancy companies are leveraging this. In this paper we provide an investigation on the originality of BPMSs. We identify concepts,

  10. The Management Skills of Exam Process for Undergraduate Students

    Science.gov (United States)

    Cetin, Filiz; Cetin, Saban

    2017-01-01

    This study aims to identify to what degree undergraduate students are able to manage the exam process to be successful in exams. The study group of the research, which utilizes the survey model, consists of 350 students in total, 185 female and 165 male, attending 4 different teaching programs in Faculty of Education, Gazi University. "The…

  11. Research in adaptive management: working relations and the research process.

    Science.gov (United States)

    Amanda C. Graham; Linda E. Kruger

    2002-01-01

    This report analyzes how a small group of Forest Service scientists participating in efforts to implement adaptive management approach their working relations, and how they understand and apply the research process. Nine scientists completed a questionnaire to assess their preferred mode of thinking (the Herrmann Brain Dominance Instrument), engaged in a facilitated...

  12. Management of purchase process in realization of building investment

    Directory of Open Access Journals (Sweden)

    M. Radoń

    2010-07-01

    Full Text Available In building companies, the process of product and service purchase is one of the main processes of the quality management system [1]. Because of short time-limits of contract realization, high specialization of works, the necessity of fulfilling high quality requirements and assuring profitable financial effects, the management of the purchase process becomes a very important element of the work of a building company. A serious problem in creating and keeping an efficient system of purchase management is the special character of purchasing in building companies. Particular investments are realized in different regions of the country, objects are built based on individual design documentations, and each building site becomes an independent organizational unit that organizes the purchases necessary for investment realization. An example of the management system of the purchase process in a building company is described in the paper. The Just In Time system is widely used during the realization of building investments. This system is especially useful in building because some investments, especially in big cities, are constrained by a restricted building site, which makes storing products on site impossible. In such cases, close synchronization between delivery times and the requirements of the purchase schedule and the schedule of building realization is very important. Criteria of supplier selection as well as the methods of choosing the supplier are also presented in the paper. Special attention is paid to the necessity of evaluating purchase efficiency and purchase risk. Basic coefficients of purchase efficiency are also described in the paper.

  13. Managing uncertainty in fishing and processing of an integrated ...

    African Journals Online (AJOL)

    The firm attempts to manage this uncertainty through planning co-ordination of fishing trawler scheduling, catch quota, processing and labour allocation, and inventory control. Schedules must necessarily be determined over some finite planning time horizon, and the trawler schedule itself introduces man-made variability, ...

  14. Strategies for Managing Conflict in the Collaboration Process.

    Science.gov (United States)

    Ivarie, Judith J.

    Approaches to managing conflict in the collaborative process are discussed, along with the need for collaboration in schools. Collaboration by teachers, administrators, parents, and others can help identify problems, consider relevant data, plan and implement interventions, and evaluate results. However, the knowledge, experience, and values of…

  15. Diagnosing resistance to change in the change management process

    Directory of Open Access Journals (Sweden)

    Tetiana Kuzhda

    2016-12-01

    Full Text Available This article explains the change management process and resistance to organizational change through examining causes of resistance to change, diagnosing them, and finding ways to deal with resistance to change. In a business environment, the one thing any company can be assured of is change. If an organization experiences change it may also need to implement new business strategies, which can create resistance among employees. Managers need to know in which phase they have to expect unusual situations, problems, and resistance to change. The most successful organizations are those that are able to adjust themselves to new conditions quickly. Preparing for change, managing change through a resistance management plan, and reinforcing change have been identified in the article as the main phases of the change management process that lead to improved organizational performance. Managing resistance to change is an important part of the success of any change effort in a company. Dealing with resistance will in large part depend on timely recognition of the real causes of resistance to change and on finding ways to reduce, overcome or eliminate it. Developing efficient ways to introduce and implement change can ease the stress the staff feels when change is introduced. Different resistance states, causes of change resistance and forms of change resistance have been emphasized in the change management process. The proposed diagnosing model has been used to identify significant and weighty causes of resistance to change by using an expert survey and ranking the causes of resistance to change. Ways to reduce and overcome resistance to change have also been explained.
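
    The abstract says the diagnosing model ranks causes of resistance to change from an expert survey, but does not give the scoring scheme. The snippet below is therefore only a generic illustration (hypothetical causes and ratings, a simple mean-score ranking) of how such expert ratings might be aggregated and ordered.

```python
from statistics import mean

# Hypothetical expert ratings (1 = negligible cause, 5 = major cause);
# the actual survey design and weighting in the article may differ.
survey = {
    "fear of job loss":       [5, 4, 5, 4],
    "lack of communication":  [4, 4, 3, 5],
    "habit and routine":      [3, 2, 3, 3],
    "distrust of management": [4, 5, 4, 4],
}

# Rank causes by their average expert score, most significant first.
ranking = sorted(survey.items(), key=lambda item: mean(item[1]), reverse=True)

for rank, (cause, scores) in enumerate(ranking, start=1):
    print(f"{rank}. {cause}: mean score {mean(scores):.2f}")
```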

  16. G2LC: Resources Autoscaling for Real Time Bioinformatics Applications in IaaS

    Directory of Open Access Journals (Sweden)

    Rongdong Hu

    2015-01-01

    Full Text Available Cloud computing has started to change the way bioinformatics research is carried out. Researchers who have taken advantage of this technology can process larger amounts of data and speed up scientific discovery. The variability in data volume results in variable computing requirements. Therefore, bioinformatics researchers are pursuing more reliable and efficient methods for conducting sequencing analyses. This paper proposes an automated resource provisioning method, G2LC, for bioinformatics applications in IaaS. It enables applications to output results in real time. Its main purpose is to guarantee application performance while improving resource utilization. Real sequence search data from BLAST is used to evaluate the effectiveness of G2LC. Experimental results show that G2LC guarantees application performance while saving up to 20.14% of resources.
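
    The abstract does not spell out G2LC's provisioning algorithm, so the following is only an illustrative sketch of the general idea behind IaaS autoscalers: periodically sample a load metric and grow or shrink the pool of worker instances to keep utilization inside a target band. All names (Autoscaler, the thresholds, the simulated metric source) are hypothetical and not taken from the paper.

```python
import random
from dataclasses import dataclass

@dataclass
class Autoscaler:
    """Minimal threshold-based autoscaler sketch (not the G2LC algorithm).

    Scales a pool of worker instances so that measured utilization stays
    within [low, high]. Assumes an external monitor reports utilization
    in the range 0.0-1.0 for the current pool size.
    """
    min_instances: int = 1
    max_instances: int = 16
    low: float = 0.4      # scale in below this utilization
    high: float = 0.8     # scale out above this utilization
    instances: int = 1

    def decide(self, utilization: float) -> int:
        """Return the new instance count for the observed utilization."""
        if utilization > self.high and self.instances < self.max_instances:
            self.instances += 1          # add capacity before work queues up
        elif utilization < self.low and self.instances > self.min_instances:
            self.instances -= 1          # release idle capacity to cut cost
        return self.instances


def simulated_utilization(load: float, instances: int) -> float:
    """Pretend metric source: load units spread over the current pool."""
    return min(1.0, load / instances)


if __name__ == "__main__":
    scaler = Autoscaler()
    load = 1.0
    for step in range(20):
        load = max(0.2, load + random.uniform(-0.5, 1.0))   # fluctuating demand
        util = simulated_utilization(load, scaler.instances)
        n = scaler.decide(util)
        print(f"step={step:2d} load={load:5.2f} util={util:4.2f} instances={n}")
```

    A production autoscaler of the kind the paper evaluates would layer cooldown periods, performance models and cost limits on top of such a loop; the sketch only shows the basic scale-out/scale-in decision.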

  17. Monitoring processes and measuring the effectiveness of the management system

    International Nuclear Information System (INIS)

    Bailescu, A.; Costea, D.

    2009-01-01

    This document presents the way in which the 8th principle of the quality management system, 'Process approach', is applied, a principle identified and used by the international standard ISO 9000. In order to understand the evolution of management system requirements as used today in different activities, namely industry, services, and nuclear activities, the authors present the evolution of the quality concept and its traceability to the different standards applicable over time. The requirements of the ISO 9001 standard, which represents the most widespread model for modern organization management, are described, together with the IAEA's concerns related to the integration of those requirements into the most recent IAEA safety standard, 'The Management System for Facilities and Activities'. The IAEA Safety Standard GS-R-3 describes a management model that considers both the evolution of quality requirements in modern management and the recovery of the experience gained in nuclear activities. The authors plead for applying the 8th principle as a means and a model that is easy to use to ensure the achievement of management objectives. (authors)

  18. Data-Driven Process Control and Exception Handling in Process Management Systems

    NARCIS (Netherlands)

    Rinderle, S.B.; Reichert, M.U.; Dubois, E.; Pohl, K.

    Business processes are often characterized by high variability and dynamics, which cannot always be captured in contemporary process management systems (PMS). Adaptive PMS have emerged in recent years, but do not completely solve this problem. In particular, users are not adequately supported in

  19. Process of technology management in SMEs of the metal processing industry – the case study investigation

    Directory of Open Access Journals (Sweden)

    Krawczyk-Dembicka Elżbieta

    2017-03-01

    Full Text Available The main purpose of this work is to identify the factors that influence the process of technology management in small- and medium-sized enterprises of the metal processing industry, considering the shape and course this process requires for enterprises to achieve modern operating conditions in the market.

  20. PROVIDING RELIABILITY OF HUMAN RESOURCES IN PRODUCTION MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    Anna MAZUR

    2014-07-01

    Full Text Available People are the most valuable asset of an organization, and the results of a company mostly depend on them. The human factor can also be a weak link in the company and a cause of high risk for many of its processes. The reliability of the human factor in the manufacturing process depends on many factors. The authors include aspects of human error, safety culture, knowledge, communication skills, teamwork and the role of leadership in the developed model of human resource reliability in the management of the production process. Based on a case study and the results of research and observation, the authors present risk areas defined in a specific manufacturing process and the results of an evaluation of the reliability of human resources in that process.