WorldWideScience

Sample records for achieving semantic interoperability

  1. Semantically Interoperable XML Data.

    Science.gov (United States)

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-09-01

    XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation, and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups. PMID:25298789

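    The annotation-plus-validation idea can be sketched in a few lines: a hypothetical common data element binds an XML element to an ontology concept and its permitted value set, and semantic validation checks each instance document against that binding. All element names, codes and URIs below are illustrative assumptions, not the paper's actual schema.

```python
# Sketch of semantic annotation and validation for XML data.
# The schema, CDE names, and ontology URIs below are illustrative
# assumptions, not the framework described in the paper.
import xml.etree.ElementTree as ET

# A "common data element" binds an XML element to an ontology concept
# and to the set of concept codes permitted as values.
CDES = {
    "finding/anatomicSite": {
        "concept": "http://example.org/onto#AnatomicSite",
        "allowed": {"RID1301", "RID1302"},  # invented RadLex-style codes
    }
}

def semantic_validate(xml_text: str) -> list:
    """Return a list of semantic violations (empty list means valid)."""
    root = ET.fromstring(xml_text)
    errors = []
    for path, cde in CDES.items():
        for elem in root.findall(path):
            if elem.text not in cde["allowed"]:
                errors.append(
                    f"{path}: '{elem.text}' is not in the value set "
                    f"bound to {cde['concept']}"
                )
    return errors

doc_ok = "<image><finding><anatomicSite>RID1301</anatomicSite></finding></image>"
doc_bad = "<image><finding><anatomicSite>liver</anatomicSite></finding></image>"
print(semantic_validate(doc_ok))   # []
print(len(semantic_validate(doc_bad)))  # 1
```

    Syntactic validation (against the XML schema) would pass both documents; only the semantic layer catches the free-text value in the second one.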
  3. A logical approach to semantic interoperability in healthcare.

    Science.gov (United States)

    Bird, Linda; Brooks, Colleen; Cheong, Yu Chye; Tun, Nwe Ni

    2011-01-01

    Singapore is in the process of rolling out a number of national e-health initiatives, including the National Electronic Health Record (NEHR). A critical enabler in the journey towards semantic interoperability is a Logical Information Model (LIM) that harmonises the semantics of the information structure with the terminology. The Singapore LIM uses a combination of international standards, including ISO 13606-1 (a reference model for electronic health record communication), ISO 21090 (healthcare datatypes), and SNOMED CT (healthcare terminology). The LIM is accompanied by a logical design approach, used to generate interoperability artifacts, and incorporates mechanisms for achieving unidirectional and bidirectional semantic interoperability.

  4. Contribution of Clinical Archetypes, and the Challenges, towards Achieving Semantic Interoperability for EHRs.

    Science.gov (United States)

    Tapuria, Archana; Kalra, Dipak; Kobayashi, Shinji

    2013-12-01

    The objective is to introduce the 'clinical archetype', which is a formal and agreed way of representing clinical information to ensure interoperability across and within Electronic Health Records (EHRs). The paper also aims to present the challenges of building quality-labeled clinical archetypes and the challenges in achieving semantic interoperability between EHRs. Twenty years of international research, various European healthcare informatics projects and the pioneering work of the openEHR Foundation have led to the following results. The requirements for EHR information architectures have been consolidated within ISO 18308 and adopted within the ISO 13606 EHR interoperability standard. However, a generic EHR architecture cannot ensure that the clinical meaning of information from heterogeneous sources can be reliably interpreted by receiving systems and services. Therefore, clinical models called 'clinical archetypes' are required to formalize the representation of clinical information within the EHR. Part 2 of ISO 13606 defines how archetypes should be formally represented. The current challenge is to grow clinical communities to build a library of clinical archetypes, and to identify how evidence of best practice and multi-professional clinical consensus should best be combined to define archetypes at the optimal level of granularity and specificity, and to quality-label them for wide adoption. Standardizing clinical terms within EHRs using a clinical terminology such as the Systematized Nomenclature of Medicine Clinical Terms is also a challenge. Clinical archetypes will play an important role in achieving semantic interoperability within EHRs. Attempts are being made to explore the design and adoption challenges for clinical archetypes.
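
    As a toy illustration of the archetype idea, the sketch below models a hypothetical blood-pressure archetype as a set of constraints that any conforming EHR entry must satisfy. The field names, units and ranges are invented for the example and are far simpler than real openEHR/ISO 13606 archetypes.

```python
# Minimal illustration of an archetype as a reusable constraint model
# over a generic EHR entry. Fields and ranges are invented examples.
BLOOD_PRESSURE_ARCHETYPE = {
    "systolic":  {"required": True, "units": "mm[Hg]", "range": (0, 350)},
    "diastolic": {"required": True, "units": "mm[Hg]", "range": (0, 250)},
}

def conforms(entry: dict, archetype: dict) -> bool:
    """Check a generic name -> (value, units) entry against the archetype."""
    for field, rule in archetype.items():
        if field not in entry:
            if rule["required"]:
                return False          # mandatory field missing
            continue
        value, units = entry[field]
        lo, hi = rule["range"]
        if units != rule["units"] or not (lo <= value <= hi):
            return False              # wrong units or out-of-range value
    return True

entry = {"systolic": (120, "mm[Hg]"), "diastolic": (80, "mm[Hg]")}
print(conforms(entry, BLOOD_PRESSURE_ARCHETYPE))  # True
```

    The point of the pattern is that the generic entry structure stays the same across systems, while the archetype carries the clinical meaning and constraints.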

  5. An Architecture for Semantically Interoperable Electronic Health Records.

    Science.gov (United States)

    Toffanello, André; Gonçalves, Ricardo; Kitajima, Adriana; Puttini, Ricardo; Aguiar, Atualpa

    2017-01-01

    Despite the increasing adoption of electronic health records, the challenge of semantic interoperability remains unsolved. The fact that different parties can exchange messages does not mean they can understand the underlying clinical meaning, so semantic understanding cannot simply be assumed; it must be treated as an explicit requirement. This work introduces an architecture designed to achieve semantic interoperability, in which organizations that follow different policies may still share medical information through a common infrastructure comparable to an ecosystem, whose organisms are exemplified within the Brazilian scenario. Furthermore, the proposed approach describes a service-oriented design with modules adaptable to different contexts. We also discuss the establishment of an enterprise service bus to mediate a health infrastructure defined on top of international standards, such as openEHR and IHE. Moreover, we argue that, in order to achieve truly semantic interoperability in a wide sense, a proper profile must be published and maintained.

  6. Designing learning management system interoperability in semantic web

    Science.gov (United States)

    Anistyasari, Y.; Sarno, R.; Rochmawati, N.

    2018-01-01

    The extensive adoption of learning management systems (LMS) has put the focus on the interoperability requirement. Interoperability is the ability of different computer systems, applications or services to communicate, share and exchange data, information, and knowledge in a precise, effective and consistent way. Semantic web technology and the use of ontologies are able to provide the required computational semantics and interoperability for the automation of tasks in an LMS. The purpose of this study is to design learning management system interoperability in the semantic web, which so far has not been investigated in depth. Moodle is utilized to design the interoperability. Several database tables of Moodle are enhanced and some features are added. Semantic web interoperability is provided by exploiting an ontology over the content materials. The ontology is further utilized as a search tool to match users' queries with available courses. It is concluded that LMS interoperability in the semantic web is feasible.
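
    The query-to-course matching idea can be sketched with a toy ontology: a query concept is expanded through its narrower terms before being matched against course keywords. The ontology and course catalogue below are invented examples, not Moodle's actual tables.

```python
# Toy sketch of ontology-assisted course search: a query term is
# expanded through "narrower concept" relations before matching
# course keywords. Ontology and course data are invented examples.
ONTOLOGY = {  # concept -> narrower concepts
    "programming": {"python", "java"},
    "databases": {"sql", "nosql"},
}

COURSES = {
    "CS101": {"python"},
    "CS201": {"sql"},
    "HIST1": {"history"},
}

def expand(term: str) -> set:
    """A query concept matches itself and all of its narrower concepts."""
    return {term} | ONTOLOGY.get(term, set())

def search(term: str) -> list:
    wanted = expand(term)
    return sorted(c for c, kw in COURSES.items() if kw & wanted)

print(search("programming"))  # ['CS101']
print(search("sql"))          # ['CS201']
```

    A plain keyword match for "programming" would return nothing here; the ontology expansion is what links the query to the Python course.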

  7. Semantic Interoperability in Heterogeneous IoT Infrastructure for Healthcare

    Directory of Open Access Journals (Sweden)

    Sohail Jabbar

    2017-01-01

    Interoperability remains a significant burden for the developers of Internet of Things systems, because IoT devices are highly heterogeneous in terms of underlying communication protocols, data formats, and technologies. Furthermore, due to the lack of worldwide accepted standards, interoperability tools remain limited. In this paper, we propose an IoT-based Semantic Interoperability Model (IoT-SIM) to provide semantic interoperability among heterogeneous IoT devices in the healthcare domain. Physicians communicate with their patients through heterogeneous IoT devices to monitor their current health status. Information between physician and patient is semantically annotated and communicated in a meaningful way. A lightweight model for the semantic annotation of data from heterogeneous devices in the IoT is proposed. Resource Description Framework (RDF) is a semantic web framework that relates things using triples, making them semantically meaningful. RDF-annotated patient data is thereby made semantically interoperable. SPARQL queries are used to extract records from the RDF graph. For simulation of the system, we used Tableau, Gruff 6.2.0, and MySQL.
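
    The RDF/SPARQL pipeline described above can be imitated with plain tuples: each fact is a (subject, predicate, object) triple, and a tiny pattern matcher plays the role of a SPARQL query. The patient, reading and device identifiers are invented for the sketch; a real system would use an RDF store and a SPARQL engine.

```python
# Stdlib-only stand-in for an RDF triple store: facts are
# (subject, predicate, object) tuples, and match() plays the role
# of a SPARQL triple pattern. All identifiers are invented.
EX = "http://example.org/"

triples = {
    (EX + "patient/42", EX + "hasReading", EX + "reading/1"),
    (EX + "reading/1",  EX + "measures",   "heart-rate"),
    (EX + "reading/1",  EX + "value",      "72"),
    (EX + "reading/1",  EX + "observedBy", EX + "device/pulseOx"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts like a SPARQL variable."""
    return {
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    }

# Analogue of: SELECT ?v WHERE { ?r ex:measures "heart-rate" . ?r ex:value ?v }
readings = {t[0] for t in match(p=EX + "measures", o="heart-rate")}
values = [t[2] for r in readings for t in match(s=r, p=EX + "value")]
print(values)  # ['72']
```

    Because every fact is reduced to the same triple shape, data from any device can be merged into one graph and queried uniformly, which is the interoperability payoff of the RDF annotation step.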

  8. Cross border semantic interoperability for clinical research: the EHR4CR semantic resources and services

    Science.gov (United States)

    Daniel, Christel; Ouagne, David; Sadou, Eric; Forsberg, Kerstin; McGilchrist, Mark; Zapletal, Eric; Paris, Nicolas; Hussain, Sajjad; Jaulent, Marie-Christine; Kalra, Dipak

    2016-01-01

    With the development of platforms enabling the use of routinely collected clinical data in the context of international clinical research, scalable solutions for cross-border semantic interoperability need to be developed. Within the context of the IMI EHR4CR project, we first defined the requirements and evaluation criteria of the EHR4CR semantic interoperability platform, and then developed the semantic resources and supportive services and tooling to assist hospital sites in standardizing their data to allow the execution of the project use cases. The experience gained from the evaluation of the EHR4CR platform, accessing semantically equivalent data elements across 11 participating European EHR systems from 5 countries, demonstrated how far the mediation model and mapping efforts met the expected requirements of the project. Developers of semantic interoperability platforms are beginning to address a core set of requirements in order to reach the goal of cross-border semantic integration of data. PMID:27570649

  9. Visual Development Environment for Semantically Interoperable Smart Cities Applications

    OpenAIRE

    Roukounaki , Aikaterini; Soldatos , John; Petrolo , Riccardo; Loscri , Valeria; Mitton , Nathalie; Serrano , Martin

    2015-01-01

    This paper presents an IoT architecture for the semantic interoperability of diverse IoT systems and applications in smart cities. The architecture virtualizes diverse IoT systems and ensures their modelling and representation according to common standards-based IoT ontologies. Furthermore, based on this architecture, the paper introduces a first-of-a-kind visual development environment which eases the development of semantically interoperable applications in smart cities.

  10. A development framework for semantically interoperable health information systems.

    Science.gov (United States)

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met by new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and their intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system, to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems, and to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured to integrate other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and the HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  11. The development of clinical document standards for semantic interoperability in china.

    Science.gov (United States)

    Yang, Peng; Pan, Feng; Liu, Danhong; Xu, Yongyong; Wan, Yi; Tu, Haibo; Tang, Xuejun; Hu, Jianping

    2011-12-01

    This study is aimed at developing a set of data groups (DGs) to be employed as reusable building blocks for the construction of the eight most common clinical documents used in China's general hospitals in order to achieve their structural and semantic standardization. The Diagnostics knowledge framework, the related approaches taken from the Health Level Seven (HL7), the Integrating the Healthcare Enterprise (IHE), and the Healthcare Information Technology Standards Panel (HITSP) and 1,487 original clinical records were considered together to form the DG architecture and data sets. The internal structure, content, and semantics of each DG were then defined by mapping each DG data set to a corresponding Clinical Document Architecture data element and matching each DG data set to the metadata in the Chinese National Health Data Dictionary. By using the DGs as reusable building blocks, standardized structures and semantics regarding the clinical documents for semantic interoperability were able to be constructed. Altogether, 5 header DGs, 48 section DGs, and 17 entry DGs were developed. Several issues regarding the DGs, including their internal structure, identifiers, data set names, definitions, length and format, data types, and value sets, were further defined. Standardized structures and semantics regarding the eight clinical documents were structured by the DGs. This approach of constructing clinical document standards using DGs is a feasible standard-driven solution useful in preparing documents possessing semantic interoperability among the disparate information systems in China. These standards need to be validated and refined through further study.
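
    The building-block approach can be sketched as follows: named data groups are defined once and then assembled into document skeletons. The DG names and fields below are invented placeholders, not the 5 header, 48 section, and 17 entry DGs actually defined in the study.

```python
# Sketch of composing clinical documents from reusable data groups
# (DGs). DG names, fields, and the two-level split are invented
# placeholders for illustration only.
HEADER_DGS = {"patient-id": ["id", "issuer"], "encounter": ["date", "ward"]}
SECTION_DGS = {"chief-complaint": ["text"], "diagnosis": ["code", "system"]}

def build_document(header: list, sections: list) -> dict:
    """Assemble a document skeleton from named, reusable building blocks."""
    unknown = [d for d in header + sections
               if d not in HEADER_DGS and d not in SECTION_DGS]
    if unknown:
        raise ValueError(f"unknown data groups: {unknown}")
    return {
        "header":   {d: dict.fromkeys(HEADER_DGS[d]) for d in header},
        "sections": {d: dict.fromkeys(SECTION_DGS[d]) for d in sections},
    }

admission_note = build_document(
    header=["patient-id", "encounter"],
    sections=["chief-complaint", "diagnosis"],
)
print(sorted(admission_note["sections"]))  # ['chief-complaint', 'diagnosis']
```

    Because every document type is assembled from the same governed DGs, two different documents that both use the "diagnosis" block carry it with identical structure and semantics, which is what makes them comparable across systems.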

  12. A federated semantic metadata registry framework for enabling interoperability across clinical research and care domains.

    Science.gov (United States)

    Sinaci, A Anil; Laleci Erturkmen, Gokce B

    2013-10-01

    In order to enable secondary use of Electronic Health Records (EHRs) by bridging the interoperability gap between the clinical care and research domains, this paper introduces a unified methodology and supporting framework that brings together the power of metadata registries (MDR) and semantic web technologies. We introduce a federated semantic metadata registry framework by extending the ISO/IEC 11179 standard, and enable integration of data element registries through Linked Open Data (LOD) principles, whereby each Common Data Element (CDE) can be uniquely referenced, queried and processed to enable syntactic and semantic interoperability. Each CDE and its components are maintained as LOD resources enabling semantic links with other CDEs, terminology systems and implementation-dependent content models, hence facilitating semantic search, more effective reuse and semantic interoperability across different application domains. There are several important efforts addressing semantic interoperability in the healthcare domain, such as the IHE DEX profile proposal, CDISC SHARE and CDISC2RDF. Our architecture complements these by providing a framework to interlink existing data element registries and repositories, multiplying their potential for semantic interoperability to a greater extent. An open source implementation of the federated semantic MDR framework presented in this paper is the core of the semantic interoperability layer of the SALUS project, which enables the execution of post-marketing safety analysis studies on top of existing EHR systems.
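
    The core idea, data elements published as uniquely referenceable linked resources whose meaning comes from terminology bindings, can be sketched like this. The registry URIs and concept codes are invented for the example.

```python
# Sketch of ISO/IEC 11179-style data elements published as linked
# resources: each CDE has a URI and a link to a terminology concept,
# and two CDEs are semantically comparable when they share a concept.
# All URIs and codes below are invented.
CDE_REGISTRY = {
    "http://mdr-a.example.org/cde/birthDate": {
        "concept": "http://loinc.example.org/21112-8",
        "datatype": "date",
    },
    "http://mdr-b.example.org/cde/dob": {
        "concept": "http://loinc.example.org/21112-8",
        "datatype": "date",
    },
    "http://mdr-b.example.org/cde/gender": {
        "concept": "http://snomed.example.org/263495000",
        "datatype": "code",
    },
}

def semantically_equivalent(cde1: str, cde2: str) -> bool:
    """Two data elements from different registries mean the same thing
    when they are annotated with the same terminology concept."""
    return CDE_REGISTRY[cde1]["concept"] == CDE_REGISTRY[cde2]["concept"]

print(semantically_equivalent(
    "http://mdr-a.example.org/cde/birthDate",
    "http://mdr-b.example.org/cde/dob",
))  # True
```

    The two registries never agreed on an element name ("birthDate" vs "dob"); the shared concept link is what makes them interoperable, which is exactly the LOD-based federation the paper proposes.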

  13. Semantic and syntactic interoperability in online processing of big Earth observation data.

    Science.gov (United States)

    Sudmanns, Martin; Tiede, Dirk; Lang, Stefan; Baraldi, Andrea

    2018-01-01

    The challenge of enabling syntactic and semantic interoperability for comprehensive and reproducible online processing of big Earth observation (EO) data is still unsolved. Supporting both types of interoperability is one of the requirements for efficiently extracting valuable information from the large amount of available multi-temporal gridded data sets. The proposed system wraps world models (semantic interoperability) into OGC Web Processing Services (syntactic interoperability) for semantic online analyses. World models describe spatio-temporal entities and their relationships in a formal way. The proposed system serves as an enabler for (1) technical interoperability, using a standardised interface that can be used by all types of clients, and (2) allowing experts from different domains to develop complex analyses together as a collaborative effort. Users connect the world models online to the data, which are maintained in centralised storage as 3D spatio-temporal data cubes. This also allows non-experts to extract valuable information from EO data, because data management, low-level interactions and specific software issues can be ignored. We discuss the concept of the proposed system, provide a technical implementation example, and describe three use cases for extracting changes from EO images, demonstrating usability also for non-EO, gridded, multi-temporal data sets (CORINE land cover).

  14. Mediation, Alignment, and Information Services for Semantic interoperability (MAISSI): A Trade Study

    National Research Council Canada - National Science Library

    Barlos, Fotis; Hunter, Dan; Krikeles, Basil; McDonough, James

    2007-01-01

    Semantic Interoperability (SI) encompasses a broad range of technologies, such as data mediation and schema matching, ontology alignment, and context representation, that attempt to enable systems to understand each other's semantics.

  15. Semantic Interoperability Almost Without Using The Same Vocabulary: Is It Possible?

    Science.gov (United States)

    Krisnadhi, A. A.

    2016-12-01

    Semantic interoperability, which is a key requirement in realizing cross-repository data integration, is often understood as using the same ontology or vocabulary. Consequently, within a particular domain, one can easily assume that there has to be one unifying domain ontology covering as many vocabulary terms in the domain as possible in order to realize any form of data integration across multiple data sources. Furthermore, the desire to provide very precise definitions of those many terms has led to the development of huge foundational and domain ontologies that are comprehensive, but too complicated, restrictive, monolithic, and difficult to use and reuse, which causes many data providers to avoid using them. This problem is especially acute in a domain as diverse as the geosciences, where it is virtually impossible to reach agreement on the semantics of many terms (e.g., there are hundreds of definitions of 'forest' in use throughout the world). To overcome this challenge, a modular ontology architecture has emerged in recent years, fueled, among other things, by advances in ontology design pattern research. Each ontology pattern models only one key notion and can act as a small module of a larger ontology. Such a module is developed in such a way that it is largely independent of how other notions in the same domain are modeled, which leads to increased reusability. Furthermore, an ontology formed out of such modules has improved understandability over large, monolithic ontologies. Semantic interoperability in this architecture is achieved not by enforcing the use of the same vocabulary, but by promoting alignment to the same ontology patterns. In this work, we elaborate how this architecture realizes the above idea. In particular, we describe how multiple data sources with differing perspectives and vocabularies can interoperate through this architecture, building the solution upon semantic technologies such as Linked Data and the Web Ontology Language (OWL).
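
    A minimal sketch of pattern-based (rather than vocabulary-based) interoperability: two hypothetical data providers keep their own field names but align them to the roles of one shared observation pattern, after which a single query spans both sources. All pattern and source names are invented.

```python
# Interoperability via a shared ontology pattern rather than a shared
# vocabulary: each source keeps its own terms but aligns them to the
# roles of one pattern. Pattern roles and source names are invented.
# One small pattern: an Observation has a property, a value, and a place.
PATTERN_ROLES = ("property", "value", "place")

# Each provider maps its local field names onto the pattern's roles.
ALIGNMENTS = {
    "agencyA": {"param": "property", "reading": "value", "site": "place"},
    "agencyB": {"variable": "property", "val": "value", "station": "place"},
}

def to_pattern(source: str, record: dict) -> dict:
    """Re-express a source-specific record in the shared pattern's roles."""
    return {ALIGNMENTS[source][k]: v for k, v in record.items()}

a = to_pattern("agencyA", {"param": "air-temp", "reading": 21.5, "site": "P1"})
b = to_pattern("agencyB", {"variable": "air-temp", "val": 19.0, "station": "P2"})

# One query over both sources, phrased only in pattern roles:
temps = [r["value"] for r in (a, b) if r["property"] == "air-temp"]
print(temps)  # [21.5, 19.0]
```

    Neither agency had to adopt the other's vocabulary; only the small alignment to the pattern's roles was needed, which is the modular-architecture argument in miniature.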

  16. Conceptual Model Formalization in a Semantic Interoperability Service Framework: Transforming Relational Database Schemas to OWL.

    Science.gov (United States)

    Bravo, Carlos; Suarez, Carlos; González, Carolina; López, Diego; Blobel, Bernd

    2014-01-01

    Healthcare information is distributed across multiple heterogeneous and autonomous systems. Access to, and sharing of, distributed information sources is a challenging task. To contribute to meeting this challenge, this paper presents a formal, complete and semi-automatic transformation service from Relational Databases to the Web Ontology Language. The proposed service makes use of an algorithm that can transform data models from several different domains by deploying mainly inheritance rules. The paper emphasizes the relevance of integrating the proposed approach into an ontology-based interoperability service to achieve semantic interoperability.
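
    A drastically simplified version of such a transformation can be sketched as follows: tables map to OWL classes, columns to datatype properties, and foreign keys to object properties. The schema and the Turtle-like output are illustrative assumptions, not the paper's full rule set (which also handles inheritance).

```python
# Toy relational-to-OWL transformation: tables become classes, columns
# become datatype properties, foreign keys become object properties.
# Schema and output shape are simplified assumptions.
SCHEMA = {
    "patient": {"columns": ["name", "birth_date"], "fks": {}},
    "visit": {"columns": ["date"], "fks": {"patient_id": "patient"}},
}

def to_owl(schema: dict, ns: str = "http://example.org/onto#") -> str:
    lines = []
    for table, spec in schema.items():
        lines.append(f"<{ns}{table}> a owl:Class .")
        for col in spec["columns"]:
            lines.append(f"<{ns}{table}.{col}> a owl:DatatypeProperty ; "
                         f"rdfs:domain <{ns}{table}> .")
        for fk, target in spec["fks"].items():
            lines.append(f"<{ns}{table}.{fk}> a owl:ObjectProperty ; "
                         f"rdfs:domain <{ns}{table}> ; "
                         f"rdfs:range <{ns}{target}> .")
    return "\n".join(lines)

turtle = to_owl(SCHEMA)
print("owl:ObjectProperty" in turtle)  # True
```

    Once every source database is lifted into OWL this way, the resulting ontologies can be aligned and queried with the same semantic tooling, which is what the ontology-based interoperability service relies on.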

  17. Interoperable cross-domain semantic and geospatial framework for automatic change detection

    Science.gov (United States)

    Kuo, Chiao-Ling; Hong, Jung-Hong

    2016-01-01

    With the increasingly diverse types of geospatial data established over the last few decades, semantic interoperability in integrated applications has attracted much interest in the field of Geographic Information Systems (GIS). This paper proposes a new strategy and framework to process cross-domain geodata at the semantic level. The framework leverages the semantic equivalence of concepts between domains through a bridge ontology and facilitates the integrated use of data from different domains, which has long been considered an essential strength of GIS, but is impeded by the lack of understanding of the semantics implicitly hidden in the data. We choose the task of change detection to demonstrate how the introduction of ontology concepts can effectively make the integration possible. We analyze the common properties of geodata and change detection factors, then construct rules and summarize possible change scenarios for making final decisions. The use of topographic map data to detect changes in land use shows promising success, as far as the improvement of efficiency and level of automation is concerned. We believe the ontology-oriented approach will enable a new way of integrating data across different domains from the perspective of semantic interoperability, and even open a new dimension for the future of GIS.
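
    The rule-based flavor of this approach can be sketched in miniature: bridge rules declare which land-cover classes from different mapping domains are semantically equivalent, and change rules label the transitions that count as real change. All classes and rules here are invented examples, not the paper's actual rule set.

```python
# Sketch of ontology-style change rules: classes from two epochs are
# compared, and bridge/change rules decide which transitions count as
# meaningful change. Classes and rules are invented examples.
# Transitions considered real land-use change (from, to):
CHANGE_RULES = {
    ("forest", "built-up"): "deforestation/urbanization",
    ("cropland", "built-up"): "urbanization",
}
# Class pairs treated as semantically equivalent across mapping domains:
EQUIVALENT = {("woodland", "forest")}

def detect_change(before: str, after: str):
    """Return a change label, or None if the classes mean the same thing."""
    if before == after or (before, after) in EQUIVALENT \
            or (after, before) in EQUIVALENT:
        return None                      # no semantic change
    return CHANGE_RULES.get((before, after), "unclassified change")

print(detect_change("woodland", "forest"))  # None (same concept)
print(detect_change("forest", "built-up"))  # deforestation/urbanization
```

    The bridge rule is what prevents a false positive: without it, a naive string comparison would report "woodland" to "forest" as a change even though the two maps mean the same thing.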

  18. The 'PEARL' Data Warehouse: Initial Challenges Faced with Semantic and Syntactic Interoperability.

    Science.gov (United States)

    Mahmoud, Samhar; Boyd, Andy; Curcin, Vasa; Bache, Richard; Ali, Asad; Miles, Simon; Taweel, Adel; Delaney, Brendan; Macleod, John

    2017-01-01

    Data about patients are available from diverse sources, including those routinely collected as individuals interact with service providers, and those provided directly by individuals through surveys. Linking these data can lead to a more complete picture about the individual, to inform either care decision making or research investigations. However, post-linkage, differences in data recording systems and formats present barriers to achieving these aims. This paper describes an approach to combine linked GP records with study observations, and reports initial challenges related to semantic and syntactic interoperability issues.

  19. RuleML-Based Learning Object Interoperability on the Semantic Web

    Science.gov (United States)

    Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.

    2008-01-01

    Purpose: The present paper aims to describe an approach for building the Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one of the rule uses: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…

  20. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability.

    Science.gov (United States)

    Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A

    2008-02-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).

  1. Emergent Semantics Interoperability in Large-Scale Decentralized Information Systems

    CERN Document Server

    Cudré-Mauroux, Philippe

    2008-01-01

    Peer-to-peer systems are evolving with new information-system architectures, leading to the idea that the principles of decentralization and self-organization will offer new approaches in informatics, especially for systems that scale with the number of users or for which central authorities do not prevail. This book describes a new way of building global agreements (semantic interoperability) based only on decentralized, self-organizing interactions.

  2. Facilitating Semantic Interoperability Among Ocean Data Systems: ODIP-R2R Student Outcomes

    Science.gov (United States)

    Stocks, K. I.; Chen, Y.; Shepherd, A.; Chandler, C. L.; Dockery, N.; Elya, J. L.; Smith, S. R.; Ferreira, R.; Fu, L.; Arko, R. A.

    2014-12-01

    With informatics providing an increasingly important set of tools for geoscientists, it is critical to train the next generation of scientists in information and data techniques. The NSF-supported Rolling Deck to Repository (R2R) Program works with the academic fleet community to routinely document, assess, and preserve the underway sensor data from U.S. research vessels. The Ocean Data Interoperability Platform (ODIP) is an EU-US-Australian collaboration fostering interoperability among regional e-infrastructures through workshops and joint prototype development. The need to align terminology between systems is a common challenge across all of the ODIP prototypes. Five R2R students were supported to address aspects of semantic interoperability within ODIP: (1) developing a vocabulary matching service that links terms from different vocabularies with similar concepts; the service implements the Google Refine reconciliation service interface, so that users can leverage the Google Refine application as a friendly user interface while linking different vocabulary terms; (2) developing Resource Description Framework (RDF) resources that map Shipboard Automated Meteorological and Oceanographic System (SAMOS) vocabularies to internationally served vocabularies; each SAMOS vocabulary term (data parameter and quality control flag) will be described as an RDF resource page, allowing enhanced discoverability and retrieval of SAMOS data through searches based on parameter; (3) improving data retrieval and interoperability by exposing data and mapped vocabularies using Semantic Web technologies; we have collaborated with ODIP participating organizations to build a generalized data model that will populate a SPARQL endpoint providing expressive querying over our data files; (4) mapping local and regional vocabularies used by R2R to those used by ODIP partners, described more fully in a companion poster; and (5) making published Linked Data
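
    The first project, vocabulary term matching, can be sketched with a simple normalize-and-score reconciliation function in the spirit of a Google Refine-style reconciliation workflow. The target vocabulary and score cutoff are invented for the example.

```python
# Sketch of a vocabulary matching/reconciliation service: terms are
# normalized and scored so a term from one vocabulary can be linked
# to candidates in another. Vocabulary contents are invented.
from difflib import SequenceMatcher

TARGET_VOCAB = ["sea surface temperature", "air temperature", "wind speed"]

def normalize(term: str) -> str:
    return " ".join(term.lower().replace("_", " ").split())

def reconcile(term: str, vocab=TARGET_VOCAB, cutoff=0.6):
    """Return (candidate, score) pairs above the cutoff, best first."""
    q = normalize(term)
    scored = [(t, SequenceMatcher(None, q, normalize(t)).ratio())
              for t in vocab]
    return sorted([(t, round(s, 2)) for t, s in scored if s >= cutoff],
                  key=lambda p: p[1], reverse=True)

print(reconcile("Sea_Surface_Temp")[0][0])  # 'sea surface temperature'
```

    In a real service, the string score would be one signal among several (units, parent concept, usage context), and a human curator would confirm the link before it is published as a vocabulary mapping.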

  3. Quality model for semantic IS standards

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert

    2011-01-01

    Semantic IS (Information Systems) standards are essential for achieving interoperability between organizations. However, a recent survey suggests that the full benefits of standards are not being achieved, due to quality issues. This paper presents a quality model for semantic IS standards, that should

  4. An approach to define semantics for BPM systems interoperability

    Science.gov (United States)

    Rico, Mariela; Caliusco, María Laura; Chiotti, Omar; Rosa Galli, María

    2015-04-01

    This article proposes defining semantics for Business Process Management systems interoperability through an ontology of Electronic Business Documents (EBD) used to interchange the information required to perform cross-organizational processes. The semantic model generated allows aligning enterprises' business processes to support cross-organizational processes by matching the business ontology of each business partner with the EBD ontology. The result is a flexible software architecture that allows dynamically defining cross-organizational business processes by reusing the EBD ontology. For developing the semantic model, a method is presented which is based on a strategy for discovering entity features whose interpretation depends on the context, and representing them to enrich the ontology. The proposed method complements ontology learning techniques that cannot infer semantic features not represented in the data sources. In order to improve the representation of these entity features, the method proposes using widely accepted ontologies for representing time entities and relations, physical quantities, measurement units, official country names, and currencies and funds, among others. When ontology reuse is not possible, the method proposes identifying whether the feature is simple or complex, and defines a strategy to be followed. An empirical validation of the approach has been performed through a case study.

  5. Semantic Interoperability in Czech Healthcare Environment Supported by HL7 Version 3

    Czech Academy of Sciences Publication Activity Database

    Nagy, Miroslav; Hanzlíček, Petr; Přečková, Petra; Říha, Antonín; Dioszegi, Matěj; Seidl, Libor; Zvárová, Jana

    2010-01-01

    Roč. 49, č. 2 (2010), s. 186-195 ISSN 0026-1270 R&D Projects: GA MŠk(CZ) 1M06014; GA AV ČR 1ET200300413 Institutional research plan: CEZ:AV0Z10300504 Keywords : information storage and retrieval * electronic health record * HL7 * semantic interoperability * communication standards Subject RIV: IN - Informatics, Computer Science Impact factor: 1.472, year: 2010

  6. A step-by-step methodology for enterprise interoperability projects

    Science.gov (United States)

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

    Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to help achieve enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.

  7. Geospatial semantic web

    CERN Document Server

    Zhang, Chuanrong; Li, Weidong

    2015-01-01

    This book covers key issues related to the Geospatial Semantic Web, including geospatial web services for spatial data interoperability; geospatial ontology for semantic interoperability; ontology creation, sharing, and integration; querying knowledge and information from heterogeneous data sources; interfaces for the Geospatial Semantic Web; VGI (Volunteered Geographic Information) and the Geospatial Semantic Web; challenges of the Geospatial Semantic Web; and development of Geospatial Semantic Web applications. The book also describes state-of-the-art technologies that attempt to solve these problems, such as WFS, WMS, RDF, OWL, and GeoSPARQL, and demonstrates how to use Geospatial Semantic Web technologies to solve practical real-world problems such as spatial data interoperability.

  8. Large scale healthcare data integration and analysis using the semantic web.

    Science.gov (United States)

    Timm, John; Renly, Sondra; Farkash, Ariel

    2011-01-01

    Healthcare data interoperability can only be achieved when the semantics of the content is well defined and consistently implemented across heterogeneous data sources. Achieving these objectives requires the collaboration of experts from several domains. This paper describes tooling that integrates Semantic Web technologies with common tools to facilitate cross-domain collaborative development for the purposes of data interoperability. Our approach is divided into stages of data harmonization and representation, model transformation, and instance generation. We applied our approach to Hypergenes, an EU-funded project, where we used our method on the Essential Hypertension disease model using a CDA template. Our domain expert partners include clinical providers, clinical domain researchers, healthcare information technology experts, and a variety of clinical data consumers. We show that bringing Semantic Web technologies into the healthcare interoperability toolkit increases opportunities for beneficial collaboration, thus improving patient care and clinical research outcomes.

  9. System and methods of resource usage using an interoperable management framework

    Science.gov (United States)

    Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.

    2017-10-31

    A generic rights expression language allows interoperability across different computing environments, including resource usage by different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework, such as the operational semantics, are standardized, while other areas are deliberately left free of standards, necessitating choice and innovation to achieve a balance of flexibility and usability for interoperability in usage management systems.

  10. Implementation of a metadata architecture and knowledge collection to support semantic interoperability in an enterprise data warehouse.

    Science.gov (United States)

    Dhaval, Rakesh; Borlawsky, Tara; Ostrander, Michael; Santangelo, Jennifer; Kamal, Jyoti; Payne, Philip R O

    2008-11-06

    In order to enhance interoperability between enterprise systems, and improve data validity and reliability throughout The Ohio State University Medical Center (OSUMC), we have initiated the development of an ontology-anchored metadata architecture and knowledge collection for our enterprise data warehouse. The metadata and corresponding semantic relationships stored in the OSUMC knowledge collection are intended to promote consistency and interoperability across the heterogeneous clinical, research, business and education information managed within the data warehouse.

  11. Modelling and approaching pragmatic interoperability of distributed geoscience data

    Science.gov (United States)

    Ma, Xiaogang

    2010-05-01

    Interoperability of geodata, which is essential for sharing information and discovering insights within a cyberinfrastructure, is receiving increasing attention. A key requirement of interoperability in the context of geodata sharing is that data provided by local sources can be accessed, decoded, understood and appropriately used by external users. Various researchers have argued that data interoperability issues arise at four levels: system, syntax, schematics and semantics, which relate respectively to the platform, encoding, structure and meaning of geodata. Ontology-driven approaches to the schematic and semantic interoperability of geodata have been studied extensively in the last decade. Ontologies come in different types (e.g., top-level ontologies, domain ontologies and application ontologies) and display forms (e.g., glossaries, thesauri, conceptual schemas and logical theories). Many geodata providers maintain their own local application ontologies in order to drive standardization in local databases. However, semantic heterogeneities often exist between these local ontologies, even when they derive from equivalent disciplines. In contrast, common ontologies are being studied in different geoscience disciplines (e.g., NAMD, SWEET, etc.) as a standardization procedure to coordinate diverse local ontologies. Semantic mediation, e.g. mapping between local ontologies, or mapping local ontologies to common ontologies, has been studied as an effective way of achieving semantic interoperability between local ontologies and thus reconciling semantic heterogeneities in multi-source geodata. Nevertheless, confusion still exists in the research field of semantic interoperability. One problem is caused by eliminating elements of local pragmatic contexts in semantic mediation. Compared with the context-independent character of a common domain ontology, local application ontologies are closely related to elements (e.g., people, time, location

  12. CityGML - Interoperable semantic 3D city models

    Science.gov (United States)

    Gröger, Gerhard; Plümer, Lutz

    2012-07-01

    CityGML is the international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. It defines the three-dimensional geometry, topology, semantics and appearance of the most relevant topographic objects in urban or regional contexts. These definitions are provided at different, well-defined Levels-of-Detail (multiresolution model). The focus of CityGML is on the semantic aspects of 3D city models, i.e. their structures, taxonomies and aggregations, allowing users to employ virtual 3D city models for advanced analysis and visualization tasks in a variety of application domains such as urban planning, indoor/outdoor pedestrian navigation, environmental simulations, cultural heritage, or facility management. This is in contrast to purely geometrical/graphical models such as KML, VRML, or X3D, which do not provide sufficient semantics. CityGML is based on the Geography Markup Language (GML), which provides a standardized geometry model. Due to this model and its well-defined semantics and structures, CityGML facilitates interoperable data exchange in the context of geo web services and spatial data infrastructures. Since its standardization in 2008, CityGML has come into use on a worldwide scale: tools from notable companies in the geospatial field provide CityGML interfaces, and many applications and projects use the standard. CityGML is also having a strong impact on science: numerous approaches use CityGML, particularly its semantics, for disaster management, emergency response, or energy-related applications as well as for visualizations; contribute to CityGML by improving its consistency and validity; or use CityGML, particularly its different Levels-of-Detail, as a source or target for generalizations. This paper gives an overview of CityGML, its underlying concepts, its Levels-of-Detail, how to extend it, its applications, its likely future development, and the role it plays in scientific research. Furthermore, its
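
The semantic structure that CityGML layers on top of GML can be glimpsed in a minimal sketch. The fragment below is a hand-written toy, not a complete schema-valid CityGML document, read with Python's standard library; the element names follow the CityGML 2.0 building module to the best of our knowledge.

```python
# Minimal sketch: extract the semantic class (function) and a thematic
# attribute (measured height) from a CityGML-like building fragment.
import xml.etree.ElementTree as ET

CITYGML = """<bldg:Building
    xmlns:bldg="http://www.opengis.net/citygml/building/2.0"
    xmlns:gml="http://www.opengis.net/gml" gml:id="B1">
  <bldg:function>residential</bldg:function>
  <bldg:measuredHeight uom="m">12.5</bldg:measuredHeight>
</bldg:Building>"""

ns = {"bldg": "http://www.opengis.net/citygml/building/2.0",
      "gml": "http://www.opengis.net/gml"}
building = ET.fromstring(CITYGML)
function = building.findtext("bldg:function", namespaces=ns)
height = float(building.findtext("bldg:measuredHeight", namespaces=ns))
print(function, height)  # residential 12.5
```

A purely graphical format such as KML would carry the geometry of this building but not the machine-readable `function` classification that analysis tasks rely on.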

  13. An approach for the semantic interoperability of ISO EN 13606 and OpenEHR archetypes.

    Science.gov (United States)

    Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2010-10-01

    The communication between health information systems of hospitals and primary care organizations is currently an important challenge to improve the quality of clinical practice and patient safety. However, clinical information is usually distributed among several independent systems that may be syntactically or semantically incompatible. This fact prevents healthcare professionals from accessing clinical information of patients in an understandable and normalized way. In this work, we address the semantic interoperability of two EHR standards: OpenEHR and ISO EN 13606. Both standards follow the dual model approach which distinguishes information and knowledge, this being represented through archetypes. The solution presented here is capable of transforming OpenEHR archetypes into ISO EN 13606 and vice versa by combining Semantic Web and Model-driven Engineering technologies. The resulting software implementation has been tested using publicly available collections of archetypes for both standards.

  14. Semantic modeling and interoperability in product and process engineering a technology for engineering informatics

    CERN Document Server

    2013-01-01

    In the past decade, feature-based design and manufacturing has gained some momentum in various engineering domains to represent and reuse semantic patterns with effective applicability. However, the actual scope of feature application is still very limited. Semantic Modeling and Interoperability in Product and Process Engineering provides a systematic solution for the challenging engineering informatics field, aiming at the enhancement of sustainable knowledge representation, implementation and reuse on an open and yet practically manageable scale. This semantic modeling technology supports uniform, multi-facet and multi-level collaborative system engineering with heterogeneous computer-aided tools, such as CAD/CAM, CAE, and ERP. The presented unified feature model can be applied to product and process representation, development, implementation and management. Practical case studies and test samples are provided to illustrate applications which can be implemented by readers in real-world scenarios. ...

  15. A Semantic Web-based System for Managing Clinical Archetypes.

    Science.gov (United States)

    Fernandez-Breis, Jesualdo Tomas; Menarguez-Tortosa, Marcos; Martinez-Costa, Catalina; Fernandez-Breis, Eneko; Herrero-Sempere, Jose; Moner, David; Sanchez, Jesus; Valencia-Garcia, Rafael; Robles, Montserrat

    2008-01-01

    Archetypes facilitate the sharing of clinical knowledge and are therefore a basic tool for achieving interoperability between healthcare information systems. In this paper, a Semantic Web system for managing archetypes is presented. This system allows for the semantic annotation of archetypes, as well as for performing semantic searches. The current system is capable of working with both ISO 13606 and OpenEHR archetypes.

  16. Sharing meanings: developing interoperable semantic technologies to enhance reproducibility in earth and environmental science research

    Science.gov (United States)

    Schildhauer, M.

    2015-12-01

    Earth and environmental scientists are familiar with the entities, processes, and theories germane to their field of study, and comfortable collecting and analyzing data in their area of interest. Yet, while there appears to be consistency and agreement as to the scientific "terms" used to describe features in their data and analyses, aside from a few fundamental physical characteristics, such as mass or velocity, there can be broad tolerances, if not considerable ambiguity, in how many earth science "terms" map to the underlying "concepts" that they actually represent. This ambiguity in meanings, or "semantics", creates major problems for scientific reproducibility. It greatly impedes the ability to replicate results by making it difficult to determine the specifics of the intended meanings of terms such as deforestation or carbon flux as to scope, composition, magnitude, etc. In addition, semantic ambiguity complicates the assemblage of comparable data for reproducing results, due to ambiguous or idiosyncratic labels for measurements, such as percent cover of forest, where the term "forest" is undefined; or where a reported output of "total carbon-emissions" might include CO2 emissions but not methane emissions. In this talk, we describe how the NSF-funded DataONE repository for earth and environmental science data (http://dataone.org) is using W3C-standard languages (RDF/OWL) to build an ontology for clarifying concepts embodied in heterogeneous data and model outputs. With an initial focus on carbon cycling concepts using terrestrial biospheric model outputs and LTER productivity data, we describe how we are achieving interoperability with "semantic vocabularies" (or ontologies) from aligned earth and life science domains, including OBO Foundry ontologies such as ENVO and BCO, the ISO/OGC O&M, and the NSF EarthCube GeoLink project. Our talk will also discuss best practices that may be helpful for other groups interested in constructing their own
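
The ambiguity described above is resolved by binding each data label to an explicit concept identifier. The sketch below uses invented placeholder IRIs, not real ENVO, BCO or GeoLink identifiers, to show how such a binding makes "total carbon-emissions" unambiguous and comparability checkable.

```python
# Toy sketch: semantic annotations bind ambiguous column labels to
# explicit concept IRIs. The IRIs are hypothetical placeholders.
ANNOTATIONS = {
    "total_carbon_emissions": "http://example.org/onto#CO2AndCH4Emission",
    "co2_emissions": "http://example.org/onto#CO2Emission",
}

def concept_for(label: str) -> str:
    """Look up the concept IRI a label is bound to; fail loudly otherwise."""
    try:
        return ANNOTATIONS[label]
    except KeyError:
        raise KeyError(f"no semantic annotation for column {label!r}")

def comparable(a: str, b: str) -> bool:
    """Two measurement columns are comparable only if bound to the same concept."""
    return concept_for(a) == concept_for(b)

print(comparable("co2_emissions", "total_carbon_emissions"))  # False
```

Without the binding, the two labels might be silently pooled; with it, the mismatch between CO2-only and CO2-plus-methane emissions is detected before data are combined.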

  17. Semantic Document Model to Enhance Data and Knowledge Interoperability

    Science.gov (United States)

    Nešić, Saša

    To enable document data and knowledge to be efficiently shared and reused across application, enterprise, and community boundaries, desktop documents should be completely open and queryable resources whose data and knowledge are represented in a form understandable to both humans and machines. At the same time, these are the requirements that desktop documents need to satisfy in order to contribute to the vision of the Semantic Web. With the aim of achieving this goal, we have developed the Semantic Document Model (SDM), which turns desktop documents into semantic documents: uniquely identified and semantically annotated composite resources that can be instantiated into human-readable (HR) and machine-processable (MP) forms. In this paper, we present the SDM along with an RDF and ontology-based solution for the MP document instance. Moreover, on top of the proposed model, we have built the Semantic Document Management System (SDMS), which provides a set of services that exploit the model. As an application example that takes advantage of SDMS services, we have extended MS Office with a set of tools that enables users to transform MS Office documents (e.g., MS Word and MS PowerPoint) into semantic documents, and to search local and remote semantic document repositories for document content units (CUs) over Semantic Web protocols.

  18. Interoperability and different ways of knowing: How semantics can aid in cross-cultural understanding

    Science.gov (United States)

    Pulsifer, P. L.; Parsons, M. A.; Duerr, R. E.; Fox, P. A.; Khalsa, S. S.; McCusker, J. P.; McGuinness, D. L.

    2012-12-01

    differences in its application. Furthermore, it is an analog encoding scheme whose meaning has evolved over time. By semantically modeling the egg code, its subtle variations, and how it connects to other data, we illustrate a mechanism for translating across data formats and representations. But there are limits to what semantically modeling the egg code can achieve. The egg code and common operational sea ice formats do not address community needs, notably the timing and processes of sea ice freeze-up and break-up, which have a profound impact on local hunting, shipping, oil exploration, and safety. We work with local experts from four very different Indigenous communities and with the scientific creators of sea ice forecasts to establish an understanding of concepts and terminology related to fall freeze-up and spring break-up in the individually represented regions. This helps expand our conceptions of sea ice while also aiding understanding across cultures and communities, and the passing of knowledge to younger generations. This is an early step in expanding concepts of interoperability to very different ways of knowing, to make data truly relevant and locally useful.

  19. Benchmarking semantic web technology

    CERN Document Server

    García-Castro, R

    2009-01-01

    This book addresses the problem of benchmarking Semantic Web technologies; first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other. The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:

  20. The Fractal Nature of the Semantic Web

    OpenAIRE

    Berners-Lee, Tim; Kagal, Lalana

    2008-01-01

    In the past, many knowledge representation systems failed because they were too monolithic and didn’t scale well, whereas other systems failed to have an impact because they were small and isolated. Along with this trade-off in size, there is also a constant tension between the cost involved in building a larger community that can interoperate through common terms and the cost of the lack of interoperability. The semantic web offers a good compromise between these approaches as it achieves wi...

  1. CCR+: Metadata Based Extended Personal Health Record Data Model Interoperable with the ASTM CCR Standard.

    Science.gov (United States)

    Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong; Kim, Ju Han

    2014-01-01

    Extension of the standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR XML files and one CCR+ XML file were evaluated. In total, 188 metadata elements were extracted from the ASTM CCR standard; these metadata elements are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata and can be used in a smartphone application (Health Avatar CCR+) as part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains: the methods presented here represent an important reference for achieving interoperability between standard and extended models.
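
One structural layer of a multilayered validation of this kind might look like the following sketch. The element names and the required-element list are invented stand-ins, not the actual content of the CCR+ metadata registry.

```python
# Illustrative sketch: a list of required elements, as would be derived
# from a metadata registry, drives a structural check of a CCR-like XML
# instance. Element names and the registry content are hypothetical.
import xml.etree.ElementTree as ET

REQUIRED = ["Patient", "Medications", "Results"]   # from metadata registry

def validate(xml_text, required=REQUIRED):
    """Return the list of required elements missing from the instance."""
    root = ET.fromstring(xml_text)
    return [tag for tag in required if root.find(tag) is None]

print(validate("<CCR><Patient/><Medications/><Results/></CCR>"))  # []
print(validate("<CCR><Patient/></CCR>"))  # ['Medications', 'Results']
```

Semantic layers would then check, for example, that each element's content is bound to the registered metadata concept, beyond this purely structural pass.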

  2. Enabling interoperability in planetary sciences and heliophysics: The case for an information model

    Science.gov (United States)

    Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.

    2018-01-01

    The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) level standards for trusted digital archives, information model development, and metadata registries. Whereas controlled vocabularies provide a basic level of interoperability by supplying a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, or additional related context, for the terms. The Information Model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic level, and application level interoperability. We define these types of interoperability and focus on semantic level interoperability, the type of interoperability most directly enabled by an information model.
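
The contrast the record draws between a controlled vocabulary and an ontology can be shown in miniature. The terms and relations below are invented examples, not actual PDS4 vocabulary or model content.

```python
# Sketch: a controlled vocabulary is a flat set of shared terms; an
# ontology adds relations that give each term context. All terms and
# relations here are hypothetical.
vocabulary = {"target", "instrument", "observation"}

ontology = {
    "observation": {"made_by": "instrument", "of": "target"},
    "instrument":  {"part_of": "spacecraft"},
}

def related(term):
    """Follow one level of relations a flat vocabulary alone cannot express."""
    return ontology.get(term, {})

assert "target" in vocabulary                     # vocabulary: term exists
assert related("observation")["of"] == "target"   # ontology: adds context
```

Two agencies sharing only the vocabulary agree on terms; sharing the ontology, they also agree on how an observation relates to the instrument and target it involves.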

  3. An Integrated Framework to Achieve Interoperability in Person-Centric Health Management

    Directory of Open Access Journals (Sweden)

    Fabio Vergari

    2011-01-01

    Full Text Available The need for high-quality out-of-hospital healthcare is a known socioeconomic problem. Ad-hoc telemedicine solutions exploiting the evolution of ICT have been proposed in the past. Integrating such ad-hoc solutions in order to cost-effectively support the entire healthcare cycle is still a research challenge. In order to handle the heterogeneity of relevant information and to overcome the fragmentation of out-of-hospital instrumentation in person-centric healthcare systems, a shared and open-source interoperability component can be adopted, which is ontology-driven and based on the Semantic Web data model. The feasibility and the advantages of the proposed approach are demonstrated by presenting the use case of real-time monitoring of patients' health and their environmental context.

  4. Toward semantic interoperability with linked foundational ontologies in ROMULUS

    CSIR Research Space (South Africa)

    Khan, ZC

    2013-06-01

    Full Text Available A purpose of a foundational ontology is to solve interoperability issues among ontologies. Many foundational ontologies have been developed, reintroducing the ontology interoperability problem. We address this with the new online foundational...

  5. Interoperable Multimedia Annotation and Retrieval for the Tourism Sector

    NARCIS (Netherlands)

    Chatzitoulousis, Antonios; Efraimidis, Pavlos S.; Athanasiadis, I.N.

    2015-01-01

    The Atlas Metadata System (AMS) employs semantic web annotation techniques in order to create an interoperable information annotation and retrieval platform for the tourism sector. AMS adopts state-of-the-art metadata vocabularies, annotation techniques and semantic web technologies.

  6. On the applicability of schema integration techniques to database interoperation

    NARCIS (Netherlands)

    Vermeer, Mark W.W.; Apers, Peter M.G.

    1996-01-01

    We discuss the applicability of schema integration techniques developed for tightly-coupled database interoperation to interoperation of databases stemming from different modelling contexts. We illustrate that in such an environment, it is typically quite difficult to infer the real-world semantics

  7. Clinical information modeling processes for semantic interoperability of electronic health records: systematic review and inductive analysis.

    Science.gov (United States)

    Moreno-Conde, Alberto; Moner, David; Cruz, Wellington Dimas da; Santos, Marcelo R; Maldonado, José Alberto; Robles, Montserrat; Kalra, Dipak

    2015-07-01

    This systematic review aims to identify and compare the existing processes and methodologies that have been published in the literature for defining clinical information models (CIMs) that support the semantic interoperability of electronic health record (EHR) systems. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) systematic review methodology, the authors reviewed papers published between 2000 and 2013 that covered the semantic interoperability of EHRs, found by searching the PubMed, IEEE Xplore, and ScienceDirect databases. Additionally, after selection of a final group of articles, an inductive content analysis was done to summarize the steps and methodologies followed in order to build the CIMs described in those articles. Three hundred and seventy-eight articles were screened and thirty-six were selected for full review. The articles selected for full review were analyzed to extract relevant information for the analysis and characterized according to the steps the authors had followed for clinical information modeling. Most of the reviewed papers lack a detailed description of the modeling methodologies used to create CIMs. A representative example is the lack of description related to the definition of terminology bindings and the publication of the generated models. However, this systematic review confirms that most clinical information modeling activities follow very similar steps for the definition of CIMs. Having a robust and shared methodology could improve their correctness, reliability, and quality. Independently of implementation technologies and standards, it is possible to find common patterns in methods for developing CIMs, suggesting the viability of defining a unified good practice methodology to be used by any clinical information modeler. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.

  8. The Semantic Management of Environmental Resources within the Interoperable Context of the EuroGEOSS: Alignment of GEMET and the GEOSS SBAs

    Science.gov (United States)

    Cialone, Claudia; Stock, Kristin

    2010-05-01

    EuroGEOSS is a European Commission funded project. It aims at improving the scientific understanding of the complex mechanisms that drive the changes affecting our planet, and at identifying and establishing interoperable arrangements between environmental information systems. These systems would be sustained and operated by organizations with a clear mandate and resources, and made available following the specifications of existing frameworks such as GEOSS (the Global Earth Observation System of Systems) and INSPIRE (the Infrastructure for Spatial Information in the European Community). The EuroGEOSS project's infrastructure focuses on three thematic areas: forestry, drought and biodiversity. One of the important activities in the project is the retrieval, parsing and harmonization of the large amount of heterogeneous environmental data available at local, regional and global levels across these strategic areas. The challenge is to render it semantically and technically interoperable in a simple way. An initial step in achieving this semantic and technical interoperability involves the selection of appropriate classification schemes (for example, thesauri, ontologies and controlled vocabularies) to describe the resources in the EuroGEOSS framework. These classifications become a crucial part of the interoperable framework scaffolding because they allow data providers to describe their resources and thus support resource discovery, execution and orchestration at varying levels of complexity. However, at present, given the diverse range of environmental thesauri, controlled vocabularies and ontologies and the large number of resources provided by project participants, the selection of appropriate classification schemes involves a number of considerations. First of all, there is the semantic difficulty of selecting classification schemes that contain concepts relevant to each thematic area. Secondly, EuroGEOSS is intended to accommodate a number of

  9. A Proof-of-Concept for Semantically Interoperable Federation of IoT Experimentation Facilities

    Directory of Open Access Journals (Sweden)

    Jorge Lanza

    2016-06-01

    Full Text Available The Internet-of-Things (IoT) is unanimously identified as one of the main pillars of future smart scenarios. The potential of IoT technologies and deployments has already been demonstrated in a number of different application areas, including transport, energy, safety and healthcare. However, despite the growing number of IoT deployments, the majority of IoT applications tend to be self-contained, thereby forming application silos. A lightweight, data-centric integration and combination of these silos presents several challenges that still need to be addressed. Indeed, the ability to combine and synthesize data streams and services from diverse IoT platforms and testbeds holds the promise to increase the potential of smart applications in terms of size, scope and targeted business context. In this article, a proof-of-concept implementation that federates two different IoT experimentation facilities by means of semantic-based technologies is described. The specification and design of the implemented system and information models are described together with the practical details of the developments carried out and their integration with the existing IoT platforms supporting the aforementioned testbeds. Overall, the system described in this paper demonstrates that it is possible to open new horizons in the development of IoT applications and experiments at a global scale, transcending the (silo) boundaries of individual deployments, based on the semantic interconnection and interoperability of diverse IoT platforms and testbeds.

  10. A Proof-of-Concept for Semantically Interoperable Federation of IoT Experimentation Facilities.

    Science.gov (United States)

    Lanza, Jorge; Sanchez, Luis; Gomez, David; Elsaleh, Tarek; Steinke, Ronald; Cirillo, Flavio

    2016-06-29

    The Internet-of-Things (IoT) is unanimously identified as one of the main pillars of future smart scenarios. The potential of IoT technologies and deployments has been already demonstrated in a number of different application areas, including transport, energy, safety and healthcare. However, despite the growing number of IoT deployments, the majority of IoT applications tend to be self-contained, thereby forming application silos. A lightweight data centric integration and combination of these silos presents several challenges that still need to be addressed. Indeed, the ability to combine and synthesize data streams and services from diverse IoT platforms and testbeds, holds the promise to increase the potentiality of smart applications in terms of size, scope and targeted business context. In this article, a proof-of-concept implementation that federates two different IoT experimentation facilities by means of semantic-based technologies will be described. The specification and design of the implemented system and information models will be described together with the practical details of the developments carried out and its integration with the existing IoT platforms supporting the aforementioned testbeds. Overall, the system described in this paper demonstrates that it is possible to open new horizons in the development of IoT applications and experiments at a global scale, that transcend the (silo) boundaries of individual deployments, based on the semantic interconnection and interoperability of diverse IoT platforms and testbeds.

  11. Extending the GI Brokering Suite to Support New Interoperability Specifications

    Science.gov (United States)

    Boldrini, E.; Papeschi, F.; Santoro, M.; Nativi, S.

    2014-12-01

    The GI brokering suite provides the discovery, access, and semantic Brokers (i.e. GI-cat, GI-axe, GI-sem) that empower a Brokering framework for multi-disciplinary and multi-organizational interoperability. The GI suite has been successfully deployed in the framework of several programmes and initiatives, such as European Union funded projects, NSF BCube, and the intergovernmental coordinated effort Global Earth Observation System of Systems (GEOSS). Each GI suite Broker facilitates interoperability for a particular functionality (i.e. discovery, access, semantic extension) among a set of brokered resources published by autonomous providers (e.g. data repositories, web services, semantic assets) and a set of heterogeneous consumers (e.g. client applications, portals, apps). A wide set of data models, encoding formats, and service protocols are already supported by the GI suite, such as the ones defined by international standardizing organizations like OGC and ISO (e.g. WxS, CSW, SWE, GML, netCDF) and by Community specifications (e.g. THREDDS, OpenSearch, OPeNDAP, ESRI APIs). Using the GI suite, resources published by a particular Community or organization through their specific technology (e.g. OPeNDAP/netCDF) can be transparently discovered, accessed, and used by different Communities utilizing their preferred tools (e.g. a GIS visualizing WMS layers). Since Information Technology is a moving target, new standards and technologies continuously emerge and are adopted in the Earth Science context too. Therefore, the GI Brokering suite was conceived to be flexible and to accommodate new interoperability protocols and data models. For example, the GI suite has recently added support for widely used specifications introduced to implement Linked Data, the Semantic Web, and specific community needs. Among others, these include DCAT, an RDF vocabulary designed to facilitate interoperability between Web data catalogs, and CKAN, a data management system for data distribution, particularly used by

  12. Towards an Approach of Semantic Access Control for Cloud Computing

    Science.gov (United States)

    Hu, Luokai; Ying, Shi; Jia, Xiangyang; Zhao, Kai

    With the development of cloud computing, the mutual understandability among distributed Access Control Policies (ACPs) has become an important issue in the security field of cloud computing. Semantic Web technology provides the solution to semantic interoperability of heterogeneous applications. In this paper, we analyze existing access control methods and present a new Semantic Access Control Policy Language (SACPL) for describing ACPs in cloud computing environments. An Access Control Oriented Ontology System (ACOOS) is designed as the semantic basis of SACPL. The ontology-based SACPL language can effectively solve the interoperability issue of distributed ACPs. This study enriches the research on applying Semantic Web technology in the field of security, and provides a new way of thinking about access control in cloud computing.

  13. Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.

    Science.gov (United States)

    Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert

    2017-08-01

    Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries and a sector of the food industry-fresh produce-have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good. © 2017 Institute of Food Technologists®.

  14. Semantic Web Technologies as the Foundation for the Information Infrastructure

    NARCIS (Netherlands)

    Van Oosterom, Peter; Zlatanova, S.; Van Harmelen, Frank; Van Oosterom, Peter; Zlatanova, S

    2008-01-01

    The Semantic Web has arisen over the past few years as a realistic option for a world-wide Information Infrastructure, with its promises of semantic interoperability and serendipitous reuse. In this paper we will analyse the essential ingredients of semantic technologies, what makes them suitable as

  15. Using software interoperability to achieve a virtual design environment

    Science.gov (United States)

    Gregory, G. Groot; Koshel, R. John

    2005-09-01

    A variety of simulation tools, including optical design and analysis, have benefited from many years of evolution in software functionality and computing power, thus making the notion of virtual design environments a reality. To simulate the optical characteristics of a system, one needs to include optical performance, mechanical design and manufacturing aspects simultaneously. To date, no single software program offers a universal solution. One approach to achieve an integrated environment is to select tools that offer a high degree of interoperability. This allows the selection of the best tools for each aspect of the design working in concert to solve the problem. This paper discusses the issues of how to assemble a design environment and provides an example of a combination of tools for illumination design. We begin by offering a broad definition of interoperability from an optical analysis perspective. This definition includes aspects of file interchange formats, software communications protocols and customized applications. One example solution is proposed by combining SolidWorks for computer-aided design (CAD), TracePro for optical analysis and MATLAB as the mathematical engine for tolerance analysis. The resulting virtual tool will be applied to a lightpipe design task to illustrate how such a system can be used.

  16. Design of large-scale enterprise interoperable value webs

    NARCIS (Netherlands)

    Hofman, W.J.

    2011-01-01

    Still, a lot of enterprises are faced with the issue of interoperability. Whereas large enterprises are able to implement the required technology, SMEs (Small and Medium sized Enterprises) face challenges as they lack knowledge and budget. Enterprises have defined their specific semantics and

  17. Auto-Generated Semantic Processing Services

    Science.gov (United States)

    Davis, Rodney; Hupf, Greg

    2009-01-01

    Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating- system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer- based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: In effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms is replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.
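    The specification-over-implementation idea behind AGSP can be sketched in Python: an adapter for a binary message layout is generated from a declarative metadata spec rather than hand-coded. The message spec, field names and formats below are invented for illustration; AGSP itself works from much richer platform-independent metadata.

```python
import struct

# Hypothetical declarative spec: (field name, struct format character) pairs.
SPEC = [("msg_id", "H"), ("temp_c", "f"), ("flags", "B")]

def make_adapter(spec):
    """Generate a parser for the binary layout described by the spec."""
    fmt = ">" + "".join(f for _, f in spec)  # big-endian wire order
    names = [n for n, _ in spec]
    size = struct.calcsize(fmt)
    def parse(buf):
        return dict(zip(names, struct.unpack(fmt, buf[:size])))
    return parse

parse = make_adapter(SPEC)
wire = struct.pack(">HfB", 7, 21.5, 3)   # a sample message as sent on the wire
decoded = parse(wire)
```

    Changing the spec changes the generated adapter without touching any parsing code, which is the point of driving conversions from metadata.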

  18. Interoperability of Geographic Information: A Communication Process –Based Prototype

    Directory of Open Access Journals (Sweden)

    Jean Brodeur

    2005-04-01

    Full Text Available Since 1990, municipal, state/provincial, and federal governments have developed numerous geographic databases to fulfill their organizations' specific needs. As a result, the same real-world topographic phenomena have been abstracted differently, for instance vegetation (surface), trees (surface), wooded area (line), wooded area (point and line), milieu boisé (surface), zone boisée (unknown geometry). Today, information about these geographic phenomena is accessible on the Internet from Web infrastructures specially developed to simplify their access. Early in the nineties, the development of interoperability of geographic information was undertaken to solve syntactic, structural, and semantic heterogeneities as well as spatial and temporal heterogeneities to facilitate sharing and integration of such data. Recently, we have proposed a new conceptual framework for interoperability of geographic information based on the human communication process, cognitive science, and ontology, and introduced geosemantic proximity, a reasoning methodology to qualify dynamically the semantic similarity between geographic abstractions. This framework could be of interest to other disciplines. This paper presents the details of our framework for interoperability of geographic information as well as a prototype.

  19. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran.

    Science.gov (United States)

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E

    2014-01-01

    Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) is an essential factor in utilizing immunization CDS systems. Service oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical across systems. CDS-generated reports conformed to immunization guidelines and the calculations for next visit times were accurate. Interoperability is rare or nonexistent between IIS. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information.
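    The record round trip whose consistency was evaluated above can be illustrated with a simplified serialise-and-parse cycle. This is only a sketch: the element names below are invented and are not the actual HL7 v3 message schema.

```python
import xml.etree.ElementTree as ET

# Illustrative only -- not real HL7 v3: a sender serialises an immunization
# record to XML, a receiver parses it back, and the two sides must agree.
def to_xml(record):
    root = ET.Element("immunizationRecord")
    for key, value in record.items():
        ET.SubElement(root, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

def from_xml(xml_text):
    return {child.tag: child.text for child in ET.fromstring(xml_text)}

sent = {"patientId": "INF-0042", "vaccine": "OPV", "doseNumber": "2"}
received = from_xml(to_xml(sent))
```

    The evaluation criterion in the paper amounts to `received == sent` holding for every exchanged record.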

  20. Towards technical interoperability in telemedicine.

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard Layne, II

    2004-05-01

    For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.

  1. Improved semantic interoperability for content reuse through knowledge organization systems

    Directory of Open Access Journals (Sweden)

    José Antonio Moreiro González

    2012-04-01

    Full Text Available Knowledge Organization Systems (KOS) are resources designed to improve knowledge interoperability, management and retrieval. As web resources increase, the lack of KOS becomes evident, with a consequent impact on resource interoperability. KOS are, by definition, complicated and costly tools, as much in their creation as in their management. The reuse of similar organizational structures is a necessary element in this context. The paper analyses experiences of KOS reuse and indicates how the new standards bear on this aspect.

  2. Combining Archetypes with Fast Health Interoperability Resources in Future-proof Health Information Systems.

    Science.gov (United States)

    Bosca, Diego; Moner, David; Maldonado, Jose Alberto; Robles, Montserrat

    2015-01-01

    Messaging standards, and specifically HL7 v2, are heavily used for the communication and interoperability of Health Information Systems. HL7 FHIR was created as an evolution of the messaging standards to achieve semantic interoperability. FHIR is somehow similar to other approaches like the dual model methodology as both are based on the precise modeling of clinical information. In this paper, we demonstrate how we can apply the dual model methodology to standards like FHIR. We show the usefulness of this approach for data transformation between FHIR and other specifications such as HL7 CDA, EN ISO 13606, and openEHR. We also discuss the advantages and disadvantages of defining archetypes over FHIR, and the consequences and outcomes of this approach. Finally, we exemplify this approach by creating a testing data server that supports both FHIR resources and archetypes.
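    The cross-standard transformation this record discusses can be sketched as a mapping from a FHIR-style resource to an EN ISO 13606-style entry. Everything below is heavily simplified and invented (including the archetype identifier); real archetype-driven transformation works from formal archetype definitions, not hand-written mappings.

```python
# Invented, simplified structures: the same demographic content expressed as a
# FHIR-like resource is re-expressed as a 13606-like archetyped entry.
fhir_patient = {
    "resourceType": "Patient",
    "name": [{"family": "Garcia", "given": ["Maria"]}],
    "birthDate": "1980-01-15",
}

def fhir_to_13606(patient):
    name = patient["name"][0]
    return {
        "archetype_id": "CEN-EN13606-ENTRY.demographics.v1",  # invented id
        "items": {
            "family_name": name["family"],
            "given_name": name["given"][0],
            "date_of_birth": patient["birthDate"],
        },
    }

entry = fhir_to_13606(fhir_patient)
```

    The dual-model approach replaces ad hoc functions like this with declarative archetypes, so the mapping itself becomes shareable clinical knowledge.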

  3. The caCORE Software Development Kit: Streamlining construction of interoperable biomedical information services

    Directory of Open Access Journals (Sweden)

    Warzel Denise

    2006-01-01

    Full Text Available Background: Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs). The National Cancer Institute (NCI) developed the cancer common ontologic representation environment (caCORE) to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK) was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. Results: The caCORE SDK requires a Unified Modeling Language (UML) tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR) using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG) program, to create compatible data services. caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has

  4. W2E--Wellness Warehouse Engine for Semantic Interoperability of Consumer Health Data.

    Science.gov (United States)

    Honko, Harri; Andalibi, Vafa; Aaltonen, Timo; Parak, Jakub; Saaranen, Mika; Viik, Jari; Korhonen, Ilkka

    2016-11-01

    Novel health monitoring devices and applications allow consumers easy and ubiquitous ways to monitor their health status. However, technologies from different providers lack both technical and semantic interoperability and hence the resulting health data are often deeply tied to a specific service, which is limiting its reusability and utilization in different services. We have designed a Wellness Warehouse Engine (W2E) that bridges this gap and enables seamless exchange of data between different services. W2E provides interfaces to various data sources and makes data available via unified representational state transfer application programming interface to other services. Importantly, it includes Unifier--an engine that allows transforming input data into generic units reusable by other services, and Analyzer--an engine that allows advanced analysis of input data, such as combining different data sources into new output parameters. In this paper, we describe the architecture of W2E and demonstrate its applicability by using it for unifying data from four consumer activity trackers, using a test base of 20 subjects each carrying out three different tracking sessions. Finally, we discuss challenges of building a scalable Unifier engine for the ever-enlarging number of new devices.
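    The Unifier idea described above can be sketched as per-source normalisation rules. The tracker names, field names and units below are invented for illustration; W2E's actual rules operate over its unified REST representation.

```python
# Sketch of a Unifier: raw samples from two trackers use different keys and
# units; a per-source rule maps each to one generic representation that other
# services can reuse. All source/field names here are illustrative.
RULES = {
    "trackerA": lambda s: {"distance_m": s["dist_km"] * 1000.0, "steps": s["steps"]},
    "trackerB": lambda s: {"distance_m": float(s["distance_m"]), "steps": s["step_count"]},
}

def unify(sample):
    return RULES[sample["source"]](sample)

raw = [
    {"source": "trackerA", "dist_km": 2.5, "steps": 3200},
    {"source": "trackerB", "distance_m": 1800, "step_count": 2100},
]
unified = [unify(s) for s in raw]
total_steps = sum(u["steps"] for u in unified)  # cross-device aggregation
```

    Once samples share units and keys, an Analyzer-style component can combine them into new output parameters regardless of which device produced them.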

  5. Requirements of a security framework for the semantic web

    CSIR Research Space (South Africa)

    Mbaya, IR

    2009-02-01

    Full Text Available The vision of the Semantic Web is to provide the World Wide Web with the ability to automate, interoperate and reason about resources and services on the Web. However, the autonomous, dynamic, open, distributed and heterogeneous nature of the Semantic Web...

  6. Semantically Enhanced Online Configuration of Feedback Control Schemes.

    Science.gov (United States)

    Milis, Georgios M; Panayiotou, Christos G; Polycarpou, Marios M

    2018-03-01

    Recent progress toward the realization of the "Internet of Things" has improved the ability of physical and soft/cyber entities to operate effectively within large-scale, heterogeneous systems. It is important that such capacity be accompanied by feedback control capabilities sufficient to ensure that the overall systems behave according to their specifications and meet their functional objectives. To achieve this, such systems require new architectures that facilitate the online deployment, composition, interoperability, and scalability of control system components. Most current control systems lack scalability and interoperability because their design is based on a fixed configuration of specific components, with knowledge of their individual characteristics only implicitly passed through the design. This paper addresses the need for flexibility when replacing components or installing new components, which might occur when an existing component is upgraded or when a new application requires a new component, without the need to readjust or redesign the overall system. A semantically enhanced feedback control architecture is introduced for a class of systems, aimed at accommodating new components into a closed-loop control framework by exploiting the semantic inference capabilities of an ontology-based knowledge model. This architecture supports continuous operation of the control system, a crucial property for large-scale systems for which interruptions have negative impact on key performance metrics that may include human comfort and welfare or economy costs. A case-study example from the smart buildings domain is used to illustrate the proposed architecture and semantic inference mechanisms.

  7. A semantically rich and standardised approach enhancing discovery of sensor data and metadata

    Science.gov (United States)

    Kokkinaki, Alexandra; Buck, Justin; Darroch, Louise

    2016-04-01

    The marine environment plays an essential role in the earth's climate. To enhance the ability to monitor the health of this important system, innovative sensors are being produced and combined with state of the art sensor technology. As the number of sensors deployed is continually increasing, it is a challenge for data users to find the data that meet their specific needs. Furthermore, users need to integrate diverse ocean datasets originating from the same or even different systems. Standards provide a solution to the above mentioned challenges. The Open Geospatial Consortium (OGC) has created Sensor Web Enablement (SWE) standards that enable different sensor networks to establish syntactic interoperability. When combined with widely accepted controlled vocabularies, they become semantically rich and semantic interoperability is achievable. In addition, Linked Data is the recommended best practice for exposing, sharing and connecting information on the Semantic Web using Uniform Resource Identifiers (URIs), Resource Description Framework (RDF) and RDF Query Language (SPARQL). As part of the EU-funded SenseOCEAN project, the British Oceanographic Data Centre (BODC) is working on the standardisation of sensor metadata enabling 'plug and play' sensor integration. Our approach combines standards, controlled vocabularies and persistent URIs to publish sensor descriptions, their data and associated metadata as 5 star Linked Data and OGC SWE (SensorML, Observations & Measurements) standard. Thus sensors become readily discoverable, accessible and useable via the web. Content and context based searching is also enabled since sensors descriptions are understood by machines. Additionally, sensor data can be combined with other sensor or Linked Data datasets to form knowledge. This presentation will describe the work done in BODC to achieve syntactic and semantic interoperability in the sensor domain. It will illustrate the reuse and extension of the Semantic Sensor
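    Content-based discovery over Linked Data sensor descriptions can be sketched as triple-pattern matching, the operation SPARQL generalises. The URIs below are shortened stand-ins (the `sosa:` terms echo the real SOSA/SSN vocabulary, but the identifiers are invented).

```python
# A tiny in-memory triple store of sensor descriptions.
TRIPLES = {
    ("sensor:42", "rdf:type", "sosa:Sensor"),
    ("sensor:42", "sosa:observes", "prop:SeaTemperature"),
    ("sensor:42", "sosa:isHostedBy", "platform:glider-1"),
    ("sensor:77", "rdf:type", "sosa:Sensor"),
    ("sensor:77", "sosa:observes", "prop:Salinity"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None is a wildcard, like a SPARQL variable."""
    return {(ts, tp, to) for ts, tp, to in TRIPLES
            if s in (None, ts) and p in (None, tp) and o in (None, to)}

# "Which sensors observe sea temperature?" as a single pattern:
hits = {s for s, _, _ in match(p="sosa:observes", o="prop:SeaTemperature")}
```

    Because the vocabulary terms are shared and machine-readable, the same query works against descriptions published by any compliant provider.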

  8. The GEOSS solution for enabling data interoperability and integrative research.

    Science.gov (United States)

    Nativi, Stefano; Mazzetti, Paolo; Craglia, Max; Pirrone, Nicola

    2014-03-01

    Global sustainability research requires an integrative research effort underpinned by digital infrastructures (systems) able to harness data and heterogeneous information across disciplines. Digital data and information sharing across systems and applications is achieved by implementing interoperability: a property of a product or system to work with other products or systems, present or future. There are at least three main interoperability challenges a digital infrastructure must address: technological, semantic, and organizational. In recent years, important international programs and initiatives are focusing on such an ambitious objective. This manuscript presents and combines the studies and the experiences carried out by three relevant projects, focusing on the heavy metal domain: Global Mercury Observation System, Global Earth Observation System of Systems (GEOSS), and INSPIRE. This research work recognized a valuable interoperability service bus (i.e., a set of standards models, interfaces, and good practices) proposed to characterize the integrative research cyber-infrastructure of the heavy metal research community. In the paper, the GEOSS common infrastructure is discussed implementing a multidisciplinary and participatory research infrastructure, introducing a possible roadmap for the heavy metal pollution research community to join GEOSS as a new Group on Earth Observation community of practice and develop a research infrastructure for carrying out integrative research in its specific domain.

  9. Model for Semantically Rich Point Cloud Data

    Science.gov (United States)

    Poux, F.; Neuville, R.; Hallot, P.; Billen, R.

    2017-10-01

    This paper proposes an interoperable model for managing high dimensional point clouds while integrating semantics. Point clouds from sensors are a direct source of information physically describing a 3D state of the recorded environment. As such, they are an exhaustive representation of the real world at every scale: 3D reality-based spatial data. Their generation is increasingly fast but processing routines and data models lack of knowledge to reason from information extraction rather than interpretation. The enhanced smart point cloud developed model allows to bring intelligence to point clouds via 3 connected meta-models while linking available knowledge and classification procedures that permits semantic injection. Interoperability drives the model adaptation to potentially many applications through specialized domain ontologies. A first prototype is implemented in Python and PostgreSQL database and allows to combine semantic and spatial concepts for basic hybrid queries on different point clouds.

  10. MODEL FOR SEMANTICALLY RICH POINT CLOUD DATA

    Directory of Open Access Journals (Sweden)

    F. Poux

    2017-10-01

    Full Text Available This paper proposes an interoperable model for managing high dimensional point clouds while integrating semantics. Point clouds from sensors are a direct source of information physically describing a 3D state of the recorded environment. As such, they are an exhaustive representation of the real world at every scale: 3D reality-based spatial data. Their generation is increasingly fast but processing routines and data models lack of knowledge to reason from information extraction rather than interpretation. The enhanced smart point cloud developed model allows to bring intelligence to point clouds via 3 connected meta-models while linking available knowledge and classification procedures that permits semantic injection. Interoperability drives the model adaptation to potentially many applications through specialized domain ontologies. A first prototype is implemented in Python and PostgreSQL database and allows to combine semantic and spatial concepts for basic hybrid queries on different point clouds.
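    The hybrid semantic-plus-spatial queries this record mentions can be sketched in pure Python (the paper's prototype uses Python with a PostgreSQL database; the class names and coordinates below are made up).

```python
# Toy in-memory hybrid query: each point carries coordinates plus a semantic
# class, so one query can filter on space and meaning together.
points = [
    {"xyz": (1.0, 2.0, 0.1), "cls": "ground"},
    {"xyz": (1.2, 2.1, 2.8), "cls": "vegetation"},
    {"xyz": (9.5, 8.0, 3.1), "cls": "building"},
    {"xyz": (1.1, 2.3, 3.0), "cls": "vegetation"},
]

def hybrid_query(pts, cls, bbox_min, bbox_max):
    """Select points of a semantic class inside an axis-aligned bounding box."""
    def inside(p):
        return all(lo <= c <= hi for c, lo, hi in zip(p, bbox_min, bbox_max))
    return [p for p in pts if p["cls"] == cls and inside(p["xyz"])]

veg_nearby = hybrid_query(points, "vegetation", (0, 0, 0), (5, 5, 5))
```

    In the smart point cloud model the `cls` label would come from the linked classification procedures and domain ontology rather than being stored as a plain string.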

  11. IHE based interoperability - benefits and challenges.

    Science.gov (United States)

    Wozak, Florian; Ammenwerth, Elske; Hörbst, Alexander; Sögner, Peter; Mair, Richard; Schabetsberger, Thomas

    2008-01-01

    Optimized workflows and communication between institutions involved in a patient's treatment process can lead to improved quality and efficiency in the healthcare sector. Electronic Health Records (EHRs) provide a patient-centered access to clinical data across institutional boundaries supporting the above mentioned aspects. Interoperability is regarded as a vital success factor. However, a clear definition of interoperability does not exist. The aim of this work is to define and to assess interoperability criteria as required for EHRs. The definition and assessment of interoperability criteria is supported by the analysis of existing literature and personal experience as well as by discussions with several domain experts. Criteria for interoperability address the following aspects: interfaces, semantics, legal and organizational aspects, and security. The Integrating the Healthcare Enterprise (IHE) initiative's profiles make a major contribution to these aspects, but they also raise new problems. Flexibility for adaptation to different organizational/regional or other specific conditions is missing. Regional or national initiatives should be given the possibility to realize their specific needs within the boundaries of IHE profiles. Security so far is an optional element, which is one of IHE's greatest omissions. An integrated security approach seems preferable. Irrespective of the practical significance of the IHE profiles so far, it appears to be of great importance that the profiles are constantly checked against practical experience and continuously adapted.

  12. Using ontological inference and hierarchical matchmaking to overcome semantic heterogeneity in remote sensing-based biodiversity monitoring

    Science.gov (United States)

    Nieland, Simon; Kleinschmit, Birgit; Förster, Michael

    2015-05-01

    Ontology-based applications hold promise in improving spatial data interoperability. In this work we use remote sensing-based biodiversity information and apply semantic formalisation and ontological inference to show improvements in data interoperability/comparability. The proposed methodology includes an observation-based, "bottom-up" engineering approach for remote sensing applications and gives a practical example of semantic mediation of geospatial products. We apply the methodology to three different nomenclatures used for remote sensing-based classification of two heathland nature conservation areas in Belgium and Germany. We analysed sensor nomenclatures with respect to their semantic formalisation and their bio-geographical differences. The results indicate that a hierarchical and transparent nomenclature is far more important for transferability than the sensor or study area. The inclusion of additional information, not necessarily belonging to a vegetation class description, is a key factor for the future success of using semantics for interoperability in remote sensing.
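    The hierarchical matchmaking used above can be sketched as subsumption reasoning over a shared class hierarchy. The hierarchy and class names below are invented for illustration; the paper's ontologies formalise actual vegetation nomenclatures.

```python
# Two nomenclatures map their classes into one shared hierarchy; two classes
# are comparable when one subsumes (is an ancestor of) the other.
PARENT = {  # child -> parent in the shared (invented) ontology
    "dry_heath": "heathland",
    "wet_heath": "heathland",
    "heathland": "vegetation",
    "woodland": "vegetation",
}

def ancestors(cls):
    out = []
    while cls in PARENT:
        cls = PARENT[cls]
        out.append(cls)
    return out

def subsumes(general, specific):
    return general == specific or general in ancestors(specific)

# Nomenclature A's coarse "heathland" covers nomenclature B's finer "dry_heath":
comparable = subsumes("heathland", "dry_heath")
```

    This is why a transparent, hierarchical nomenclature transfers well: matching only needs the hierarchy, not the sensor or the study area.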

  13. Supporting spatial data harmonization process with the use of ontologies and Semantic Web technologies

    Science.gov (United States)

    Strzelecki, M.; Iwaniak, A.; Łukowicz, J.; Kaczmarek, I.

    2013-10-01

    Nowadays, spatial information is not only used by professionals, but also by common citizens, who use it in their daily activities. The Open Data initiative states that data should be freely and unreservedly available for all users. This also applies to spatial data. As spatial data becomes widely available it is essential to publish it in a form which guarantees the possibility of integrating it with other, heterogeneous data sources. Interoperability is the ability to combine spatial data sets from different sources in a consistent way, as well as to provide access to them. Providing syntactic interoperability based on well-known data formats is relatively simple, unlike providing semantic interoperability, due to the multiple possible interpretations of the data. One of the issues connected with the problem of achieving interoperability is data harmonization. It is a process of providing access to spatial data in a representation that allows combining it with other harmonized data in a coherent way by using a common set of data product specifications. Spatial data harmonization is performed by creating definitions of reclassification and transformation rules (mapping schema) for the source application schema. Creation of those rules is a very demanding task which requires wide domain knowledge and a detailed look into application schemas. The paper focuses on proposing methods for supporting the data harmonization process, by automated or supervised creation of mapping schemas with the use of ontologies, ontology matching methods and Semantic Web technologies.
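    A mapping schema of the kind described above can be sketched as declarative rename and reclassification rules applied to source features. The attribute names and code values below are invented; the paper's contribution is deriving such rules semi-automatically with ontologies rather than writing them by hand.

```python
# A tiny declarative mapping schema: rename source attributes, then reclassify
# coded values into the target nomenclature. All names/codes are illustrative.
MAPPING = {
    "rename": {"LU_CODE": "landUse", "NAME_PL": "name"},
    "reclassify": {"landUse": {"311": "forest", "112": "urban"}},
}

def harmonise(feature, mapping):
    """Apply rename rules, then value reclassification, to one source feature."""
    out = {mapping["rename"].get(k, k): v for k, v in feature.items()}
    for attr, table in mapping["reclassify"].items():
        if attr in out:
            out[attr] = table.get(out[attr], out[attr])
    return out

src = {"LU_CODE": "311", "NAME_PL": "Bory Dolnoslaskie"}
harmonised = harmonise(src, MAPPING)
```

    Because the rules are data rather than code, an ontology-matching step can propose or verify them before they are applied to whole datasets.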

  14. The MADE reference information model for interoperable pervasive telemedicine systems

    NARCIS (Netherlands)

    Fung, L.S.N.; Jones, Valerie M.; Hermens, Hermanus J.

    2017-01-01

    Objectives: The main objective is to develop and validate a reference information model (RIM) to support semantic interoperability of pervasive telemedicine systems. The RIM is one component within a larger, computer-interpretable "MADE language" developed by the authors in the context of the

  15. Open PHACTS: Semantic interoperability for drug discovery

    NARCIS (Netherlands)

    Williams, A.J.; Harland, L.; Groth, P.T.; Pettifer, S.; Chichester, C.; Willighagen, E.L.; Evelo, C.T.; Blomberg, N.; Ecker, G.; Goble, C.A.; Mons, B.

    2012-01-01

    Open PHACTS is a public-private partnership between academia, publishers, small and medium sized enterprises and pharmaceutical companies. The goal of the project is to deliver and sustain an 'open pharmacological space' using and enhancing state-of-the-art semantic web standards and technologies.

  16. Semantic Web Technologies for Mobile Context-Aware Services

    National Research Council Canada - National Science Library

    Sadeh, Norman M

    2006-01-01

    The emergence of Semantic Web Services and automated service discovery, access and composition functionality will enable higher levels of interoperability and automation across a broad range of contexts (e.g...

  17. Comprehensive Analysis of Semantic Web Reasoners and Tools: A Survey

    Science.gov (United States)

    Khamparia, Aditya; Pandey, Babita

    2017-01-01

    Ontologies are emerging as the best representation techniques for knowledge-based context domains. The continuing need for interoperation, collaboration and effective information retrieval has led to the creation of the semantic web, with the help of tools and reasoners that manage personalized information. The future of the semantic web lies in an ontology…

  18. ICD-11 (JLMMS) and SCT Inter-Operation.

    Science.gov (United States)

    Mamou, Marzouk; Rector, Alan; Schulz, Stefan; Campbell, James; Solbrig, Harold; Rodrigues, Jean-Marie

    2016-01-01

    The goal of this work is to contribute to smooth and semantically sound inter-operation between ICD-11 (International Classification of Diseases, 11th revision, Joint Linearization for Mortality, Morbidity and Statistics) and SNOMED CT (SCT). To guarantee such inter-operation between a classification characterized by a single hierarchy of mutually exclusive and exhaustive classes, as is the JLMMS successor of ICD-10, on the one hand, and the multi-hierarchical, ontology-based clinical terminology SCT on the other hand, we use ontology axioms that logically express generalizable truths, formulated in the compositional grammar of SCT, together with queries on the axioms of SCT. We test the feasibility of the method on the circulatory chapter of ICD-11 JLMMS and present limitations and results.
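    The core operation behind such queries, testing whether a concept from a multi-hierarchical terminology falls under a single-hierarchy class, is a transitive-closure subsumption check. The sketch below illustrates the idea only; the concept names and is-a links are invented, not real SCT or ICD-11 content.

```python
# Toy multi-parent is-a hierarchy (multiple parents allowed, as in SCT).
TOY_IS_A = {
    "MyocardialInfarction": {"IschaemicHeartDisease", "MyocardialDisorder"},
    "IschaemicHeartDisease": {"HeartDisease"},
    "MyocardialDisorder": {"HeartDisease"},
    "HeartDisease": {"CirculatoryDisorder"},
}

def ancestors(concept):
    """Transitive closure of is-a: every concept that subsumes `concept`."""
    seen, stack = set(), [concept]
    while stack:
        for parent in TOY_IS_A.get(stack.pop(), ()):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

def subsumed_by(concept, candidate_class):
    """Would `concept` fall under a class rooted at `candidate_class`?"""
    return candidate_class == concept or candidate_class in ancestors(concept)

subsumed_by("MyocardialInfarction", "CirculatoryDisorder")  # True in this toy
```

    Checking that the classes of a chapter are mutually exclusive then amounts to verifying that no terminology concept is subsumed by more than one of them.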

  19. A Dynamic Defense Modeling and Simulation Methodology using Semantic Web Services

    Directory of Open Access Journals (Sweden)

    Kangsun Lee

    2010-04-01

    Full Text Available Defense modeling and simulation requires interoperable and autonomous federates in order to fully simulate the complex behavior of war-fighters and to adapt dynamically to various war-game events, commands and controls. In this paper, we propose a semantic-web-service-based methodology for developing war-game simulations. Our methodology encapsulates war-game logic in a set of web services with additional semantic information in WSDL (Web Service Description Language) and OWL (Web Ontology Language). By utilizing the dynamic discovery and binding power of semantic web services, we are able to dynamically reconfigure federates according to various simulation events. An ASuW (Anti-Surface Warfare) simulator is constructed to demonstrate the methodology, and it successfully shows that the level of interoperability and autonomy can be greatly improved.

  20. The MMI Semantic Framework: Rosetta Stones for Earth Sciences

    Science.gov (United States)

    Rueda, C.; Bermudez, L. E.; Graybeal, J.; Alexander, P.

    2009-12-01

    Semantic interoperability—the exchange of meaning among computer systems—is needed to successfully share data in Ocean Science and across all Earth sciences. The best approach toward semantic interoperability requires a designed framework, and operationally tested tools and infrastructure within that framework. Currently available technologies make a scientific semantic framework feasible, but its development requires sustainable architectural vision and development processes. This presentation outlines the MMI Semantic Framework, including recent progress on it and its client applications. The MMI Semantic Framework consists of tools, infrastructure, and operational and community procedures and best practices, to meet short-term and long-term semantic interoperability goals. The design and prioritization of the semantic framework capabilities are based on real-world scenarios in Earth observation systems. We describe some key use cases, as well as the associated requirements for building the overall infrastructure, which is realized through the MMI Ontology Registry and Repository. This system includes support for community creation and sharing of semantic content, ontology registration, version management, and seamless integration of user-friendly tools and application programming interfaces. The presentation describes the architectural components for semantic mediation, registry and repository for vocabularies, ontology, and term mappings. We show how the technologies and approaches in the framework can address community needs for managing and exchanging semantic information. We will demonstrate how different types of users and client applications exploit the tools and services for data aggregation, visualization, archiving, and integration. Specific examples from OOSTethys (http://www.oostethys.org) and the Ocean Observatories Initiative Cyberinfrastructure (http://www.oceanobservatories.org) will be cited. Finally, we show how semantic augmentation of web

  1. Towards a Semantic Web of Things: A Hybrid Semantic Annotation, Extraction, and Reasoning Framework for Cyber-Physical System

    OpenAIRE

    Wu, Zhenyu; Xu, Yuan; Yang, Yunong; Zhang, Chunhong; Zhu, Xinning; Ji, Yang

    2017-01-01

    Web of Things (WoT) facilitates the discovery and interoperability of Internet of Things (IoT) devices in a cyber-physical system (CPS). Moreover, a uniform knowledge representation of physical resources is quite necessary for further composition, collaboration, and decision-making process in CPS. Though several efforts have integrated semantics with WoT, such as knowledge engineering methods based on semantic sensor networks (SSN), it still could not represent the complex relationships betwe...

  2. Architecture for WSN Nodes Integration in Context Aware Systems Using Semantic Messages

    Science.gov (United States)

    Larizgoitia, Iker; Muguira, Leire; Vazquez, Juan Ignacio

    Wireless sensor networks (WSN) are becoming extremely popular in the development of context-aware systems. Traditionally, WSN have focused on capturing data, which is later analyzed and interpreted on a server with more computational power. In this kind of scenario the problem of representing the sensor information needs to be addressed. Every node in the network might have different sensors attached; therefore the corresponding packet structures will differ. The server has to be aware of the meaning of every single structure and data item in order to interpret them. Multiple sensors, multiple nodes and multiple packet structures, none following a standard format, are neither scalable nor interoperable. Context-aware systems have solved this problem with semantic technologies, which provide a common framework to achieve a standard definition of any domain. Nevertheless, these representations are computationally expensive, so a WSN cannot afford them. The work presented in this paper tries to bridge the gap between the sensor information and its semantic representation by defining a simple architecture that enables the definition of this information natively in a semantic way, integrating the semantic information into the network packets. This has several benefits, the most important being the possibility of promoting every WSN node to a real semantic information source.
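    One common way to make semantic information affordable inside a network packet is to transmit a small code that the gateway expands back into a full ontology term. The following sketch illustrates that pattern under stated assumptions: the codebook, the example URIs, and the 6-byte packet layout are invented here, not taken from the paper's architecture.

```python
import struct

# Hypothetical codebook: a WSN node sends a compact 2-byte code instead of a
# full ontology URI; the gateway expands it back to a semantic property.
CODEBOOK = {
    1: "http://example.org/sensors#AirTemperature",
    2: "http://example.org/sensors#RelativeHumidity",
}
REVERSE = {uri: code for code, uri in CODEBOOK.items()}

def pack_reading(property_uri: str, value: float) -> bytes:
    """2-byte property code + 4-byte float: cheap enough for a sensor node."""
    return struct.pack("!Hf", REVERSE[property_uri], value)

def unpack_reading(payload: bytes):
    """Gateway side: recover the semantic property and the measured value."""
    code, value = struct.unpack("!Hf", payload)
    return CODEBOOK[code], value

pkt = pack_reading("http://example.org/sensors#AirTemperature", 21.5)
prop, val = unpack_reading(pkt)  # a 6-byte packet instead of a URI string
```

    The shared codebook plays the role of the common semantic framework: both ends agree on the meaning once, and the per-packet cost stays constant regardless of how long the underlying ontology identifiers are.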

  3. Governance of Interoperability in Intergovernmental Services - Towards an Empirical Taxonomy

    Directory of Open Access Journals (Sweden)

    Herbert Kubicek

    2008-12-01

    Full Text Available High-quality and convenient online delivery of governmental services often requires the seamless exchange of data between two or more government agencies. Smooth data exchange, in turn, requires interoperability of the databases and workflows in the agencies involved. Interoperability (IOP) is a complex issue covering purely technical aspects such as transmission protocols and data exchange formats, content-related semantic aspects such as identifiers and the meaning of codes, and organizational, contractual or legal issues. Starting from IOP frameworks which classify what has to be standardized, this paper, based on an ongoing research project, adopts a political and managerial view and tries to clarify the governance of achieving IOP, i.e. where and by whom IOP standards are developed and established and how they are put into operation. By analyzing 32 cases of successful implementation of IOP in E-Government services within the European Union, empirical indicators for different aspects of governance are proposed and applied to develop an empirical taxonomy of types of IOP governance, which can be used for future comparative research on success factors, barriers, etc.

  4. Basic semantic architecture of interoperability for the intelligent distribution in the CFE electrical system; Arquitectura base de interoperabilidad semantica para el sistema electrico de distribucion inteligente en la CFE

    Energy Technology Data Exchange (ETDEWEB)

    Espinosa Reza, Alfredo; Garcia Mendoza, Raul; Borja Diaz, Jesus Fidel; Sierra Rodriguez, Benjamin [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2010-07-01

    The physical and logical architecture of the interoperability platform defined for the distribution management systems (DMS) of the Distribution Subdivision of the Comision Federal de Electricidad (CFE) in Mexico is presented. The adopted architecture includes the definition of a technological platform to manage the exchange of information between systems and applications, grounded in the Common Information Model (CIM) established in standards IEC 61968 and IEC 61970. The architecture, based on SSOA (Semantic Services Oriented Architecture), EIB (Enterprise Integration Bus) and GID (Generic Interface Definition), is presented, as well as the sequence followed to achieve interoperability of the systems related to the management of electrical energy distribution in Mexico. The paper likewise describes the process of establishing a semantic model of the electrical distribution system (SED) and the creation of CIM/XML instances, oriented to the interoperability of the information systems in the DMS scope, by means of the interchange of messages conformant to, and validated against, the structure established by the CIM rules. In this way, the messages and information exchanged among systems are guaranteed to be compatible and correctly interpreted independently of the developer, brand or manufacturer of the source and destination systems. The primary objective is to establish the base semantic interoperability infrastructure, founded on standards, that underpins the strategic definition of an Intelligent Distribution Electrical System (SEDI) in Mexico.

  5. Collaborative ocean resource interoperability - multi-use of ocean data on the semantic web

    OpenAIRE

    Tao, Feng; Campbell, Jon; Pagnani, Maureen; Griffiths, Gwyn

    2009-01-01

    Earth Observations (EO) collect various characteristics of the objective environment using sensors which often have different measuring, spatial and temporal coverage. Making individual observational data interoperable becomes equally important when viewed in the context of its expensive and time-consuming EO operations. Interoperability will improve reusability of existing observations in both the broader context, and with other observations. As a demonstration of the potential offered by se...

  6. Semantic integration of medication data into the EHOP Clinical Data Warehouse.

    Science.gov (United States)

    Delamarre, Denis; Bouzille, Guillaume; Dalleau, Kevin; Courtel, Denis; Cuggia, Marc

    2015-01-01

    Reusing medication data is crucial for many medical research domains, but the semantic integration of such data in a clinical data warehouse (CDW) is quite challenging. Our objective was to develop a reliable and scalable method for integrating prescription data into EHOP (a French CDW). PN13/PHAST was used as the semantic interoperability standard during the ETL process and to store the prescriptions as documents in the CDW. Theriaque was used as the drug knowledge database (DKDB) to annotate the prescription dataset at the finest granularity and to provide semantic capabilities to the EHOP query workbench. The system was evaluated on a clinical data set. Depending on the use case, precision ranged from 52% to 100%; recall was always 100%. Interoperability standards and a DKDB, a document-oriented approach, and finest-granularity annotation are the key factors for successful drug data integration in a CDW.

  7. Documentary languages and knowledge organization systems in the context of the semantic web

    Directory of Open Access Journals (Sweden)

    Marilda Lopes Ginez de Lara

    Full Text Available The aim of this study was to discuss the need for formal documentary languages as a condition for their functioning in the Semantic Web. Based on a bibliographic review, Linked Open Data is presented as an initial condition for the operationalization of the Semantic Web, alongside the Linked Open Vocabularies movement, which aims to promote interoperability among vocabularies. We highlight the Simple Knowledge Organization System (SKOS) format by analyzing its main characteristics, and present the new standard ISO 25964-1/2:2011/2012, Thesauri and interoperability with other vocabularies, which revises previous recommendations and adds requirements for the interoperability and mapping of vocabularies. We discuss conceptual problems in the formalization of vocabularies and the need to invest critically in their operationalization, suggesting alternatives to harness the mapping of vocabularies.
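    The vocabulary mapping that SKOS and ISO 25964 standardize can be pictured with a minimal resolution routine. This is a sketch of the idea only; the thesaurus names and concepts are invented, and real SKOS data would be expressed in RDF rather than Python dictionaries.

```python
# SKOS-style mapping links between two hypothetical thesauri:
# (vocabulary, concept) -> list of (relation, vocabulary, concept).
MAPPINGS = {
    ("ThesaurusA", "Automobiles"): [
        ("exactMatch", "ThesaurusB", "Cars"),
        ("closeMatch", "ThesaurusB", "MotorVehicles"),
    ],
}

def translate(vocab, concept, target_vocab, allow_close=False):
    """Resolve a concept into a target vocabulary, preferring exactMatch."""
    links = MAPPINGS.get((vocab, concept), [])
    accepted = ("exactMatch", "closeMatch") if allow_close else ("exactMatch",)
    for relation in accepted:            # exactMatch always wins if present
        for rel, target, term in links:
            if rel == relation and target == target_vocab:
                return term
    return None

translate("ThesaurusA", "Automobiles", "ThesaurusB")  # -> "Cars"
```

    Distinguishing exact from close matches is precisely what makes such mappings safe to chain: a retrieval system can expand a query through exactMatch links automatically and reserve closeMatch links for recall-oriented searches.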

  8. Semantic SenseLab: Implementing the vision of the Semantic Web in neuroscience.

    Science.gov (United States)

    Samwald, Matthias; Chen, Huajun; Ruttenberg, Alan; Lim, Ernest; Marenco, Luis; Miller, Perry; Shepherd, Gordon; Cheung, Kei-Hoi

    2010-01-01

    Integrative neuroscience research needs a scalable informatics framework that enables semantic integration of diverse types of neuroscience data. This paper describes the use of the Web Ontology Language (OWL) and other Semantic Web technologies for the representation and integration of molecular-level data provided by several databases of the SenseLab suite of neuroscience databases. Based on the original database structure, we semi-automatically translated the databases into OWL ontologies, with manual addition of semantic enrichment. The SenseLab ontologies are extensively linked to other biomedical Semantic Web resources, including the Subcellular Anatomy Ontology, the Brain Architecture Management System, the Gene Ontology, BIRNLex and UniProt. The SenseLab ontologies have also been mapped to the Basic Formal Ontology and the Relation Ontology, which helps ease interoperability with many other existing and future biomedical ontologies for the Semantic Web. In addition, approaches to representing contradictory research statements are described. The SenseLab ontologies are designed for use on the Semantic Web, which enables their integration into a growing collection of biomedical information resources. We demonstrate that our approach can yield significant potential benefits and that the Semantic Web is rapidly becoming mature enough to realize its anticipated promises. The ontologies are available online at http://neuroweb.med.yale.edu/senselab/.

  9. Publication, discovery and interoperability of Clinical Decision Support Systems: A Linked Data approach.

    Science.gov (United States)

    Marco-Ruiz, Luis; Pedrinaci, Carlos; Maldonado, J A; Panziera, Luca; Chen, Rong; Bellika, J Gustav

    2016-08-01

    The high costs involved in the development of Clinical Decision Support Systems (CDSS) make it necessary to share their functionality across different systems and organizations. Service Oriented Architectures (SOA) have been proposed to allow reusing CDSS by encapsulating them in a Web service. However, strong barriers in sharing CDS functionality are still present as a consequence of lack of expressiveness of services' interfaces. Linked Services are the evolution of the Semantic Web Services paradigm to process Linked Data. They aim to provide semantic descriptions over SOA implementations to overcome the limitations derived from the syntactic nature of Web services technologies. To facilitate the publication, discovery and interoperability of CDS services by evolving them into Linked Services that expose their interfaces as Linked Data. We developed methods and models to enhance CDS SOA as Linked Services that define a rich semantic layer based on machine interpretable ontologies that powers their interoperability and reuse. These ontologies provided unambiguous descriptions of CDS services properties to expose them to the Web of Data. We developed models compliant with Linked Data principles to create a semantic representation of the components that compose CDS services. To evaluate our approach we implemented a set of CDS Linked Services using a Web service definition ontology. The definitions of Web services were linked to the models developed in order to attach unambiguous semantics to the service components. All models were bound to SNOMED-CT and public ontologies (e.g. Dublin Core) in order to count on a lingua franca to explore them. Discovery and analysis of CDS services based on machine interpretable models was performed reasoning over the ontologies built. Linked Services can be used effectively to expose CDS services to the Web of Data by building on current CDS standards. This allows building shared Linked Knowledge Bases to provide machine

  10. Design and Implementation of e-Health System Based on Semantic Sensor Network Using IETF YANG

    Directory of Open Access Journals (Sweden)

    Wenquan Jin

    2018-02-01

    Full Text Available Healthcare services can now be delivered effectively to patients anytime and anywhere using e-Health systems. e-Health systems are developed through Information and Communication Technologies (ICT) that involve sensors, mobiles, and web-based applications for the delivery of healthcare services and information. Remote healthcare is an important purpose of the e-Health system. Usually, an e-Health system includes heterogeneous sensors from diverse manufacturers producing data in different formats. Device interoperability and data normalization are challenging tasks that need research attention. Several solutions proposed in the literature rely on manual interpretation through explicit programming. However, programmatically implementing the interpretation of the data sender and data receiver for each transmission is counterproductive, as modification will be required for each new device added to the system. In this paper, an e-Health system with a Semantic Sensor Network (SSN) is proposed to address the device interoperability issue. In the proposed system, we have used IETF YANG for modeling the semantic e-Health data to represent the information of e-Health sensors. This modeling scheme helps in provisioning semantic interoperability between devices and expressing the sensing data in a user-friendly manner. For this purpose, we have developed an ontology for e-Health data that supports different styles of data formats. The ontology is defined in YANG to provide semantic interpretation of sensing data in the system by constructing meta-models of e-Health sensors. The proposed approach assists in the auto-configuration of e-Health sensors and in querying the sensor network with semantic interoperability support for the e-Health system.
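    The data-normalization role that the YANG-modeled ontology plays above can be sketched as a set of adapters that map vendor-specific payloads onto one common record. This is an illustrative sketch under invented assumptions: the vendor formats, field names, and units below are hypothetical, not taken from the paper.

```python
# Common semantic record every adapter must produce, in this field order.
COMMON_FIELDS = ("property", "value", "unit")

def from_vendor_a(msg: dict) -> dict:
    """Hypothetical vendor A reports {'t': celsius_value}."""
    return {"property": "BodyTemperature", "value": msg["t"], "unit": "Cel"}

def from_vendor_b(msg: dict) -> dict:
    """Hypothetical vendor B reports {'temp_f': fahrenheit_value}."""
    celsius = (msg["temp_f"] - 32) * 5 / 9
    return {"property": "BodyTemperature", "value": round(celsius, 2), "unit": "Cel"}

ADAPTERS = {"vendor-a": from_vendor_a, "vendor-b": from_vendor_b}

def to_common(device_type: str, msg: dict) -> dict:
    """Normalize any registered device's payload into the common model."""
    record = ADAPTERS[device_type](msg)
    assert tuple(record) == COMMON_FIELDS  # adapters must fill the full model
    return record

to_common("vendor-b", {"temp_f": 98.6})  # -> 37.0 Cel
```

    A meta-model-driven system goes one step further than this sketch: instead of hand-written adapter functions, the translation is derived from each sensor's declared model, so adding a device means registering a model rather than writing code.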

  11. Integrating semantic dimension into openEHR archetypes for the management of cerebral palsy electronic medical records.

    Science.gov (United States)

    Ellouze, Afef Samet; Bouaziz, Rafik; Ghorbel, Hanen

    2016-10-01

    Integrating the semantic dimension into clinical archetypes is necessary when modeling medical records: it enables semantic interoperability, allows semantic activities to be applied to clinical data, and provides a higher design quality for Electronic Medical Record (EMR) systems. To obtain these advantages, however, designers need archetypes that cover the semantic features of the clinical concepts involved in their specific applications. In fact, most archetypes filed within open repositories are expressed in the Archetype Definition Language (ADL), which defines only the syntactic structure of clinical concepts, weakening semantic activities on EMR content in the semantic web environment. This paper focuses on modeling an EMR prototype for infants affected by Cerebral Palsy (CP), using the dual-model approach and integrating semantic web technologies. Such modeling provides better delivery of quality of care and ensures semantic interoperability between the information systems of all involved therapies. First, the data to be documented are identified and collected from the involved therapies. Subsequently, the data are analyzed and arranged into archetypes expressed in ADL. During this step, open archetype repositories are explored in order to find suitable archetypes. Then, the ADL archetypes are transformed into archetypes expressed in OWL-DL (Web Ontology Language - Description Logic). Finally, we construct an ontological source related to these archetypes, enabling their annotation to facilitate data extraction and making it possible to exercise semantic activities on them. The result is the integration of the semantic dimension into an EMR modeled according to the archetype approach. The feasibility of our solution is shown through the development of a prototype, named "CP-SMS", which ensures semantic exploitation of the CP EMR. This prototype provides the following features: (i) creation of CP EMR instances and their checking by

  12. The eXtensible ontology development (XOD) principles and tool implementation to support ontology interoperability.

    Science.gov (United States)

    He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison

    2018-01-12

    Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) supports different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its associated four principles, which include reusing existing terms and semantic relations from reliable ontologies, developing and applying well-established ontology design patterns (ODPs), and involving community effort in new ontology development, thereby promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications, helping data to be Findable, Accessible, Interoperable and Reusable (FAIR).

  13. Principles of data integration and interoperability in the GEO Biodiversity Observation Network

    Science.gov (United States)

    Saarenmaa, Hannu; Ó Tuama, Éamonn

    2010-05-01

    The goal of the Global Earth Observation System of Systems (GEOSS) is to link existing information systems into a global and flexible network to address nine areas of critical importance to society. One of these "societal benefit areas" is biodiversity and it will be supported by a GEOSS sub-system known as the GEO Biodiversity Observation Network (GEO BON). In planning the GEO BON, it was soon recognised that there are already a multitude of existing networks and initiatives in place worldwide. What has been lacking is a coordinated framework that allows for information sharing and exchange between the networks. Traversing across the various scales of biodiversity, in particular from the individual and species levels to the ecosystems level has long been a challenge. Furthermore, some of the major regions of the world have already taken steps to coordinate their efforts, but links between the regions have not been a priority until now. Linking biodiversity data to that of the other GEO societal benefit areas, in particular ecosystems, climate, and agriculture to produce useful information for the UN Conventions and other policy-making bodies is another need that calls for integration of information. Integration and interoperability are therefore a major theme of GEO BON, and a "system of systems" is very much needed. There are several approaches to integration that need to be considered. Data integration requires harmonising concepts, agreeing on vocabularies, and building ontologies. Semantic mediation of data using these building blocks is still not easy to achieve. Agreements on, or mappings between, the metadata standards that will be used across the networks is a major requirement that will need to be addressed early on. With interoperable metadata, service integration will be possible through registry of registries systems such as GBIF's forthcoming GBDRS and the GEO Clearinghouse. Chaining various services that build intermediate products using workflow

  14. Semantic Observation Integration

    Directory of Open Access Journals (Sweden)

    Werner Kuhn

    2012-09-01

    Full Text Available Although the integration of sensor-based information into analysis and decision making has been a research topic for many years, semantic interoperability has not yet been reached. The advent of user-generated content for the geospatial domain, Volunteered Geographic Information (VGI, makes it even more difficult to establish semantic integration. This paper proposes a novel approach to integrating conventional sensor information and VGI, which is exploited in the context of detecting forest fires. In contrast to common logic-based semantic descriptions, we present a formal system using algebraic specifications to unambiguously describe the processing steps from natural phenomena to value-added information. A generic ontology of observations is extended and profiled for forest fire detection in order to illustrate how the sensing process, and transformations between heterogeneous sensing systems, can be represented as mathematical functions and grouped into abstract data types. We discuss the required ontological commitments and a possible generalization.
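    The abstract's central idea, describing the path from natural phenomena to value-added information as mathematical functions over abstract data types, can be illustrated with a small composable sketch. The phenomena, units, and fire-risk threshold below are invented for the example and are not the paper's actual specification.

```python
from typing import Callable, NamedTuple

class Observation(NamedTuple):
    """Abstract observation type shared by heterogeneous sensing systems."""
    phenomenon: str
    value: float
    unit: str

def convert_f_to_c(obs: Observation) -> Observation:
    """Transformation between sensing systems expressed as a pure function."""
    if obs.unit != "degF":
        return obs
    return obs._replace(value=(obs.value - 32) * 5 / 9, unit="degC")

def classify_fire_risk(threshold_c: float) -> Callable[[Observation], bool]:
    """Value-added information derived from a normalized observation."""
    return lambda obs: obs.unit == "degC" and obs.value >= threshold_c

hot = classify_fire_risk(45.0)
reading = Observation("AirTemperature", 122.0, "degF")
hot(convert_f_to_c(reading))  # 122 degF is 50 degC, above the toy threshold
```

    Because each processing step is a total function over the same observation type, an in-situ thermometer and a volunteer's report can feed the same classifier once their readings are mapped into the shared type, which is the integration the paper formalizes algebraically.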

  15. Interoperability and HealthGRID.

    Science.gov (United States)

    Bescos, C; Schmitt, D; Kass, J; García-Barbero, M; Kantchev, P

    2005-01-01

    GRID technology, with initiatives like the GGF, has the potential to allow both competition and interoperability not only among applications and toolkits, but also among implementations of key services. The pyramid of eHealth interoperability should be built from standards in communication and data security, storage and processing, up to policy initiatives, including organizational protocols, financing procedures and the legal framework. The open challenges for GRID use in clinical fields illustrate the potential of combining grid technologies with medical routine in a wider interoperable framework. The Telemedicine Alliance is a consortium (ESA, WHO and ITU), initiated in 2002, building a vision for the provision of eHealth to European citizens by 2010. In a survey with more than 50 expert interviews, interoperability was identified as the main showstopper to eHealth implementation. Several groups and organizations are already contributing to standardization; the TM-Alliance supports the "e-Health Standardization Coordination Group" (eHSCG). Now, in the design and development phase of GRID technology in health, is the right moment to act towards an interoperable and open framework. The health area should benefit from the initiatives started at the GGF in terms of global architecture and service definitions, as well as from the security and other web services applications developed under the Internet umbrella. There is a risk that existing important results of the standardization efforts in this area are not taken up simply because they are not always known.

  16. Data interoperability software solution for emergency reaction in the Europe Union

    Science.gov (United States)

    Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.

    2015-07-01

    Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision making slower and more difficult. However, the spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite the improvements made in recent years, EMSs have still not solved the problems related to cultural, semantic and linguistic differences, which are the real cause of slower decision making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved by any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction), providing a flexible, extensible solution to the mediation issues. Web services have been adopted as the implementation technology for this paradigm, as they enjoy the most significant academic and industrial visibility and attraction.
Contributions of this work have been validated through the design and development of a cross-border realistic prototype scenario, actively involving both emergency managers and emergency

  17. Product-driven Enterprise Interoperability for Manufacturing Systems Integration

    OpenAIRE

Dassisti, Michele; Panetto, Hervé; Tursi, Angela

    2006-01-01

International audience; The “Babel tower effect”, induced by the heterogeneity of applications available in the operation of enterprises, leads to a considerable lack of “exchangeability” and a risk of semantic loss whenever cooperation has to take place within the same enterprise. Generally speaking, this kind of problem falls under the umbrella of interoperability between local reference information models. This position paper discusses some ideas in this field and traces a research roadmap to ma...

  18. Integrated Semantics Service Platform for the Internet of Things: A Case Study of a Smart Office

    Science.gov (United States)

    Ryu, Minwoo; Kim, Jaeho; Yun, Jaeseok

    2015-01-01

    The Internet of Things (IoT) allows machines and devices in the world to connect with each other and generate a huge amount of data, which has a great potential to provide useful knowledge across service domains. Combining the context of IoT with semantic technologies, we can build integrated semantic systems to support semantic interoperability. In this paper, we propose an integrated semantic service platform (ISSP) to support ontological models in various IoT-based service domains of a smart city. In particular, we address three main problems for providing integrated semantic services together with IoT systems: semantic discovery, dynamic semantic representation, and semantic data repository for IoT resources. To show the feasibility of the ISSP, we develop a prototype service for a smart office using the ISSP, which can provide a preset, personalized office environment by interpreting user text input via a smartphone. We also discuss a scenario to show how the ISSP-based method would help build a smart city, where services in each service domain can discover and exploit IoT resources that are wanted across domains. We expect that our method could eventually contribute to providing people in a smart city with more integrated, comprehensive services based on semantic interoperability. PMID:25608216

  19. Integrated Semantics Service Platform for the Internet of Things: A Case Study of a Smart Office

    Directory of Open Access Journals (Sweden)

    Minwoo Ryu

    2015-01-01

Full Text Available The Internet of Things (IoT) allows machines and devices in the world to connect with each other and generate a huge amount of data, which has a great potential to provide useful knowledge across service domains. Combining the context of IoT with semantic technologies, we can build integrated semantic systems to support semantic interoperability. In this paper, we propose an integrated semantic service platform (ISSP) to support ontological models in various IoT-based service domains of a smart city. In particular, we address three main problems for providing integrated semantic services together with IoT systems: semantic discovery, dynamic semantic representation, and semantic data repository for IoT resources. To show the feasibility of the ISSP, we develop a prototype service for a smart office using the ISSP, which can provide a preset, personalized office environment by interpreting user text input via a smartphone. We also discuss a scenario to show how the ISSP-based method would help build a smart city, where services in each service domain can discover and exploit IoT resources that are wanted across domains. We expect that our method could eventually contribute to providing people in a smart city with more integrated, comprehensive services based on semantic interoperability.

  20. Integrated semantics service platform for the Internet of Things: a case study of a smart office.

    Science.gov (United States)

    Ryu, Minwoo; Kim, Jaeho; Yun, Jaeseok

    2015-01-19

    The Internet of Things (IoT) allows machines and devices in the world to connect with each other and generate a huge amount of data, which has a great potential to provide useful knowledge across service domains. Combining the context of IoT with semantic technologies, we can build integrated semantic systems to support semantic interoperability. In this paper, we propose an integrated semantic service platform (ISSP) to support ontological models in various IoT-based service domains of a smart city. In particular, we address three main problems for providing integrated semantic services together with IoT systems: semantic discovery, dynamic semantic representation, and semantic data repository for IoT resources. To show the feasibility of the ISSP, we develop a prototype service for a smart office using the ISSP, which can provide a preset, personalized office environment by interpreting user text input via a smartphone. We also discuss a scenario to show how the ISSP-based method would help build a smart city, where services in each service domain can discover and exploit IoT resources that are wanted across domains. We expect that our method could eventually contribute to providing people in a smart city with more integrated, comprehensive services based on semantic interoperability.
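The semantic discovery problem named in the three records above can be sketched very simply: IoT resources are described as subject-predicate-object triples, and discovery is a conjunctive query over those descriptions. This is an illustrative stdlib-only sketch; the vocabulary (`hasCapability`, `locatedIn`) and resource names are invented, not the paper's actual model.

```python
# Toy triple store describing IoT resources in a smart office.
# Predicate and resource names are hypothetical, for illustration only.
TRIPLES = [
    ("lamp-1",   "rdf:type",      "Actuator"),
    ("lamp-1",   "hasCapability", "lighting"),
    ("lamp-1",   "locatedIn",     "office-204"),
    ("sensor-7", "rdf:type",      "Sensor"),
    ("sensor-7", "hasCapability", "temperature"),
    ("sensor-7", "locatedIn",     "office-204"),
]

def discover(triples, **constraints):
    """Return subjects whose descriptions satisfy every predicate=object constraint."""
    subjects = {s for s, _, _ in triples}
    for pred, obj in constraints.items():
        subjects &= {s for s, p, o in triples if p == pred and o == obj}
    return sorted(subjects)

# Find every resource in office-204 that can control lighting.
print(discover(TRIPLES, locatedIn="office-204", hasCapability="lighting"))
```

A real platform would replace the list of tuples with an RDF store and the keyword query with SPARQL, but the discovery logic, intersecting the resources that match each constraint, is the same.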

  1. Towards IoT platforms’ integration : Semantic Translations between W3C SSN and ETSI SAREF

    NARCIS (Netherlands)

    Moreira, João Luiz; Daniele, L.M.; Ferreira Pires, Luis; van Sinderen, Marten J.; Wasielewska, Katarzyna; Szmeja, Pawel; Pawlowski, Wieslaw; Ganzha, Maria; Paprzycki, Marcin

    2017-01-01

    Several IoT ontologies have been developed lately to improve the semantic interoperability of IoT solutions. The most popular of these ontologies, the W3C Semantic Sensor Network (SSN), is considered an ontological foundation for diverse IoT initiatives, particularly OpenIoT. With characteristics

  2. OGC Geographic Information Service Deductive Semantic Reasoning Based on Description Vocabularies Reduction

    Directory of Open Access Journals (Sweden)

    MIAO Lizhi

    2015-09-01

Full Text Available As geographic information interoperability and sharing develop, more and more interoperable OGC (Open Geospatial Consortium) Web services (OWS) are generated and published through the internet. These services can facilitate the integration of different scientific applications by searching, finding, and utilizing the large number of scientific data and Web services. However, these services are widely dispersed and hard to find and utilize through effective semantic retrieval. This is especially true when considering the weak semantic description of geographic information service data. Focusing on semantic retrieval and reasoning over the distributed OWS resources, a deductive semantic reasoning method is proposed to describe and search relevant OWS resources. Specifically, ① description words are extracted from OWS metadata files to generate a GISe ontology database and instance database based on geographic ontology, according to basic geographic element categories; ② a description-word reduction model is put forward to implement knowledge reduction on the GISe instance database based on rough set theory and to generate an optimized instance database; ③ the GISe ontology database and optimized instance database are used to implement semantic inference and reasoning over geographic search objects, which serves as an example to demonstrate the efficiency, feasibility and recall ratio of the proposed description-word-based reduction model.
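The rough-set reduction step described above (②) amounts to finding a minimal subset of description words that still distinguishes service instances by category. A minimal sketch under invented data, the attribute names and categories are illustrative, not from the paper:

```python
from itertools import combinations

# Toy instance database: each service is described by binary description
# words and labelled with a category. All names are hypothetical.
INSTANCES = [
    ({"road": 1, "river": 0, "boundary": 0}, "transport"),
    ({"road": 0, "river": 1, "boundary": 0}, "hydrology"),
    ({"road": 0, "river": 1, "boundary": 1}, "hydrology"),
    ({"road": 1, "river": 0, "boundary": 1}, "transport"),
]

def is_reduct(attrs, instances):
    """True if projecting onto `attrs` never maps two categories to one signature."""
    seen = {}
    for values, label in instances:
        sig = tuple(values[a] for a in attrs)
        if seen.setdefault(sig, label) != label:
            return False
    return True

def minimal_reduct(instances):
    """Smallest attribute subset preserving the classification (brute force)."""
    all_attrs = sorted(instances[0][0])
    for size in range(1, len(all_attrs) + 1):
        for attrs in combinations(all_attrs, size):
            if is_reduct(attrs, instances):
                return attrs
    return tuple(all_attrs)

print(minimal_reduct(INSTANCES))  # here a single description word suffices
```

Real rough-set reduct computation uses discernibility matrices rather than brute-force search, but the acceptance test, that the reduced word set preserves the category partition, is exactly `is_reduct`.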

  3. Leveraging the Semantic Web for Adaptive Education

    Science.gov (United States)

    Kravcik, Milos; Gasevic, Dragan

    2007-01-01

    In the area of technology-enhanced learning reusability and interoperability issues essentially influence the productivity and efficiency of learning and authoring solutions. There are two basic approaches how to overcome these problems--one attempts to do it via standards and the other by means of the Semantic Web. In practice, these approaches…

  4. Interoperability between phenotype and anatomy ontologies.

    Science.gov (United States)

    Hoehndorf, Robert; Oellrich, Anika; Rebholz-Schuhmann, Dietrich

    2010-12-15

    Phenotypic information is important for the analysis of the molecular mechanisms underlying disease. A formal ontological representation of phenotypic information can help to identify, interpret and infer phenotypic traits based on experimental findings. The methods that are currently used to represent data and information about phenotypes fail to make the semantics of the phenotypic trait explicit and do not interoperate with ontologies of anatomy and other domains. Therefore, valuable resources for the analysis of phenotype studies remain unconnected and inaccessible to automated analysis and reasoning. We provide a framework to formalize phenotypic descriptions and make their semantics explicit. Based on this formalization, we provide the means to integrate phenotypic descriptions with ontologies of other domains, in particular anatomy and physiology. We demonstrate how our framework leads to the capability to represent disease phenotypes, perform powerful queries that were not possible before and infer additional knowledge. http://bioonto.de/pmwiki.php/Main/PheneOntology.

  5. Towards Interoperable IoT Deployments in Smart Cities - How project VITAL enables smart, secure and cost-effective cities

    OpenAIRE

Schiele, Gregor; Soldatos, John; Mitton, Nathalie

    2014-01-01

    International audience; IoT-based deployments in smart cities raise several challenges, especially in terms of interoperability. In this paper, we illustrate semantic interoperability solutions for IoT systems. Based on these solutions, we describe how the FP7 VITAL project aims to bridge numerous silo IoT deployments in smart cities through repurposing and reusing sensors and data streams across multiple applications without carelessly compromising citizens’ security and privacy. This approa...

  6. Ontology-based knowledge representation for resolution of semantic heterogeneity in GIS

    Science.gov (United States)

    Liu, Ying; Xiao, Han; Wang, Limin; Han, Jialing

    2017-07-01

Lack of semantic interoperability in geographical information systems has been identified as the main obstacle to data sharing and database integration. New methods must be found to overcome the problems of semantic heterogeneity. Ontologies are considered one approach to support geographic information sharing. This paper presents an ontology-driven integration approach to help in detecting and possibly resolving semantic conflicts. Its originality is that each data source participating in the integration process contains an ontology that defines the meaning of its own data. This approach ensures the automation of the integration through regulation of the semantic integration algorithm. Finally, land classification in a field GIS is described as an example.
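The core idea above, each source carrying its own ontology that maps local terms to shared concepts, can be sketched in a few lines. The concept and term names below are invented for illustration; they are not the paper's model.

```python
# Shared (global) ontology concepts; term names are hypothetical.
SHARED = {"Road", "WaterBody", "LandParcel"}

# Each source ships its own local ontology: local term -> shared concept.
SOURCE_A_ONTOLOGY = {"street": "Road", "lake": "WaterBody"}
SOURCE_B_ONTOLOGY = {"road": "Road", "parcel": "LandParcel"}

def integrate(records, local_ontology):
    """Relabel a source's (term, value) records with shared-ontology concepts."""
    out = []
    for local_term, value in records:
        concept = local_ontology.get(local_term)
        if concept not in SHARED:
            raise ValueError(f"unmapped local term: {local_term!r}")
        out.append((concept, value))
    return out

# "street" in source A and "road" in source B resolve to the same concept,
# so the semantic conflict disappears after integration.
merged = (integrate([("street", "A40")], SOURCE_A_ONTOLOGY)
          + integrate([("road", "B12")], SOURCE_B_ONTOLOGY))
print(merged)
```

Detecting a conflict then reduces to finding a local term with no mapping (the raised `ValueError`), which is where a human or a matching algorithm would step in.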

  7. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability

    OpenAIRE

    Komatsoulis, George A.; Warzel, Denise B.; Hartel, Frank W.; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; de Coronado, Sherri; Reeves, Dianne M.; Hadfield, Jillaine B.; Ludet, Christophe; Covitz, Peter A.

    2007-01-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastr...

  8. Connecting Archaeological Data and Grey Literature via Semantic Cross Search

    Directory of Open Access Journals (Sweden)

    Douglas Tudhope

    2011-07-01

Full Text Available Differing terminology and database structure hinder meaningful cross search of excavation datasets. Matching free text grey literature reports with datasets poses yet more challenges. Conventional search techniques are unable to cross search between archaeological datasets and Web-based grey literature. Results are reported from two AHRC funded research projects that investigated the use of semantic techniques to link digital archive databases, vocabularies and associated grey literature. STAR (Semantic Technologies for Archaeological Resources) was a collaboration between the University of Glamorgan, Hypermedia Research Unit and English Heritage (EH). The main outcome is a research Demonstrator (available online), which cross searches over excavation datasets from different database schemas, including Raunds Roman, Raunds Prehistoric, Museum of London, Silchester Roman and Stanwick sampling. The system additionally cross searches over an extract of excavation reports from the OASIS index of grey literature, operated by the Archaeology Data Service (ADS). A conceptual framework provided by the CIDOC Conceptual Reference Model (CRM) integrates the different database structures and the metadata automatically generated from the OASIS reports by natural language processing techniques. The methods employed for extracting semantic RDF representations from the datasets and the information extraction from grey literature are described. The STELLAR project provides freely available tools to reduce the costs of mapping and extracting data to semantic search systems such as the Demonstrator and to linked data representation generally. Detailed use scenarios (and a screen capture video) provide a basis for a discussion of key issues, including cost-benefits, ontology modelling, mapping, terminology control, semantic implementation and information extraction issues. The scenarios show that semantic interoperability can be achieved by mapping and extracting
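The cross-search pattern described above, mapping rows from differently structured databases, plus entities extracted from grey literature text, into one common (CRM-like) shape before querying, can be sketched as follows. Schemas, field names and the toy text extractor are invented for illustration; only the site names come from the abstract.

```python
# Three heterogeneous sources mapped to one common "find" shape.
def from_db_a(row):          # schema A: tuple (site, object_type, period)
    return {"site": row[0], "type": row[1], "period": row[2]}

def from_db_b(row):          # schema B: dict with different field names
    return {"site": row["location"], "type": row["artefact"], "period": row["era"]}

def from_grey_lit(sentence): # toy "information extraction" from report text
    words = sentence.rstrip(".").split()
    return {"site": words[-1], "type": words[1], "period": words[0]}

finds = [
    from_db_a(("Silchester", "coin", "Roman")),
    from_db_b({"location": "Raunds", "artefact": "coin", "era": "Roman"}),
    from_grey_lit("Roman coin recovered at Stanwick."),
]

# One query now spans all three sources: every site with a Roman coin.
roman_coins = [f["site"] for f in finds
               if f["type"] == "coin" and f["period"] == "Roman"]
print(roman_coins)
```

In the actual Demonstrator the common shape is CIDOC CRM expressed in RDF and the query is issued over a triple store, but the principle, per-source mapping into a shared model followed by a single query, is the same.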

  9. The semantic web in translational medicine: current applications and future directions.

    Science.gov (United States)

    Machado, Catia M; Rebholz-Schuhmann, Dietrich; Freitas, Ana T; Couto, Francisco M

    2015-01-01

    Semantic web technologies offer an approach to data integration and sharing, even for resources developed independently or broadly distributed across the web. This approach is particularly suitable for scientific domains that profit from large amounts of data that reside in the public domain and that have to be exploited in combination. Translational medicine is such a domain, which in addition has to integrate private data from the clinical domain with proprietary data from the pharmaceutical domain. In this survey, we present the results of our analysis of translational medicine solutions that follow a semantic web approach. We assessed these solutions in terms of their target medical use case; the resources covered to achieve their objectives; and their use of existing semantic web resources for the purposes of data sharing, data interoperability and knowledge discovery. The semantic web technologies seem to fulfill their role in facilitating the integration and exploration of data from disparate sources, but it is also clear that simply using them is not enough. It is fundamental to reuse resources, to define mappings between resources, to share data and knowledge. All these aspects allow the instantiation of translational medicine at the semantic web-scale, thus resulting in a network of solutions that can share resources for a faster transfer of new scientific results into the clinical practice. The envisioned network of translational medicine solutions is on its way, but it still requires resolving the challenges of sharing protected data and of integrating semantic-driven technologies into the clinical practice. © The Author 2013. Published by Oxford University Press.

  10. The role of architecture and ontology for interoperability.

    Science.gov (United States)

    Blobel, Bernd; González, Carolina; Oemig, Frank; Lopéz, Diego; Nykänen, Pirkko; Ruotsalainen, Pekka

    2010-01-01

Turning from organization-centric to process-controlled or even to personalized approaches, advanced healthcare settings have to meet special interoperability challenges. eHealth and pHealth solutions must assure interoperability between actors cooperating to achieve common business objectives. Hereby, the interoperability chain includes not only individually tailored technical systems, but also sensors and actuators. For enabling corresponding pervasive computing and even autonomic computing, individualized systems have to be based on an architecture framework covering many domains, scientifically managed by specialized disciplines using their specific ontologies in a formalized way. Therefore, interoperability has to advance from a communication protocol to an architecture-centric approach mastering ontology coordination challenges.

  11. Semantic Web technologies in software engineering

    OpenAIRE

    Gall, H C; Reif, G

    2008-01-01

Over the years, the software engineering community has developed various tools to support the specification, development, and maintenance of software. Many of these tools use proprietary data formats to store artifacts, which hampers interoperability. However, the Semantic Web provides a common framework that allows data to be shared and reused across application, enterprise, and community boundaries. Ontologies are used to define the concepts in the domain of discourse and their relationships an...

  12. Managing Interoperability for GEOSS - A Report from the SIF

    Science.gov (United States)

    Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.

    2009-04-01

    The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts. 
In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of

  13. Lemnos Interoperable Security Program

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, John [Tennessee Valley Authority, Knoxville, TN (United States); Halbgewachs, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chavez, Adrian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Rhett [Schweitzer Engineering Laboratories, Chattanooga, TN (United States); Teumim, David [Teumim Technical, Allentown, PA (United States)

    2012-01-31

The manner in which control systems are being designed and operated in the energy sector is undergoing some of the most significant changes in history due to the evolution of technology and the increasing number of interconnections to other systems. With these changes, however, come two significant challenges that the energy sector must face: 1) cyber security is more important than ever before, and 2) cyber security is more complicated than ever before. A key requirement in helping utilities and vendors alike meet these challenges is interoperability. While interoperability has been present in much of the discussion relating to technology utilized within the energy sector, and especially the Smart Grid, it has been absent in the context of cyber security. The Lemnos project addresses these challenges by focusing on the interoperability of devices utilized within utility control systems which support critical cyber security functions. In theory, interoperability is possible with many of the cyber security solutions available to utilities today. The reality is that the effort required to achieve cyber security interoperability is often a barrier for utilities. For example, consider IPSec, a widely-used Internet Protocol suite used to define Virtual Private Networks, or "tunnels", to communicate securely through untrusted public and private networks. The IPSec protocol suite has a significant number of configuration options and encryption parameters to choose from, which must be agreed upon and adopted by both parties establishing the tunnel. The exercise of getting software or devices from different vendors to interoperate is labor intensive and requires a significant amount of security expertise by the end user. Scale this effort to a significant number of devices operating over a large geographical area and the challenge becomes so overwhelming that it often leads utilities to pursue solutions from a single vendor.
These single vendor solutions may inadvertently lock
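The IPSec parameter-agreement problem described above can be illustrated with a toy negotiation: a tunnel only comes up if the two endpoints share at least one proposal. The proposal tuples loosely mirror common IKE options (cipher, hash, DH group) but are simplified for the sketch and do not represent any real device's configuration.

```python
def negotiate(ours, theirs):
    """Return the first mutually supported proposal (in our preference order), or None."""
    theirs = set(theirs)
    for proposal in ours:
        if proposal in theirs:
            return proposal
    return None

# Hypothetical preference lists from two vendors' devices.
vendor_a = [("aes256", "sha256", "dh14"), ("aes128", "sha1", "dh5")]
vendor_b = [("aes128", "sha1", "dh5"), ("3des", "md5", "dh2")]

print(negotiate(vendor_a, vendor_b))   # the single overlapping proposal
```

With only one overlapping proposal out of four, a single firmware update on either side can silently break the tunnel, which is exactly the fragility that drives utilities toward single-vendor deployments and that Lemnos-style interoperability profiles aim to remove.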

  14. Provenance-Based Approaches to Semantic Web Service Discovery and Usage

    Science.gov (United States)

    Narock, Thomas William

    2012-01-01

    The World Wide Web Consortium defines a Web Service as "a software system designed to support interoperable machine-to-machine interaction over a network." Web Services have become increasingly important both within and across organizational boundaries. With the recent advent of the Semantic Web, web services have evolved into semantic…

  15. Semantic web in the e-learning

    Directory of Open Access Journals (Sweden)

    Andrenizia Aquino Eluan

    2008-01-01

Full Text Available With the evolution of information and communication technology, the Web is adding a diversity of resources that can facilitate the development of several areas of knowledge, because it promotes access to and use of information that is globalised, accessible and without borders. This article discusses the Semantic Web as a means of sharing information through the adoption of interoperability standards for networked communication. Among the concerns that surround the education area are strategies for searching and retrieving information in a relevant and effective way for knowledge construction and learning. In this context lies Distance Education, an area that can benefit from the resources of the Semantic Web and the advantages of using ontologies, which will be presented in this article.

  16. HBIM TO VR. SEMANTIC AWARENESS AND DATA ENRICHMENT INTEROPERABILITY FOR PARAMETRIC LIBRARIES OF HISTORICAL ARCHITECTURE

    Directory of Open Access Journals (Sweden)

    R. Quattrini

    2018-05-01

Full Text Available Recently we have seen an increasing availability of HBIM models that are rich in geometric and informative terms. By contrast, there is still a lack of research implementing dedicated libraries for architectural heritage, based on parametric intelligence and semantic awareness. Additional challenges come from their portability in non-desktop environments (such as VR). This research article demonstrates the validity of a workflow applied to architectural heritage, which starts from semantic modeling and reaches visualization in a virtual reality environment, passing through the necessary phases of export, data migration and management. The three-dimensional modeling of the classical Doric order takes place in the BIM work environment and is configured as a necessary starting point for the implementation of data, parametric intelligence and the definition of ontologies that exclusively qualify the model. The study also enables an effective method for data migration from the BIM model to databases integrated into VR technologies for AH. Furthermore, the process proposes a methodology, applicable on a return path, suited to achieving an appropriate data enrichment of each model and to enabling interaction with the model in a VR environment.

  17. [Lessons learned in the implementation of interoperable National Health Information Systems: a systematic review].

    Science.gov (United States)

    Ovies-Bernal, Diana Paola; Agudelo-Londoño, Sandra M

    2014-01-01

Identify shared criteria used throughout the world in the implementation of interoperable National Health Information Systems (NHIS) and provide validated scientific information on the dimensions affecting interoperability. This systematic review sought to identify primary articles on the implementation of interoperable NHIS published in scientific journals in English, Portuguese, or Spanish between 1990 and 2011 through a search of eight databases of electronic journals in the health sciences and informatics: MEDLINE (PubMed), Proquest, Ovid, EBSCO, MD Consult, Virtual Health Library, Metapress, and SciELO. The full texts of the articles were reviewed, and those that focused on technical computer aspects or on normative issues were excluded, as well as those that did not meet the quality criteria for systematic reviews of interventions. Of 291 studies found and reviewed, only five met the inclusion criteria. These articles reported on the process of implementing an interoperable NHIS in Brazil, China, the United States, Turkey, and the Semiautonomous Region of Zanzibar, respectively. Five common basic criteria affecting implementation of the NHIS were identified: standards in place to govern the process, availability of trained human talent, financial and structural constraints, definition of standards, and assurance that the information is secure. Four dimensions affecting interoperability were defined: technical, semantic, legal, and organizational. The criteria identified have to be adapted to the actual situation in each country, and a proactive approach should be used to ensure that implementation of the interoperable NHIS is strategic, simple, and reliable.

  18. D-ATM, a working example of health care interoperability: From dirt path to gravel road.

    Science.gov (United States)

    DeClaris, John-William

    2009-01-01

    For many years, there have been calls for interoperability within health care systems. The technology currently exists and is being used in business areas like banking and commerce, to name a few. Yet the question remains, why has interoperability not been achieved in health care? This paper examines issues encountered and success achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government funded interoperable patient management system. The goal of this paper is to provide lessons learned and propose one possible road map for health care interoperability within private industry and how government can help.

  19. Grid interoperability: the interoperations cookbook

    Energy Technology Data Exchange (ETDEWEB)

    Field, L; Schulz, M [CERN (Switzerland)], E-mail: Laurence.Field@cern.ch, E-mail: Markus.Schulz@cern.ch

    2008-07-01

    Over recent years a number of grid projects have emerged which have built grid infrastructures that are now the computing backbones for various user communities. A significant number of these communities are limited to one grid infrastructure due to the different middleware and procedures used in each grid. Grid interoperation is trying to bridge these differences and enable virtual organizations to access resources independent of the grid project affiliation. This paper gives an overview of grid interoperation and describes the current methods used to bridge the differences between grids. Actual use cases encountered during the last three years are discussed and the most important interfaces required for interoperability are highlighted. A summary of the standardisation efforts in these areas is given and we argue for moving more aggressively towards standards.

  20. Grid interoperability: the interoperations cookbook

    International Nuclear Information System (INIS)

    Field, L; Schulz, M

    2008-01-01

Over recent years a number of grid projects have emerged which have built grid infrastructures that are now the computing backbones for various user communities. A significant number of these communities are limited to one grid infrastructure due to the different middleware and procedures used in each grid. Grid interoperation is trying to bridge these differences and enable virtual organizations to access resources independent of the grid project affiliation. This paper gives an overview of grid interoperation and describes the current methods used to bridge the differences between grids. Actual use cases encountered during the last three years are discussed and the most important interfaces required for interoperability are highlighted. A summary of the standardisation efforts in these areas is given and we argue for moving more aggressively towards standards.

  1. PSG: Peer-to-Peer semantic grid framework architecture

    Directory of Open Access Journals (Sweden)

    Amira Soliman

    2011-07-01

Full Text Available The grid vision, of sharing diverse resources in a flexible, coordinated and secure manner, strongly depends on metadata. Currently, grid metadata is generated and used in an ad-hoc fashion, much of it buried in the grid middleware code libraries and database schemas. This ad-hoc expression and use of metadata causes chronic dependency on human intervention during the operation of grid machinery. Therefore, the Semantic Grid has emerged as an extension of the grid in which rich resource metadata is exposed and handled explicitly, and shared and managed via grid protocols. The layering of an explicit semantic infrastructure over the grid infrastructure potentially leads to increased interoperability and flexibility. In this paper, we present the PSG framework architecture, which offers semantic-based grid services. The PSG architecture allows the explicit use of semantics and the definition of the associated grid services. The PSG architecture originates from the integration of Peer-to-Peer (P2P) computing with semantics and agents. Ontologies are used in annotating each grid component, developing user/node profiles and organizing framework agents, while P2P is responsible for organizing and coordinating the grid nodes and resources.

  2. Semantic Technologies for Nuclear Knowledge Modelling and Applications

    International Nuclear Information System (INIS)

    Beraha, D.; Gladyshev, M.

    2016-01-01

    Full text: The IAEA has been engaged in working with Member States to preserve and enhance nuclear knowledge, and in supporting wide dissemination of safety-related technical and technological information enhancing nuclear safety. Knowledge organization systems (ontologies, taxonomies, thesauri, etc.) provide one of the means to model and structure a given knowledge domain. The significance of knowledge organization systems (KOS) has been greatly enhanced by the evolution of semantic technologies, enabling machines to “understand” the concepts described in a KOS and to use them in a variety of applications. Over recent years semantic technologies have emerged as efficient means to improve access to information and knowledge. The Semantic Web Standards play an important role in creating an infrastructure of interoperable data sources based on principles of Linked Data. The status of utilizing semantic technologies in the nuclear domain is briefly reviewed, noting that such technologies are in an early stage of adoption, and considering some aspects which are specific to nuclear knowledge management. Several areas are described where semantic technologies are already deployed, and other areas are indicated where applications based on semantic technologies will have a strong impact on nuclear knowledge management in the near future. (author)

  3. An Interoperability Platform Enabling Reuse of Electronic Health Records for Signal Verification Studies

    Directory of Open Access Journals (Sweden)

    Mustafa Yuksel

    2016-01-01

    Full Text Available Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by the low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to original EHRs. We have developed an ontological framework in which EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning on formal representations of the different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely, the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region, containing about 1 billion records from 16 million patients, and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods, which lack this background information.
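
The mediation idea at the core of SALUS can be illustrated with a deliberately simplified sketch (the codes, table names and concept URI are illustrative, not the actual SALUS resource set): each source keeps its local terminology, and rule-like mappings into a shared concept act as the common mediator.

```python
# Illustrative mappings from two local terminologies into one shared concept,
# in the spirit of a common information model acting as mediator.
ICD10_TO_COMMON = {"I21.9": "concept:AcuteMyocardialInfarction"}
SNOMED_TO_COMMON = {"57054005": "concept:AcuteMyocardialInfarction"}

def to_common(system, code):
    """Normalise a (terminology, code) pair to the mediator's concept, if known."""
    table = {"icd10": ICD10_TO_COMMON, "snomed": SNOMED_TO_COMMON}[system]
    return table.get(code)

def same_condition(rec_a, rec_b):
    """Two source records match if both normalise to the same shared concept."""
    return (to_common(*rec_a) is not None
            and to_common(*rec_a) == to_common(*rec_b))

print(same_condition(("icd10", "I21.9"), ("snomed", "57054005")))  # True
```

The real framework expresses such mappings as formal rules over ontology representations rather than Python dictionaries, which lets a reasoner derive correspondences instead of enumerating them by hand.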

  4. Enterprise Interoperability - Proceedings of the 5th International IFIP Working Conference on Enterprise Interoperability, IWEI 2013

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Oude Luttighuis, P.H.W.M.; Folmer, Erwin Johan Albert; Bosems, S.; Unknown, [Unknown

    IWEI is an International IFIP Working Conference covering all aspects of enterprise interoperability with the purpose of achieving flexible cross-organizational collaboration through integrated support at business and technical levels. It provides a forum for discussing ideas and results among both

  5. FHIR Healthcare Directories: Adopting Shared Interfaces to Achieve Interoperable Medical Device Data Integration.

    Science.gov (United States)

    Tyndall, Timothy; Tyndall, Ayami

    2018-01-01

    Healthcare directories are vital for interoperability among healthcare providers, researchers and patients. Past efforts at directory services have not provided the tools to allow integration of diverse data sources. Many are overly strict, incompatible with legacy databases, and do not provide data provenance. A more architecture-independent system is needed to enable secure, GDPR-compatible (8) service discovery across organizational boundaries. We review our development of a portable Data Provenance Toolkit supporting provenance within Health Information Exchange (HIE) systems. The Toolkit has been integrated with client software and successfully leveraged in clinical data integration. The Toolkit validates provenance stored in a Blockchain or directory record and creates provenance signatures, providing standardized provenance that moves with the data. This healthcare directory suite implements discovery of healthcare data by HIE and EHR systems via FHIR. Shortcomings of past directory efforts include the inability to map complex datasets and to enable interoperability via exchange endpoint discovery. By delivering data without dictating how it is stored, we improve exchange and facilitate discovery on a multi-national level through open source, fully interoperable tools. With the development of Data Provenance resources we enhance exchange and improve security and usability throughout the health data continuum.
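
A provenance signature that "moves with the data" can be sketched as a keyed digest over a canonical serialization of the record. The scheme below is a hypothetical stand-in (the Toolkit's actual signature format is not specified in the abstract), using an HMAC over canonical JSON of a FHIR-like resource:

```python
import hashlib
import hmac
import json

SECRET = b"shared-directory-key"  # assumption: a pre-shared key between parties

def canonical(record):
    """Deterministic byte serialization, so equal records hash equally."""
    return json.dumps(record, sort_keys=True, separators=(",", ":")).encode()

def sign(record):
    return hmac.new(SECRET, canonical(record), hashlib.sha256).hexdigest()

def verify(record, signature):
    return hmac.compare_digest(sign(record), signature)

record = {"resourceType": "Practitioner", "id": "example", "active": True}
sig = sign(record)
print(verify(record, sig))                       # True
print(verify({**record, "active": False}, sig))  # False: data was altered
```

In a production system one would use asymmetric signatures (so verifiers need no shared secret) and FHIR's Provenance resource to carry the signature alongside the data, but the validate-on-receipt flow is the same.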

  6. Special topic interoperability and EHR: Combining openEHR, SNOMED, IHE, and continua as approaches to interoperability on national ehealth

    DEFF Research Database (Denmark)

    Bestek, M.; Stanimirovi, D.

    2017-01-01

    into the eHealth environment, and identification of the main success factors in the field, which are necessary for achieving required interoperability, and consequently, for the successful implementation of eHealth projects in general. Methods: The paper represents an in-depth analysis regarding...... the potential application of openEHR, SNOMED, IHE and Continua approaches in the development and implementation process of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded on information retrieval with a special focus on research...... could alleviate the pertinent interoperability issues in the Slovenian eHealth context. 2. Analyzing the possibilities (requirements) for their inclusion in the construction process for individual eHealth solutions. 3. Identification and charting the main success factors in the interoperability field...

  7. Evaluating the Organizational Interoperability Maturity Level in ICT Research Center

    Directory of Open Access Journals (Sweden)

    Manijeh Haghighinasab

    2011-03-01

    Full Text Available Interoperability refers to the ability to provide services to, and to accept services from, other systems or devices. Collaborative enterprises face additional challenges in interoperating seamlessly within a networked organization. The major task here is to assess the maturity level of interoperating organizations. For this purpose, enterprise maturity models were reviewed with respect to vendors’ reliability and their advantages and disadvantages. The Enterprise Interoperability Maturity Model (EIMM), derived from the ATHENA European Integrated Project in 2005, was examined at the Iran Information and Communication Institute, a leading telecommunication organization. 115 questionnaires were distributed among the staff of four departments (Information Technology, Communication Technology, Security, and Strategic Studies) regarding six areas of concern: Enterprise Modeling; Business Strategy Process; Organization and Competences; Products and Services; Systems and Technology; and Legal Environment, Security and Trust, at five maturity levels: Performed, Modeled, Integrated, Interoperable and Optimizing. The findings showed different levels of maturity across the Institute. Appropriate practices are proposed for promotion to the higher interoperability levels.
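
The assessment step can be illustrated with a toy scoring function (the rating scale, aggregation rule and thresholds here are invented; the paper's actual EIMM scoring procedure is not given in the abstract): average the questionnaire ratings per area of concern, then map the overall mean onto the five maturity levels.

```python
LEVELS = ["Performed", "Modeled", "Integrated", "Interoperable", "Optimizing"]

def maturity_level(scores_by_area):
    """scores_by_area: {area_of_concern: [1..5 questionnaire ratings]}."""
    area_means = [sum(v) / len(v) for v in scores_by_area.values()]
    overall = sum(area_means) / len(area_means)
    index = min(int(round(overall)) - 1, 4)  # 1..5 rating -> level index 0..4
    return LEVELS[index]

scores = {"Enterprise Modeling": [3, 4], "Systems and Technology": [2, 3]}
print(maturity_level(scores))  # 'Integrated': overall mean is 3.0
```

Averaging first per area keeps a heavily surveyed department from dominating the overall result; other weighting choices are equally defensible.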

  8. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  9. Semantic Interoperability for Computational Mineralogy: Experiences of the eMinerals Consortium

    Science.gov (United States)

    Walker, A. M.; White, T. O.; Dove, M. T.; Bruin, R. P.; Couch, P. A.; Tyer, R. P.

    2006-12-01

    The use of atomic scale computer simulation of minerals to obtain information for geophysics and environmental science has grown enormously over the past couple of decades. It is now routine to probe mineral behavior in the Earth's deep interior and in the surface environment by borrowing methods and simulation codes from computational chemistry and physics. It is becoming increasingly important to use methods embodied in more than one of these codes to solve any single scientific problem. However, scientific codes are rarely designed for easy interoperability and data exchange; data formats are often code-specific, poorly documented and fragile, liable to frequent change between software versions, and even compiler versions. This means that the scientist's simple desire to use the methodological approaches offered by multiple codes is frustrated, and even the sharing of data between collaborators becomes fraught with difficulties. The eMinerals consortium was formed in the early stages of the UK eScience program with the aim of developing the tools needed to apply atomic scale simulation to environmental problems in a grid-enabled world, and to harness the computational power offered by grid technologies to address some outstanding mineralogical problems. One example of the kind of problem we can tackle is the origin of the compressibility anomaly in silica glass. By passing data directly between simulation and analysis tools we were able to probe this effect in more detail than has previously been possible and have shown how the anomaly is related to the details of the amorphous structure. In order to approach this kind of problem we have constructed a mini-grid, a small scale and extensible combined compute- and data-grid that allows the execution of many calculations in parallel, and the transparent storage of semantically-rich marked-up result data. Importantly, we automatically capture multiple kinds of metadata and key results from each calculation. We

  10. Interoperability Strategic Vision

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Knight, Mark R.; Melton, Ronald B.; Narang, David; Martin, Maurice; Nordman, Bruce; Khandekar, Aditya; Hardy, Keith S.

    2018-02-28

    The Interoperability Strategic Vision whitepaper aims to promote a common understanding of the meaning and characteristics of interoperability and to provide a strategy to advance the state of interoperability as applied to integration challenges facing grid modernization. This includes addressing the quality of integrating devices and systems and the discipline to improve the process of successfully integrating these components as business models and information technology improve over time. The strategic vision for interoperability described in this document applies throughout the electric energy generation, delivery, and end-use supply chain. Its scope includes interactive technologies and business processes from bulk energy levels to lower voltage level equipment and the millions of appliances that are becoming equipped with processing power and communication interfaces. A transformational aspect of a vision for interoperability in the future electric system is the coordinated operation of intelligent devices and systems at the edges of grid infrastructure. This challenge offers an example for addressing interoperability concerns throughout the electric system.

  11. Semantical Markov Logic Network for Distributed Reasoning in Cyber-Physical Systems

    Directory of Open Access Journals (Sweden)

    Abdul-Wahid Mohammed

    2017-01-01

    Full Text Available The challenges associated with developing accurate models for cyber-physical systems are attributable to the intrinsically concurrent and heterogeneous computations of these systems. Even though reasoning based on interconnected domain-specific ontologies shows promise in enhancing modularity and joint functionality modelling, it has become necessary to build interoperable cyber-physical systems due to the growing pervasiveness of these systems. In this paper, we propose a semantically oriented distributed reasoning architecture for cyber-physical systems. This model accomplishes reasoning through a combination of heterogeneous models of computation. Using the flexibility of semantic agents as a formal representation for heterogeneous computational platforms, we define an autonomous, intelligent, agent-based reasoning procedure for distributed cyber-physical systems. Sensor networks underpin the semantic capabilities of this architecture, and semantic reasoning based on Markov logic networks is adopted to address uncertainty in modelling. To illustrate the feasibility of this approach, we present a Markov logic based semantic event model for cyber-physical systems and discuss a case study of event handling and processing in a smart home.
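
The Markov logic idea can be shown at minimal scale (the predicates and weights below are invented for a smart-home flavoured example, not taken from the paper): each possible world gets probability proportional to the exponential of the summed weights of the formulas it satisfies, so hard logical rules become soft, weighted constraints.

```python
import math
from itertools import product

# Two ground atoms: MotionDetected (m) and OccupantPresent (o).
# Weighted formulas (weights invented):
#   MotionDetected => OccupantPresent   weight 1.5  (soft implication)
#   OccupantPresent                     weight 0.5  (weak prior)
def weight_sum(m, o):
    w = 0.0
    if (not m) or o:  # the implication holds in this world
        w += 1.5
    if o:
        w += 0.5
    return w

worlds = list(product([False, True], repeat=2))
Z = sum(math.exp(weight_sum(m, o)) for m, o in worlds)  # partition function

def prob(predicate):
    """Probability mass of all worlds satisfying `predicate`."""
    return sum(math.exp(weight_sum(m, o))
               for m, o in worlds if predicate(m, o)) / Z

# Conditional query: P(OccupantPresent | MotionDetected)
p = prob(lambda m, o: m and o) / prob(lambda m, o: m)
print(round(p, 3))  # ≈ 0.881: motion makes presence likely, but not certain
```

Real MLN engines avoid this exhaustive world enumeration (which is exponential in the number of ground atoms) by using sampling or lifted inference, but the semantics is exactly this weighted sum.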

  12. A semantically-aided architecture for a web-based monitoring system for carotid atherosclerosis.

    Science.gov (United States)

    Kolias, Vassileios D; Stamou, Giorgos; Golemati, Spyretta; Stoitsis, Giannis; Gkekas, Christos D; Liapis, Christos D; Nikita, Konstantina S

    2015-08-01

    Carotid atherosclerosis is a multifactorial disease and its clinical diagnosis depends on the evaluation of heterogeneous clinical data, such as imaging exams, biochemical tests and the patient's clinical history. The lack of interoperability between Health Information Systems (HIS) does not allow the physicians to acquire all the necessary data for the diagnostic process. In this paper, a semantically-aided architecture is proposed for a web-based monitoring system for carotid atherosclerosis that is able to gather and unify heterogeneous data with the use of an ontology and to create a common interface for data access, enhancing the interoperability of HIS. The architecture is based on an application ontology of carotid atherosclerosis that is used to (a) integrate heterogeneous data sources on the basis of semantic representation and ontological reasoning and (b) access the critical information using SPARQL query rewriting and ontology-based data access services. The architecture was tested over a carotid atherosclerosis dataset consisting of the imaging exams and the clinical profiles of 233 patients, using a set of complex queries constructed by the physicians. The proposed architecture was evaluated with respect to the complexity of the queries that the physicians could make and the retrieval speed. The proposed architecture gave promising results in terms of interoperability, ontology-driven integration of heterogeneous data sources, and expanded query and retrieval capabilities in HIS.
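
One core mechanism of ontology-based data access is query rewriting over the class hierarchy: a query for a general concept is expanded to also retrieve instances of its subclasses. The sketch below illustrates this with invented class names (not the paper's ontology) and plain Python in place of a SPARQL engine:

```python
# Toy subclass axioms, standing in for an OWL ontology.
SUBCLASS_OF = {
    "EchogenicPlaque": "CarotidPlaque",
    "EcholucentPlaque": "CarotidPlaque",
}

def expand(concept):
    """All concepts that entail `concept` via the subclass hierarchy."""
    found = {concept}
    changed = True
    while changed:
        changed = False
        for sub, sup in SUBCLASS_OF.items():
            if sup in found and sub not in found:
                found.add(sub)
                changed = True
    return found

# Source data annotated with specific classes only.
records = [("p01", "EchogenicPlaque"),
           ("p02", "EcholucentPlaque"),
           ("p03", "NormalVessel")]

def query(concept):
    wanted = expand(concept)
    return [pid for pid, cls in records if cls in wanted]

print(query("CarotidPlaque"))  # ['p01', 'p02']: both plaque subtypes match
```

In SPARQL terms, the rewriting turns one triple pattern into a UNION over the subclasses, so the underlying store never needs to materialise inferred types.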

  13. Using architectures for semantic interoperability to create journal clubs for emergency response

    Energy Technology Data Exchange (ETDEWEB)

    Powell, James E [Los Alamos National Laboratory; Collins, Linn M [Los Alamos National Laboratory; Martinez, Mark L B [Los Alamos National Laboratory

    2009-01-01

    In certain types of 'slow burn' emergencies, careful accumulation and evaluation of information can offer a crucial advantage. The SARS outbreak in the first decade of the 21st century was such an event, and ad hoc journal clubs played a critical role in assisting scientific and technical responders in identifying and developing various strategies for halting what could have become a dangerous pandemic. This research-in-progress paper describes a process for leveraging emerging semantic web and digital library architectures and standards to (1) create a focused collection of bibliographic metadata, (2) extract semantic information, (3) convert it to the Resource Description Framework /Extensible Markup Language (RDF/XML), and (4) integrate it so that scientific and technical responders can share and explore critical information in the collections.
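
Step (3) of the process, converting extracted bibliographic metadata to RDF, can be sketched with Dublin Core properties; the abstract names RDF/XML as the target, but the same triples are shown here in the simpler N-Triples form (the record URI and field values are invented):

```python
DC = "http://purl.org/dc/elements/1.1/"

def to_ntriples(record_uri, metadata):
    """Serialise a flat metadata dict as Dublin Core N-Triples lines."""
    lines = []
    for field, value in metadata.items():
        escaped = value.replace('"', '\\"')  # minimal literal escaping
        lines.append(f'<{record_uri}> <{DC}{field}> "{escaped}" .')
    return "\n".join(lines)

meta = {"title": "SARS coronavirus genome analysis", "date": "2003"}
print(to_ntriples("http://example.org/pub/42", meta))
```

Once the records are triples, integration across sources reduces to loading them into one graph and querying it, which is what lets responders explore the combined collection.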

  14. An ontological system for interoperable spatial generalisation in biodiversity monitoring

    Science.gov (United States)

    Nieland, Simon; Moran, Niklas; Kleinschmit, Birgit; Förster, Michael

    2015-11-01

    Semantic heterogeneity remains a barrier to data comparability and standardisation of results in different fields of spatial research. Because of its thematic complexity, differing acquisition methods and national nomenclatures, interoperability of biodiversity monitoring information is especially difficult. Since data collection methods and interpretation manuals vary broadly, there is a need for automatised, objective methodologies for the generation of comparable data-sets. Ontology-based applications offer vast opportunities in data management and standardisation. This study examines two data-sets of protected heathlands in Germany and Belgium which are based on remote sensing image classification and semantically formalised in an OWL2 ontology. The proposed methodology uses semantic relations of the two data-sets, which are (semi-)automatically derived from remote sensing imagery, to generate objective and comparable information about the status of protected areas by utilising kernel-based spatial reclassification. This automatised method suggests a generalisation approach which is able to generate delineations of Special Areas of Conservation (SAC) of the European biodiversity Natura 2000 network. Furthermore, it is able to transfer generalisation rules between areas surveyed with varying acquisition methods in different countries by taking into account automated inference of the underlying semantics. The generalisation results were compared with the manual delineation of terrestrial monitoring. For the different habitats in the two sites, accuracies above 70% were achieved. However, it has to be highlighted that the delineation of the ground-truth data carries a high degree of uncertainty, which is discussed in this study.
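
A kernel-based spatial reclassification can be illustrated with a simple majority filter (window size and class labels are invented; the paper's actual kernel is not specified in the abstract): each cell takes the most frequent class within its 3x3 neighbourhood, smoothing classification noise before patches are delineated.

```python
from collections import Counter

def majority_filter(grid):
    """Reclassify each cell to the majority class of its 3x3 neighbourhood."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            window = [grid[rr][cc]
                      for rr in range(max(0, r - 1), min(rows, r + 2))
                      for cc in range(max(0, c - 1), min(cols, c + 2))]
            out[r][c] = Counter(window).most_common(1)[0][0]
    return out

grid = [
    ["heath", "heath", "grass"],
    ["heath", "grass", "grass"],
    ["heath", "heath", "grass"],
]
print(majority_filter(grid)[1][1])  # 'heath': 5 of the 9 window cells
```

The ontology's role in the full method is to decide which classes may be merged by such a kernel (e.g. which national habitat classes map to the same Natura 2000 type), so the smoothing respects the semantics rather than raw labels alone.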

  15. ICSE 2009 Tutorial - Semantic Web Technologies in Software Engineering

    OpenAIRE

    Gall, H C; Reif, G

    2009-01-01

    Over the years, the software engineering community has developed various tools to support the specification, development, and maintainance of software. Many of these tools use proprietary data formats to store artifacts which hamper interoperability. On the other hand, the Semantic Web provides a common framework that allows data to be shared and reused across application, enterprise, and community boundaries. Ontologies are used to define the concepts in the domain of discourse and their rel...

  16. Towards a semantic learning model fostering learning object reusability

    OpenAIRE

    Fernandes , Emmanuel; Madhour , Hend; Wentland Forte , Maia; Miniaoui , Sami

    2005-01-01

    We try in this paper to propose a domain model for both author's and learner's needs concerning learning objects reuse. First of all, we present four key criteria for an efficient authoring tool: adaptive level of granularity, flexibility, integration and interoperability. Secondly, we introduce and describe our six-level Semantic Learning Model (SLM) designed to facilitate multi-level reuse of learning materials and search by defining a multi-layer model for metadata. Finally, after mapping ...

  17. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  18. COEUS: "semantic web in a box" for biomedical applications.

    Science.gov (United States)

    Lopes, Pedro; Oliveira, José Luís

    2012-12-17

    As the "omics" revolution unfolds, the growth in data quantity and diversity is bringing about the need for pioneering bioinformatics software, capable of significantly improving the research workflow. To cope with these computer science demands, biomedical software engineers are adopting emerging semantic web technologies that better suit the life sciences domain. The latter's complex relationships are easily mapped into semantic web graphs, enabling a superior understanding of collected knowledge. Despite increased awareness of semantic web technologies in bioinformatics, their use is still limited. COEUS is a new semantic web framework, aiming at a streamlined application development cycle and following a "semantic web in a box" approach. The framework provides a single package including advanced data integration and triplification tools, base ontologies, a web-oriented engine and a flexible exploration API. Resources can be integrated from heterogeneous sources, including CSV and XML files or SQL and SPARQL query results, and mapped directly to one or more ontologies. Advanced interoperability features include REST services, a SPARQL endpoint and LinkedData publication. These enable the creation of multiple applications for web, desktop or mobile environments, and empower a new knowledge federation layer. The platform, targeted at biomedical application developers, provides a complete skeleton ready for rapid application deployment, enhancing the creation of new semantic information systems. COEUS is available as open source at http://bioinformatics.ua.pt/coeus/.

  19. Enabling interoperability-as-a-service for connected IoT infrastructures and Smart Objects

    DEFF Research Database (Denmark)

    Hovstø, Asbjørn; Guan, Yajuan; Quintero, Juan Carlos Vasquez

    2018-01-01

    Lack of interoperability is considered as the most important barrier to achieve the global integration of Internet-of-Things (IoT) ecosystems across borders of different disciplines, vendors and standards. Indeed, the current IoT landscape consists of a large set of non-interoperable infrastructu...

  20. Special Topic Interoperability and EHR: Combining openEHR, SNOMED, IHE, and Continua as approaches to interoperability on national eHealth.

    Science.gov (United States)

    Beštek, Mate; Stanimirović, Dalibor

    2017-08-09

    The main aims of the paper comprise the characterization and examination of the potential approaches regarding interoperability. This includes openEHR, SNOMED, IHE, and Continua as combined interoperability approaches, possibilities for their incorporation into the eHealth environment, and identification of the main success factors in the field, which are necessary for achieving the required interoperability, and consequently, for the successful implementation of eHealth projects in general. The paper represents an in-depth analysis regarding the potential application of openEHR, SNOMED, IHE and Continua approaches in the development and implementation process of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded on information retrieval with a special focus on research and charting of existing experience in the field, and sources, both electronic and written, which include interoperability concepts and related implementation issues. The paper will try to answer the following inquiries, which complement each other: 1. Scrutiny of the potential approaches, which could alleviate the pertinent interoperability issues in the Slovenian eHealth context. 2. Analyzing the possibilities (requirements) for their inclusion in the construction process for individual eHealth solutions. 3. Identification and charting the main success factors in the interoperability field that critically influence development and implementation of eHealth projects in an efficient manner. Provided insights and identified success factors could serve as a constituent of the strategic starting points for continuous integration of interoperability principles into the healthcare domain. Moreover, the general implementation of the identified success factors could facilitate better penetration of ICT into the healthcare environment and enable the eHealth-based transformation of the health system especially in the countries

  1. Enterprise interoperability VI : Interoperability for Agility, Resilience and Plasticity of Collaboration

    CERN Document Server

    Bénaben, Frédérick; Poler, Raúl; Bourrières, Jean-Paul

    2014-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VI will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Over 40 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability show how, in a scenario of globalised markets, where the capacity to cooperate with other firms efficiently starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected...

  2. Semantic technologies and linked data for the Italian PA: the case of data.cnr.it

    Directory of Open Access Journals (Sweden)

    Aldo Gangemi

    2013-01-01

    Full Text Available Governmental data are being published in many countries, providing an unprecedented opportunity to create innovative services and to increase societal awareness about administration dynamics. In particular, semantic technologies for linked data production and exploitation prove to be ideal for managing the identity and interoperability of administrative entities and data. This paper presents the current state of the art and evolution scenarios of these technologies, with reference to several case studies, including two from the Italian context: CNR's Semantic Scout, and DigitPA's Linked Open IPA.

  3. XML databases and the semantic web

    CERN Document Server

    Thuraisingham, Bhavani

    2002-01-01

    Efficient access to data, sharing data, extracting information from data, and making use of the information have become urgent needs for today's corporations. With so much data on the Web, managing it with conventional tools is becoming almost impossible. New tools and techniques are necessary to provide interoperability as well as warehousing between multiple data sources and systems, and to extract information from the databases. XML Databases and the Semantic Web focuses on critical and new Web technologies needed for organizations to carry out transactions on the Web, to understand how to use the Web effectively, and to exchange complex documents on the Web. This reference for database administrators, database designers, and Web designers working in tandem with database technologists covers three emerging technologies of significant impact for electronic business: Extensible Markup Language (XML), semi-structured databases, and the semantic Web. The first two parts of the book explore these emerging techn...

  4. Achieving Interoperability Through Base Registries for Governmental Services and Document Management

    Science.gov (United States)

    Charalabidis, Yannis; Lampathaki, Fenareti; Askounis, Dimitris

    As digital infrastructures increase their presence worldwide, following the efforts of governments to provide citizens and businesses with high-quality one-stop services, there is a growing need for the systematic management of those newly defined and constantly transforming processes and electronic documents. E-government Interoperability Frameworks usually cater to the technical standards of e-government systems interconnection, but do not address service composition and use by citizens, businesses, or other administrations.

  5. Achieving control and interoperability through unified model-based systems and software engineering

    Science.gov (United States)

    Rasmussen, Robert; Ingham, Michel; Dvorak, Daniel

    2005-01-01

    Control and interoperation of complex systems is one of the most difficult challenges facing NASA's Exploration Systems Mission Directorate. An integrated but diverse array of vehicles, habitats, and supporting facilities, evolving over the long course of the enterprise, must perform ever more complex tasks while moving steadily away from the sphere of ground support and intervention.

  6. Buildings Interoperability Landscape

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Dave [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stephan, Eric G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wang, Weimin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Corbin, Charles D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Widergren, Steven E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-12-31

    Through its Building Technologies Office (BTO), the United States Department of Energy’s Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO’s mission to enhance energy efficiency and save energy for economic and environmental purposes. For connected-buildings ecosystems of products and services from various manufacturers to flourish, the ICT aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.

  7. Clinical data integration model. Core interoperability ontology for research using primary care data.

    Science.gov (United States)

    Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A

    2015-01-01

This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However, its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European Primary Care. TRANSFoRm utilizes a unified structural/terminological interoperability framework based on the local-as-view mediation paradigm. Such an approach mandates that the global information model describe the domain of interest independently of the data sources to be explored. A requirement analysis identified no existing ontology focused on primary care research, so we designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation based on CDIM. A unified mediation approach to semantic interoperability provides a
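The local-as-view mediation paradigm described above can be sketched in a few lines: queries are phrased against the global model (CDIM-like concepts), each source declares itself as a view over that model, and a mediator rewrites the global query into one concrete query per source. All concept, table, and column names below are hypothetical illustrations, not the actual CDIM vocabulary or TRANSFoRm code.

```python
# Minimal sketch of local-as-view (LAV) query mediation.
# Names are invented for illustration only.

# Global-model query, phrased against a CDIM-style concept.
GLOBAL_QUERY = {"concept": "DiagnosisRecord", "code_system": "ICD-10", "code": "E11"}

# Each source describes itself as a view over the global model (local-as-view).
SOURCE_VIEWS = {
    "practice_db_A": {"concept": "DiagnosisRecord", "table": "dx", "code_column": "icd10"},
    "registry_B":    {"concept": "DiagnosisRecord", "table": "conditions", "code_column": "code"},
}

def rewrite(global_query, views):
    """Rewrite a global-model query into one concrete query per matching source."""
    plans = []
    for source, view in views.items():
        if view["concept"] == global_query["concept"]:
            plans.append({
                "source": source,
                "sql": f"SELECT * FROM {view['table']} "
                       f"WHERE {view['code_column']} = '{global_query['code']}'",
            })
    return plans

plans = rewrite(GLOBAL_QUERY, SOURCE_VIEWS)
for p in plans:
    print(p["source"], "->", p["sql"])
```

The point of the pattern is that the global query never mentions source schemas; adding a source means adding one view entry, not rewriting queries.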

  8. The Semantics of Web Services: An Examination in GIScience Applications

    Directory of Open Access Journals (Sweden)

    Xuan Shi

    2013-09-01

Web service is a technological solution for software interoperability that supports the seamless integration of diverse applications. In the vision of the web service architecture, web services are described by the Web Service Description Language (WSDL), discovered through Universal Description, Discovery and Integration (UDDI), and communicate via the Simple Object Access Protocol (SOAP). This vision has never been fully accomplished. Although WSDL was criticized for defining web services only syntactically, not semantically, prior initiatives in semantic web services did not establish a correct methodology to resolve the problem. This paper examines the distinction and relationship between the syntactic and semantic definitions of web services, which serve different purposes in service computation. Further, this paper proposes that the semantics of a web service are neutral and independent of the service interface definition, data types and platform. Such a conclusion can be a universal law in software engineering and service computing. Several use cases in GIScience applications are examined in this paper, while the formalization of geospatial services still needs to be constructed by the GIScience community towards a comprehensive ontology of the conceptual definitions and relationships for geospatial computation. Advancements in semantic web services research will happen in domain science applications.
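The paper's central claim, that a service's semantics are independent of its interface definition, can be illustrated with a toy sketch: two operations share an identical syntactic signature, so a WSDL-level description cannot distinguish them, while a separate semantic annotation can. The functions and annotation fields below are invented for illustration.

```python
# Two "services" with the identical syntactic signature (float, float) -> float,
# yet different semantics. The interface alone cannot tell them apart.

def service_a(x: float, y: float) -> float:
    return x + y          # semantics: addition

def service_b(x: float, y: float) -> float:
    return x * y          # semantics: multiplication

# A minimal semantic description kept separate from the interface definition.
SEMANTICS = {
    "service_a": {"operation": "sum"},
    "service_b": {"operation": "product"},
}

# Syntactically indistinguishable...
assert service_a.__annotations__ == service_b.__annotations__
# ...semantically distinct.
assert SEMANTICS["service_a"] != SEMANTICS["service_b"]
print(service_a(2.0, 3.0), service_b(2.0, 3.0))
```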

  9. The impact of SOA for achieving healthcare interoperability. An empirical investigation based on a hypothetical adoption.

    Science.gov (United States)

    Daskalakis, S; Mantas, J

    2009-01-01

The evaluation of a service-oriented prototype implementation for healthcare interoperability. A prototype framework was developed, aiming to exploit service-oriented architecture (SOA) concepts for achieving healthcare interoperability and to move towards a virtual patient record (VPR) paradigm. The prototype implementation was evaluated for its hypothetical adoption. The evaluation strategy was based on the initial proposition of the DeLone and McLean model of information systems (IS) success [1], as modeled by Iivari [2]. A set of SOA and VPR characteristics were empirically encapsulated within the dimensions of the IS success model, combined with measures from previous research works. The data gathered were analyzed using partial least squares (PLS). The results highlighted that system quality is a partial predictor of system use but not of user satisfaction. On the contrary, information quality proved to be a significant predictor of user satisfaction and partially a strong significant predictor of system use. Moreover, system use did not prove to be a significant predictor of individual impact, whereas the bi-directional relation between use and user satisfaction was not confirmed. Additionally, user satisfaction was found to be a strong significant predictor of individual impact. Finally, individual impact proved to be a strong significant predictor of organizational impact. The empirical study attempted to obtain hypothetical, but still useful, beliefs and perceptions regarding the SOA prototype implementation. The deduced observations can form the basis for further investigation regarding the adaptability of SOA implementations with VPR characteristics in the healthcare domain.

  10. Towards the understanding of the requirements of a communication language to support process interoperation in cross-disciplinary supply chains

    OpenAIRE

    DAS , BISHNU PADA; Young , R I M; Case , K; Rahimifard , S; Anumba , C; Bouchlaghem , N; Cutting Decelle , Anne-Francoise

    2007-01-01

Many manufacturing organisations, while doing business either directly or indirectly with other industrial sectors, often encounter interoperability problems amongst software systems. This increases business costs and reduces efficiency. Research communities are exploring ways to reduce this cost. Incompatibility amongst the syntaxes and semantics of the languages of application systems is the most common cause of this problem. The Process Specification Language (...

  11. iPad: Semantic annotation and markup of radiological images.

    Science.gov (United States)

    Rubin, Daniel L; Rodriguez, Cesar; Shah, Priyanka; Beaulieu, Chris

    2008-11-06

Radiological images contain a wealth of information, such as anatomy and pathology, which is often not explicit and computationally accessible. Information schemes are being developed to describe the semantic content of images, but such schemes can be unwieldy to operationalize because there are few tools to enable users to capture structured information easily as part of the routine research workflow. We have created iPad, an open source tool enabling researchers and clinicians to create semantic annotations on radiological images. iPad hides the complexity of the underlying image annotation information model from users, permitting them to describe images and image regions using a graphical interface that maps their descriptions to structured ontologies semi-automatically. Image annotations are saved in a variety of formats, enabling interoperability among medical records systems, image archives in hospitals, and the Semantic Web. Tools such as iPad can help reduce the burden of collecting structured information from images, and it could ultimately enable researchers and physicians to exploit images on a very large scale and glean the biological and physiological significance of image content.
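An annotation "saved in a variety of formats" is, at minimum, a structured record that round-trips through a serialization. The sketch below shows a much-simplified annotation record serialized to JSON; the field names, terminology, and code are invented illustrations, not the actual AIM information model that tools like iPad use.

```python
import json

# Hypothetical, heavily simplified image-annotation record.
# Field names, the terminology reference, and the code are illustrative only.
annotation = {
    "image_id": "CT-000123",
    "region": {"shape": "circle", "cx": 210, "cy": 154, "radius": 18},
    "observation": {
        "term": "lesion",
        "ontology": "RadLex",      # assumed terminology, for illustration
        "code": "RID-EXAMPLE",     # hypothetical code, not a real identifier
    },
    "annotator": "researcher01",
}

# Serialize for exchange, then restore; structured content survives intact.
serialized = json.dumps(annotation, indent=2)
restored = json.loads(serialized)
print(restored["observation"]["term"])
```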

  12. Semantic aspects of the International Classification of Functioning, Disability and Health: towards sharing knowledge and unifying information.

    Science.gov (United States)

    Andronache, Adrian Stefan; Simoncello, Andrea; Della Mea, Vincenzo; Daffara, Carlo; Francescutti, Carlo

    2012-02-01

    During the last decade, under the World Health Organization's direction, the International Classification of Functioning, Disability and Health (ICF) has become a reference tool for monitoring and developing various policies addressing people with disability. This article presents three steps to increase the semantic interoperability of ICF: first, the representation of ICF using ontology tools; second, the alignment to upper-level ontologies; and third, the use of these tools to implement semantic mappings between ICF and other tools, such as disability assessment instruments, health classifications, and at least partially formalized terminologies.

  13. Towards a Semantic Web of Things: A Hybrid Semantic Annotation, Extraction, and Reasoning Framework for Cyber-Physical System

    Directory of Open Access Journals (Sweden)

    Zhenyu Wu

    2017-02-01

Web of Things (WoT) facilitates the discovery and interoperability of Internet of Things (IoT) devices in a cyber-physical system (CPS). Moreover, a uniform knowledge representation of physical resources is quite necessary for further composition, collaboration, and decision-making processes in CPS. Though several efforts have integrated semantics with WoT, such as knowledge engineering methods based on semantic sensor networks (SSN), they still cannot represent the complex relationships between devices when dynamic composition and collaboration occur, and they depend entirely on manual construction of a knowledge base, with low scalability. In this paper, to address these limitations, we propose the semantic Web of Things (SWoT) framework for CPS (SWoT4CPS). SWoT4CPS provides a hybrid solution combining ontological engineering methods that extend SSN with machine learning methods based on an entity linking (EL) model. To test its feasibility and performance, we demonstrate the framework by implementing a temperature anomaly diagnosis and automatic control use case in a building automation system. Evaluation results on the EL method show that linking domain knowledge to DBpedia achieves relatively high accuracy and that the time complexity is tolerable. Advantages and disadvantages of SWoT4CPS, together with future work, are also discussed.
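The entity-linking step, mapping a device mention to a knowledge-base entry, can be reduced to a toy sketch: rank candidate labels by lexical similarity and pick the best. A real EL model (as in SWoT4CPS) would query DBpedia and use far richer features; the candidate list here is invented, and `difflib` stands in for the learned model.

```python
from difflib import SequenceMatcher

# Toy entity-linking step: rank candidate knowledge-base labels for a device
# mention by lexical similarity. Candidates are hypothetical; a real system
# would retrieve them from DBpedia and score with a trained model.
CANDIDATES = ["Thermostat", "Thermometer", "Temperature", "Humidistat"]

def link(mention, candidates):
    scored = [(SequenceMatcher(None, mention.lower(), c.lower()).ratio(), c)
              for c in candidates]
    return max(scored)[1]      # label with the highest similarity

print(link("temperature sensor", CANDIDATES))
```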

  14. Towards a Semantic Web of Things: A Hybrid Semantic Annotation, Extraction, and Reasoning Framework for Cyber-Physical System.

    Science.gov (United States)

    Wu, Zhenyu; Xu, Yuan; Yang, Yunong; Zhang, Chunhong; Zhu, Xinning; Ji, Yang

    2017-02-20

Web of Things (WoT) facilitates the discovery and interoperability of Internet of Things (IoT) devices in a cyber-physical system (CPS). Moreover, a uniform knowledge representation of physical resources is quite necessary for further composition, collaboration, and decision-making processes in CPS. Though several efforts have integrated semantics with WoT, such as knowledge engineering methods based on semantic sensor networks (SSN), they still cannot represent the complex relationships between devices when dynamic composition and collaboration occur, and they depend entirely on manual construction of a knowledge base, with low scalability. In this paper, to address these limitations, we propose the semantic Web of Things (SWoT) framework for CPS (SWoT4CPS). SWoT4CPS provides a hybrid solution combining ontological engineering methods that extend SSN with machine learning methods based on an entity linking (EL) model. To test its feasibility and performance, we demonstrate the framework by implementing a temperature anomaly diagnosis and automatic control use case in a building automation system. Evaluation results on the EL method show that linking domain knowledge to DBpedia achieves relatively high accuracy and that the time complexity is tolerable. Advantages and disadvantages of SWoT4CPS, together with future work, are also discussed.

  15. A fuzzy-ontology-oriented case-based reasoning framework for semantic diabetes diagnosis.

    Science.gov (United States)

    El-Sappagh, Shaker; Elmogy, Mohammed; Riad, A M

    2015-11-01

Case-based reasoning (CBR) is a problem-solving paradigm that uses past knowledge to interpret or solve new problems. It is suitable for experience-based and theory-less problems. Building a semantically intelligent CBR system that mimics expert thinking can solve many problems, especially medical ones. Knowledge-intensive CBR using formal ontologies is an evolution of this paradigm. Ontologies can be used for case representation and storage, and as background knowledge. Using standard medical ontologies, such as SNOMED CT, enhances interoperability and integration with health care systems. Moreover, utilizing vague or imprecise knowledge further improves the semantic effectiveness of CBR. This paper proposes a fuzzy-ontology-based CBR framework comprising a fuzzy case-base OWL2 ontology and a fuzzy semantic retrieval algorithm that handles many feature types. This framework is implemented and tested on the diabetes diagnosis problem. The fuzzy ontology is populated with 60 real diabetic cases. The effectiveness of the proposed approach is illustrated with a set of experiments and case studies. The resulting system can answer complex medical queries related to semantic understanding of medical concepts and handling of vague terms. The resulting fuzzy case-base ontology has 63 concepts, 54 (fuzzy) object properties, 138 (fuzzy) datatype properties, 105 fuzzy datatypes, and 2640 instances. The system achieves an accuracy of 97.67%. We compare our framework with existing CBR systems and a set of five machine-learning classifiers; our system outperforms all of them. Building an integrated CBR system can improve its performance. Representing CBR knowledge using the fuzzy ontology, and building a case retrieval algorithm that treats different features differently, improves the accuracy of the resulting systems. Copyright © 2015 Elsevier B.V. All rights reserved.
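The core of a fuzzy case-retrieval algorithm is a per-feature fuzzy similarity combined into a weighted score over the case base. The sketch below uses a triangular membership for numeric features; the features, weights, widths, and cases are invented, and the paper's actual algorithm (over an OWL2 fuzzy ontology) is far richer.

```python
# Sketch of fuzzy case retrieval: each numeric feature contributes a
# triangular-membership similarity; a weighted sum ranks stored cases.
# All feature names, weights, and case values are invented.

def tri_sim(a, b, width):
    """Similarity of two values under a triangular fuzzy membership of given width."""
    return max(0.0, 1.0 - abs(a - b) / width)

CASE_BASE = [
    {"id": 1, "glucose": 180.0, "bmi": 31.0, "diagnosis": "diabetic"},
    {"id": 2, "glucose": 95.0,  "bmi": 22.0, "diagnosis": "non-diabetic"},
    {"id": 3, "glucose": 150.0, "bmi": 28.0, "diagnosis": "pre-diabetic"},
]
WEIGHTS = {"glucose": 0.7, "bmi": 0.3}    # assumed feature weights
WIDTHS  = {"glucose": 60.0, "bmi": 10.0}  # assumed membership widths

def retrieve(query, cases):
    def score(case):
        return sum(w * tri_sim(query[f], case[f], WIDTHS[f])
                   for f, w in WEIGHTS.items())
    return max(cases, key=score)

best = retrieve({"glucose": 170.0, "bmi": 30.0}, CASE_BASE)
print(best["diagnosis"])
```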

  16. A state-of-the-art review of interoperability amongst heterogeneous software systems

    Directory of Open Access Journals (Sweden)

    Carlos Mario Zapata Jaramillo

    2009-05-01

Information systems are sets of interacting elements aimed at supporting entrepreneurial or business activities; they cannot coexist in an isolated way but require their data to be shared so as to increase productivity. Such systems' interoperability is normally accomplished through mark-up standards, query languages and web services. The literature contains work related to software system interoperability; however, it presents some difficulties, such as the need for using the same platforms and different programming languages, the use of read-only languages, and deficiencies in the formalisms used for achieving interoperability. This paper presents a critical review of the advances made regarding heterogeneous software systems' interoperability.

  17. Dynamic Business Networks: A Headache for Sustainable Systems Interoperability

    Science.gov (United States)

    Agostinho, Carlos; Jardim-Goncalves, Ricardo

Collaborative networked environments emerged with the spread of the internet, contributing to overcoming past communication barriers and identifying interoperability as an essential property. When interoperability is achieved seamlessly, efficiency increases across the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with their different partners or, in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are defined only once, and the morphisms that represent them are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to answer new customer requirements. This paper draws concepts from complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.
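The contrast the paper draws, morphisms hardcoded in enterprise systems versus mappings that can evolve, can be made concrete by keeping the mapping as data: when a partner's model changes, one table entry is edited instead of recompiling the system. Field names and transforms below are invented for illustration.

```python
# Declarative mapping (a simple morphism between two message models):
# source field -> (target field, value transform). Kept as data, not code,
# so it can be versioned and updated as partner models evolve.
MAPPING = {
    "prod_id":   ("productCode", str),
    "qty":       ("quantity", int),
    "unit_cost": ("unitPrice", float),
}

def translate(record, mapping):
    """Apply the declared morphism to one partner message."""
    return {tgt: fn(record[src]) for src, (tgt, fn) in mapping.items()}

partner_msg = {"prod_id": 7741, "qty": "12", "unit_cost": "3.5"}
print(translate(partner_msg, MAPPING))
```

Updating the network when a partner renames `qty` to `quantity_ordered` means changing one key in `MAPPING`, which is the sustainability argument in miniature.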

  18. Putting semantics into the semantic web: how well can it capture biology?

    Science.gov (United States)

    Kazic, Toni

    2006-01-01

    Could the Semantic Web work for computations of biological interest in the way it's intended to work for movie reviews and commercial transactions? It would be wonderful if it could, so it's worth looking to see if its infrastructure is adequate to the job. The technologies of the Semantic Web make several crucial assumptions. I examine those assumptions; argue that they create significant problems; and suggest some alternative ways of achieving the Semantic Web's goals for biology.

  19. Standardized headings as a foundation for semantic interoperability in EHR

    Directory of Open Access Journals (Sweden)

    Halilovic Amra

    2016-01-01

The new Swedish Patient Act, which allows patients to choose health care in county councils other than their own, creates the need to share health-related information contained in electronic health records (EHRs) across county councils. This demands interoperability in terms of structured and standardized data. Headings in EHRs can also be part of structured and standardized data. The aim was to study to what extent terminology is shared and standardized across county councils in Sweden. Headings from three county councils were analyzed to see to what extent they were shared and to what extent they corresponded to concepts in SNOMED CT and the National Board of Health and Welfare's term dictionary (NBHW's TD). In total, 41% of the headings were shared across two or three county councils. A third of the shared headings corresponded to concepts in SNOMED CT. Further, an eighth of the shared headings corresponded to concepts in NBHW's TD. The results showed that the extent of shared and standardized terminology, in terms of headings, across the three studied county councils was negligible.
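The study's overlap analysis amounts to set arithmetic: which headings occur in at least two councils, and what fraction of those map to a terminology. The sketch below reproduces the shape of that computation on invented heading sets (the study's real data and percentages are of course different).

```python
# Sketch of the heading-overlap analysis. Heading sets and the "mappable"
# terminology subset are invented for illustration.
council_a = {"Allergy", "Diagnosis", "Medication", "Smoking"}
council_b = {"Allergy", "Diagnosis", "Social history"}
council_c = {"Diagnosis", "Medication", "Occupation"}

councils = (council_a, council_b, council_c)
all_headings = council_a | council_b | council_c

# Shared = appears in at least two councils.
shared = {h for h in all_headings if sum(h in c for c in councils) >= 2}

snomed_mappable = {"Allergy", "Diagnosis"}   # assumed mappable subset
share_pct = 100 * len(shared) / len(all_headings)
mapped_pct = 100 * len(shared & snomed_mappable) / len(shared)
print(f"{share_pct:.0f}% shared; {mapped_pct:.0f}% of shared map to SNOMED CT")
```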

  20. Interoperability of CAD Standards and Robotics in CIME

    DEFF Research Database (Denmark)

    Sørensen, Torben

The research presented in this dissertation concerns the identification of problems and provision of solutions for increasing the degree of interoperability between CAD, CACSD (Computer Aided Control Systems Design) and CAR (Computer Aided Robotics) in Computer Integrated Manufacturing and Engineering (CIME). A central element is the development of a STEP-based interface for general control system data and functions, especially related to robot motion control, for interoperability of CAD, CACSD, and CAR systems, extending inter-system communication capabilities beyond the stage achieved up to now. This interface development comprehends the following work: · The definition of the concepts of 'information' and 'information model', and the selection of a proper information modeling methodology within the STEP methodologies. · The elaboration of a general function model of a generic robot motion controller in IDEF0 for interface ...

  1. A Collaborative System Software Solution for Modeling Business Flows Based on Automated Semantic Web Service Composition

    Directory of Open Access Journals (Sweden)

    Ion SMEUREANU

    2009-01-01

Nowadays, business interoperability is one of the key factors for assuring competitive advantage for the participating business partners. In order to implement business cooperation, scalable, distributed and portable collaborative systems have to be implemented. This article presents some of the most widely used technologies in this field. Furthermore, it presents a software application architecture based on the Business Process Modeling Notation standard and automated semantic web service coupling for modeling business flows in a collaborative manner. The main business processes will be represented in a single, hierarchic flow diagram. Each element of the diagram will represent calls to semantic web services. The business logic (the business rules and constraints) will be structured with the help of OWL (Web Ontology Language). Moreover, OWL will also be used to create the semantic web service specifications.

  2. Interoperability of clinical decision-support systems and electronic health records using archetypes: a case study in clinical trial eligibility.

    Science.gov (United States)

    Marcos, Mar; Maldonado, Jose A; Martínez-Salvador, Begoña; Boscá, Diego; Robles, Montserrat

    2013-08-01

patient recruitment in the framework of a clinical trial for colorectal cancer screening. The utilisation of archetypes has not only proved satisfactory for achieving interoperability between CDSSs and EHRs, but also offers various advantages, in particular from a data-model perspective. First, the VHR/data models we work with are at a high level of abstraction and can incorporate semantic descriptions. Second, archetypes can potentially deal with different EHR architectures, owing to their deliberate independence from the reference model. Third, the archetype instances we obtain are valid instances of the underlying reference model, which would enable, for example, feeding data derived by abstraction mechanisms back into the EHR. Lastly, the medical and technical validity of archetype models would be assured, since in principle clinicians should be the main actors in their development. Copyright © 2013 Elsevier Inc. All rights reserved.

  3. Coupling ontology driven semantic representation with multilingual natural language generation for tuning international terminologies.

    Science.gov (United States)

    Rassinoux, Anne-Marie; Baud, Robert H; Rodrigues, Jean-Marie; Lovis, Christian; Geissbühler, Antoine

    2007-01-01

The importance of clinical communication between providers, consumers and others, as well as the requirement of computer interoperability, strengthens the need for sharing commonly accepted terminologies. Under the directives of the World Health Organization (WHO), an effort is currently being conducted in Australia to adopt a standardized terminology for medical procedures that is intended to become an international reference. In order to achieve such a standard, a collaborative approach is adopted, in line with the successful experiment conducted for the development of the new French coding system CCAM. Different coding centres are involved in setting up a semantic representation of each term using a formal ontological structure expressed through a logic-based representation language. From this language-independent representation, multilingual natural language generation (NLG) is performed to produce noun phrases in various languages, which are then compared for consistency with the original terms. Outcomes are presented for the assessment of the International Classification of Health Interventions (ICHI) and its translation into Portuguese. The initial results clearly emphasize the feasibility and cost-effectiveness of the proposed method for handling both a different classification and an additional language. NLG tools based on ontology-driven semantic representation facilitate the discovery of ambiguous and inconsistent terms and, as such, should be promoted for establishing coherent international terminologies.

  4. A Hypercat-enabled Semantic Internet of Things Data Hub: Technical Report

    OpenAIRE

    Tachmazidis, Ilias; Batsakis, Sotiris; Davies, John; Duke, Alistair; Vallati, Mauro; Antoniou, Grigoris; Clarke, Sandra Stincic

    2017-01-01

    An increasing amount of information is generated from the rapidly increasing number of sensor networks and smart devices. A wide variety of sources generate and publish information in different formats, thus highlighting interoperability as one of the key prerequisites for the success of Internet of Things (IoT). The BT Hypercat Data Hub provides a focal point for the sharing and consumption of available datasets from a wide range of sources. In this work, we propose a semantic enrichment of ...
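A Hypercat catalogue is a JSON document with catalogue-level metadata and a list of items, each carrying its own rel/val metadata pairs. The sketch below follows the published Hypercat structure as I understand it; the item URLs and descriptive values are invented, and a real hub would serve this over HTTP rather than build it in memory.

```python
import json

# Sketch of a Hypercat-style catalogue. Structure follows the Hypercat spec
# as understood (catalogue-metadata + items with rel/val pairs); all URLs
# and descriptions are invented.
catalogue = {
    "catalogue-metadata": [
        {"rel": "urn:X-hypercat:rels:isContentType",
         "val": "application/vnd.hypercat.catalogue+json"},
        {"rel": "urn:X-hypercat:rels:hasDescription:en",
         "val": "Example sensor catalogue"},
    ],
    "items": [
        {"href": "http://example.org/sensors/temp-01",
         "item-metadata": [
             {"rel": "urn:X-hypercat:rels:hasDescription:en",
              "val": "Office temperature sensor"},
         ]},
    ],
}

def looks_like_hypercat(doc):
    """Minimal structural check: metadata rel/val pairs plus items with hrefs."""
    return (isinstance(doc.get("catalogue-metadata"), list)
            and all({"rel", "val"} <= set(m) for m in doc["catalogue-metadata"])
            and all("href" in item for item in doc.get("items", [])))

# Round-trip through JSON, as a hub would serve it, then sanity-check.
print(looks_like_hypercat(json.loads(json.dumps(catalogue))))
```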

  5. Political, policy and social barriers to health system interoperability: emerging opportunities of Web 2.0 and 3.0.

    Science.gov (United States)

    Juzwishin, Donald W M

    2009-01-01

Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendancy of Web 2.0 and 3.0, although still unproven, signals the opportunity to accelerate patients' access to health information and their health record. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited.

  6. The interoperability force in the ERP field

    Science.gov (United States)

    Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon

    2015-04-01

    Enterprise resource planning (ERP) systems participate in interoperability projects and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To go about this, ERP systems have been first identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements have been identified from two perspectives: technological and business. The ERP field is evolving from classical ERP as information system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way of running business, and ERP systems are changing to adapt to the current stream of interoperability.

  7. Towards Automatic Semantic Labelling of 3D City Models

    Science.gov (United States)

    Rook, M.; Biljecki, F.; Diakité, A. A.

    2016-10-01

The lack of semantic information in many 3D city models is a considerable limiting factor in their use, as many applications rely on semantics. Such information is not always available: it is not collected at all times, it may be lost in data transformation, or its absence may be caused by non-interoperability in data integration from other sources. This research is a first step towards an automatic workflow that labels a plain 3D city model, represented by a soup of polygons, with semantic and thematic information as defined in the CityGML standard. The first step involves the reconstruction of the topology, which is used in a region-growing algorithm that clusters upward-facing adjacent triangles. Heuristic rules, embedded in a decision tree, are used to compute a likeliness score for regions that represent either the ground (terrain) or a RoofSurface. Regions with a high likeliness score for one of the two classes are used to create a decision space, which is used in a support vector machine (SVM). Next, topological relations are utilised to select seeds that serve as starting points in a region-growing algorithm to create regions of triangles of other semantic classes. The topological relationships of the regions are used in the aggregation of the thematic building features. Finally, the level of detail is detected to generate the correct output in CityGML. The results show an accuracy between 85% and 99% for the automatic semantic labelling on four different test datasets. The paper concludes by indicating problems and difficulties that point to the next steps in the research.
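The first stage described above, clustering adjacent upward-facing triangles by region growing over shared edges, can be sketched compactly. The geometry below is invented and the orientation test is reduced to the sign of the face normal's z component; the paper's actual pipeline adds heuristics, decision trees, and an SVM on top.

```python
# Sketch of region growing over upward-facing triangles in a polygon soup.
# Triangles are 3-tuples of (x, y, z) vertices; sample geometry is invented.

def normal_z(tri):
    """z component of the face normal (cross product of two edge vectors)."""
    (ax, ay, _), (bx, by, _), (cx, cy, _) = tri
    return (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)

def edges(tri):
    pts = [tuple(p) for p in tri]
    return {frozenset(e) for e in zip(pts, pts[1:] + pts[:1])}

def grow_regions(triangles):
    up = [t for t in triangles if normal_z(t) > 0]   # keep upward-facing only
    regions, unvisited = [], list(range(len(up)))
    while unvisited:
        seed = unvisited.pop()
        region, frontier = [seed], [seed]
        while frontier:
            cur = frontier.pop()
            for j in unvisited[:]:
                if edges(up[cur]) & edges(up[j]):    # adjacent: share an edge
                    unvisited.remove(j)
                    region.append(j)
                    frontier.append(j)
        regions.append(region)
    return regions

soup = [
    ((0, 0, 3), (1, 0, 3), (0, 1, 3)),   # upward, adjacent pair forming a roof
    ((1, 0, 3), (1, 1, 3), (0, 1, 3)),
    ((5, 5, 3), (6, 5, 3), (5, 6, 3)),   # isolated upward triangle
    ((0, 0, 0), (0, 1, 0), (1, 0, 0)),   # downward-facing, filtered out
]
regions = grow_regions(soup)
print(len(regions), "regions")
```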

  8. Semantic Web integration of Cheminformatics resources with the SADI framework

    Directory of Open Access Journals (Sweden)

    Chepelev Leonid L

    2011-05-01

Background: The diversity and the largely independent nature of chemical research efforts over the past half century are, most likely, the major contributors to the current poor state of chemical computational resource and database interoperability. While open software for chemical format interconversion and database entry cross-linking has partially addressed database interoperability, computational resource integration is hindered by the great diversity of software interfaces, languages, access methods, and platforms, among others. This has, in turn, translated into limited reproducibility of computational experiments and the need for application-specific computational workflow construction and semi-automated enactment by human experts, especially where emerging interdisciplinary fields, such as systems chemistry, are pursued. Fortunately, the advent of the Semantic Web, and the very recent introduction of RESTful Semantic Web Services (SWS), may present an opportunity to integrate all of the existing computational and database resources in chemistry into a machine-understandable, unified system that draws on the entirety of the Semantic Web. Results: We have created a prototype framework of Semantic Automated Discovery and Integration (SADI) SWS that exposes the QSAR descriptor functionality of the Chemistry Development Kit. Since each of these services has formal ontology-defined input and output classes, and each service consumes and produces RDF graphs, clients can automatically reason about the services and the available reference information necessary to complete a given overall computational task specified through a simple SPARQL query. We demonstrate this capability by carrying out QSAR analysis backed by a simple formal ontology to determine whether a given molecule is drug-like. Further, we discuss parameter-based control over the execution of SADI SWS. Finally, we demonstrate the value of computational resource

  9. Standards-based Content Resources: A Prerequisite for Content Integration and Content Interoperability

    Directory of Open Access Journals (Sweden)

    Christian Galinski

    2010-05-01

Objective: To show how standards-based approaches to content standardization, content management, content-related services and tools, as well as the respective certification systems, not only guarantee reliable content integration and content interoperability, but are also of particular benefit to people with special needs in eAccessibility/eInclusion. Method: Document MoU/MG/05 N0221, "Semantic Interoperability and the need for a coherent policy for a framework of distributed, possibly federated repositories for all kinds of content items on a world-wide scale", adopted in 2005, was a first step towards the formulation of global interoperability requirements for structured content. These requirements, based on advanced terminological principles, were taken up in EU projects such as IN-SAFETY (INfrastructure and SAFETY) and OASIS (Open architecture for Accessible Services Integration and Standardization). Results: Content integration and content interoperability are key concepts in connection with the emergence of state-of-the-art distributed and federated databases/repositories of structured content. Given that linguistic content items are increasingly combined with or embedded in non-linguistic content items (and vice versa), a systemic and generic approach to data modelling and content management has become the order of the day. Fulfilling the requirements of multilinguality and multimodality on the basis of open standards makes software and database design fit for eAccessibility/eInclusion from the outset. It also makes structured content capable of global content integration and content interoperability, because it enhances its potential for being re-used and re-purposed in totally different eApplications. Such content, as well as the methods, tools and services applied, can be subject to new kinds of certification schemes, which should also be based on standards. Conclusions: Content must be totally reliable in some

  10. Environmental Models as a Service: Enabling Interoperability ...

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantage of streamlined deployment processes and affordable cloud access to move algorithms and data to the web for discoverability and consumption. In these deployments, environmental models can become available to end users through RESTful web services and consistent application program interfaces (APIs) that consume, manipulate, and store modeling data. RESTful modeling APIs also promote discoverability and guide usability through self-documentation. Embracing the RESTful paradigm allows models to be accessible via a web standard, and the resulting endpoints are platform- and implementation-agnostic while simultaneously presenting significant computational capabilities for spatial and temporal scaling. RESTful APIs present data in a simple verb-noun web request interface: the verb dictates how a resource is consumed using HTTP methods (e.g., GET, POST, and PUT) and the noun represents the URL reference of the resource on which the verb will act. The RESTful API can self-document in both the HTTP response and an interactive web page using the Open API standard. This lets models function as an interoperable service that promotes sharing, documentation, and discoverability. Here, we discuss the
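
    The verb-noun request pattern the abstract describes can be sketched with a minimal in-memory dispatcher; the resource paths and handler table below are invented for illustration and do not reflect any real modeling API.

    ```python
    # Minimal sketch of the verb-noun RESTful pattern: the HTTP verb says
    # how a resource is consumed, the noun (URL path) says which resource.
    # Paths and payloads are hypothetical.

    store = {}  # in-memory stand-in for the service's resource store

    def handle(verb, noun, body=None):
        """Dispatch an HTTP-style verb against a resource path (the noun)."""
        if verb == "POST":      # create: submit modeling data
            store[noun] = body
            return 201, body
        if verb == "GET":       # read: discover/consume a resource
            return (200, store[noun]) if noun in store else (404, None)
        if verb == "PUT":       # update: manipulate stored data
            store[noun] = body
            return 200, body
        return 405, None        # verb not supported for this sketch

    status, _ = handle("POST", "/models/hydro/runs", {"timestep": "1h"})
    status2, run = handle("GET", "/models/hydro/runs")
    ```

    Because every interaction reduces to a verb acting on a URL, the same endpoints remain platform- and implementation-agnostic, which is the interoperability property the abstract emphasizes.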

  11. Focus for 3D city models should be on interoperability

    DEFF Research Database (Denmark)

    Bodum, Lars; Kjems, Erik; Jaegly, Marie Michele Helena

    2006-01-01

    that would make it useful for other purposes than visualisation. Time has come to try to change this trend and to convince the municipalities that interoperability and semantics are important issues for the future. It is important for them to see that 3D modelling, mapping and geographic information...... developments in Geographical Exploration Systems. Centralized and proprietary Geographical Exploration Systems only give us their own perspective on the world. On the contrary, GRIFINOR is decentralized and available for everyone to use, empowering people to promote their own world vision....... are subjects on the same agenda towards an integrated solution for an object-oriented mapping of multidimensional geographic objects in the urban environment. Many relevant subjects could be discussed regarding these matters, but in this paper we will narrow the discussion down to the ideas behind...

  12. Innovation in OGC: The Interoperability Program

    Directory of Open Access Journals (Sweden)

    George Percivall

    2015-10-01

    Full Text Available The OGC Interoperability Program is a source of innovation in the development of open standards. The approach to innovation is based on hands-on, collaborative engineering leading to more mature standards and implementations. The process of the Interoperability Program engages a community of sponsors and participants based on an economic model that benefits all involved. Each initiative begins with an innovative approach to identify interoperability needs, followed by agile software development to advance the state of technology to the benefit of society. Over eighty initiatives have been conducted in the Interoperability Program since the breakthrough Web Mapping Testbed began the program in 1999. OGC standards that were initiated in the Interoperability Program are the basis of two thirds of the certified compliant products.

  13. Towards an enterprise interoperability framework

    CSIR Research Space (South Africa)

    Kotzé, P

    2010-06-01

    Full Text Available This paper presents relevant interoperability approaches and solutions applied to global/international networked (collaborative) enterprises or organisations, and conceptualises an enhanced enterprise interoperability framework. The paper covers...

  14. A Theory of Interoperability Failures

    National Research Council Canada - National Science Library

    McBeth, Michael S

    2003-01-01

    This paper develops a theory of interoperability failures. Interoperability in this paper refers to the exchange of information and the use of information, once exchanged, between two or more systems...

  15. Model and Interoperability using Meta Data Annotations

    Science.gov (United States)

    David, O.

    2011-12-01

    Software frameworks and architectures are in need of meta data to efficiently support model integration. Modelers have to know the context of a model, often stepping into modeling semantics and auxiliary information usually not provided in a concise structure and universal format, consumable by a range of (modeling) tools. XML often seems the obvious solution for capturing meta data, but its wide adoption to facilitate model interoperability is limited by XML schema fragmentation, complexity, and verbosity outside of a data-automation process. Ontologies seem to overcome those shortcomings, however the practical significance of their use remains to be demonstrated. OMS version 3 took a different approach for meta data representation. The fundamental building block of a modular model in OMS is a software component representing a single physical process, calibration method, or data access approach. Here, programming language features known as Annotations or Attributes were adopted. Within other (non-modeling) frameworks it has been observed that annotations lead to cleaner and leaner application code. Framework-supported model integration, traditionally accomplished using Application Programming Interface (API) calls, is now achieved using descriptive code annotations. Fully annotated components for various hydrological and Ag-system models now provide information directly for (i) model assembly and building, (ii) data flow analysis for implicit multi-threading or visualization, (iii) automated and comprehensive model documentation of component dependencies, physical data properties, (iv) automated model and component testing, calibration, and optimization, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Such a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework but a strong reference to its originating code. Since models and
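
    The annotation idea described above can be sketched in Python using decorators as a stand-in for the Java-style annotations OMS adopts; the decorator name, metadata fields, and the toy component below are all hypothetical, not part of OMS.

    ```python
    # Sketch of annotation-driven model meta data: a decorator attaches
    # declarative inputs/outputs to a component, so a framework can
    # assemble and document it without invasive API calls.

    def model_metadata(description, inputs, outputs):
        """Attach declarative meta data to a model component class."""
        def wrap(cls):
            cls.__model_meta__ = {
                "description": description,
                "inputs": inputs,
                "outputs": outputs,
            }
            return cls
        return wrap

    @model_metadata(
        description="Single-process runoff component",
        inputs={"precip_mm": float},
        outputs={"runoff_mm": float},
    )
    class Runoff:
        def execute(self, precip_mm):
            return 0.5 * precip_mm  # toy process equation

    # The framework reads the meta data alone to wire data flow:
    meta = Runoff.__model_meta__
    ```

    The component itself carries no framework dependency; only the attached meta data references it, which mirrors the "non-invasive" integration the abstract claims.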

  16. Interoperability in practice: case study of the Slovenian independence war of 1991

    Directory of Open Access Journals (Sweden)

    Vladimir Prebilič

    2015-08-01

    Full Text Available The paper will examine the theory of the interoperability of armed forces through the case of the Slovenian Independence War of 1991. Although defense system interoperability is a well-established concept, there are many obstacles to its implementation. Some defense systems do not deliberately support the idea of interoperability. One such example is the total defense system in SFR Yugoslavia, which comprised two defense components: the Yugoslav People’s Army (YPA) and territorial defense structures organized by the federal republic. The question of interoperability is highly relevant since the war was fought between the YPA and the defense forces of the newly proclaimed independent state, Slovenia, who were partners in the total defense concept. Due to the clear asymmetry, interoperability offered a great advantage in the independence war. The Slovenian defense forces were combined into three structures: the former militia as an internal security element, the territorial defense as a military component, and the national protection forces as a “civil” defense element. Although each structure had its own command and organizational structure, during the Slovenian War they were combined into a well-structured and organized defense element that achieved victory against a much stronger, better equipped, and better supported army.

  17. Smart Grid Interoperability Maturity Model

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Levinson, Alex; Mater, J.; Drummond, R.

    2010-04-28

    The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.

  18. Semantic Web

    Directory of Open Access Journals (Sweden)

    Anna Lamandini

    2011-06-01

    Full Text Available The semantic Web is a technology at the service of knowledge which is aimed at accessibility and the sharing of content, facilitating interoperability between different systems; as such it is one of the nine key technological pillars of TIC (technologies for information and communication) within the third theme, specific cooperation programme, of the seventh framework programme for research and development (7°PQRS, 2007-2013). As a system it seeks to overcome overload or excess of irrelevant information on the Internet, in order to facilitate specific or pertinent research. It is an extension of the existing Web in which the aim is cooperation between computers and people (the dream of Sir Tim Berners-Lee), where machines can give more support to people when integrating and elaborating data in order to obtain inferences and a global sharing of data. It is a technology that is able to favour the development of a “data web”, in other words the creation of a space of interconnected and shared data sets (Linked Data), which allows users to link different types of data coming from different sources. It is a technology that will have great effect on everyday life since it will permit the planning of “intelligent applications” in various sectors such as education and training, research, the business world, public information, tourism, health, and e-government. It is an innovative technology that activates a social transformation (socio-semantic Web) on a world level since it redefines the cognitive universe of users and enables the sharing not only of information but of significance (collective and connected intelligence).
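
    The Linked Data idea of a "data web" can be illustrated with a toy triple store in pure Python; the URIs and values below are made up, and a real system would use RDF tooling rather than plain tuples.

    ```python
    # Toy illustration of Linked Data: statements from different sources
    # are subject-predicate-object triples that merge into one graph
    # because they share the same subject URI. All URIs are fictitious.

    triples = [
        ("ex:Bologna", "rdf:type", "ex:City"),        # source A
        ("ex:Bologna", "ex:locatedIn", "ex:Italy"),   # source A
        ("ex:Bologna", "ex:population", 390000),      # source B, linked via URI
    ]

    def objects(subject, predicate):
        """Query the merged graph: all objects for (subject, predicate)."""
        return [o for s, p, o in triples if s == subject and p == predicate]

    pop = objects("ex:Bologna", "ex:population")
    ```

    The point of the sketch is that neither source needs to know about the other: sharing an identifier is enough for their data to combine, which is the interoperability mechanism the abstract describes.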

  19. Towards Semantic Interpretation of Movement Behavior

    NARCIS (Netherlands)

    Baglioni, M.; Macedo, J.; Renso, C.; Trasarti, R.; Wachowicz, M.

    2009-01-01

    In this paper we aim at providing a model for the conceptual representation and deductive reasoning of trajectory patterns obtained from mining raw trajectories. This has been achieved by means of a semantic enrichment process, where raw trajectories are enhanced with semantic information and

  20. Semantic Mediation via Access Broker: the OWS-9 experiment

    Science.gov (United States)

    Santoro, Mattia; Papeschi, Fabrizio; Craglia, Massimo; Nativi, Stefano

    2013-04-01

    Even with the use of common data models standards to publish and share geospatial data, users may still face semantic inconsistencies when they use Spatial Data Infrastructures - especially in multidisciplinary contexts. Several semantic mediation solutions exist to address this issue; they span from simple XSLT documents to transform from one data model schema to another, to more complex services based on the use of ontologies. This work presents the activity done in the context of the OGC Web Services Phase 9 (OWS-9) Cross Community Interoperability to develop a semantic mediation solution by enhancing the GEOSS Discovery and Access Broker (DAB). This is a middleware component that provides harmonized access to geospatial datasets according to client applications' preferred service interfaces (Nativi et al. 2012, Vaccari et al. 2012). Given a set of remote feature data encoded in different feature schemas, the objective of the activity was to use the DAB to enable client applications to transparently access the feature data according to one single schema. Due to the flexible architecture of the Access Broker, it was possible to introduce a new transformation type in the configured chain of transformations. In fact, the Access Broker already provided the following transformations: Coordinate Reference System (CRS), spatial resolution, spatial extent (e.g., a subset of a data set), and data encoding format. A new software module was developed to invoke the needed external semantic mediation service and harmonize the accessed features. In OWS-9 the Access Broker invokes a SPARQL WPS to retrieve mapping rules for the OWS-9 schemas: the USGS and NGA schemas. The solution implemented to address this problem shows the flexibility and extensibility of the brokering framework underpinning the GEO DAB: new services can be added to augment the number of supported schemas without the need to modify other components and/or software modules. 
Moreover, all other transformations (CRS
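
    The mediation step the abstract describes can be sketched as a rule-driven attribute renaming; the dictionary below stands in for mapping rules a broker would retrieve (in OWS-9, from the SPARQL WPS), and the field names are invented for illustration.

    ```python
    # Sketch of schema mediation: mapping rules rename attributes of a
    # feature from a source schema to the target schema the client
    # expects. Attribute names are hypothetical, not real USGS/NGA fields.

    mapping_rules = {               # source attribute -> target attribute
        "STRM_NAME": "river.name",
        "STRM_LEN_KM": "river.lengthKm",
    }

    def mediate(feature, rules):
        """Return the feature re-expressed in the target schema;
        attributes without a rule are dropped."""
        return {rules[k]: v for k, v in feature.items() if k in rules}

    source_feature = {"STRM_NAME": "Po", "STRM_LEN_KM": 652, "FID": 7}
    target_feature = mediate(source_feature, mapping_rules)
    ```

    Because the rules are data rather than code, supporting an additional schema only means loading another rule set, which matches the extensibility claim made for the brokering framework.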

  1. Unmanned Ground Vehicle (UGV) Interoperability Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The UGV Interoperability Lab provides the capability to verify vendor conformance against government-defined interoperability profiles (IOPs). This capability allows...

  2. Toward an E-Government Semantic Platform

    Science.gov (United States)

    Sbodio, Marco Luca; Moulin, Claude; Benamou, Norbert; Barthès, Jean-Paul

    This chapter describes the major aspects of an e-government platform in which semantics underpins more traditional technologies in order to enable new capabilities and to overcome technical and cultural challenges. The design and development of such an e-government Semantic Platform has been conducted with the financial support of the European Commission through the Terregov research project: "Impact of e-government on Territorial Government Services" (Terregov 2008). The goal of this platform is to let local government and government agencies offer online access to their services in an interoperable way, and to allow them to participate in orchestrated processes involving services provided by multiple agencies. Implementing a business process through an electronic procedure is indeed a core goal in any networked organization. However, the field of e-government brings specific constraints to the operations allowed in procedures, especially concerning the flow of private citizens' data: because of legal reasons in most countries, such data are allowed to circulate only from agency to agency directly. In order to promote transparency and responsibility in e-government while respecting the specific constraints on data flows, Terregov supports the creation of centrally controlled orchestrated processes; while the cross agencies data flows are centrally managed, data flow directly across agencies.

  3. Ontology-based semantic information technology for safeguards: opportunities and challenges

    International Nuclear Information System (INIS)

    McDaniel, Michael

    2014-01-01

    The challenge of efficiently handling large volumes of heterogeneous information is a barrier to more effective safeguards implementation. With the emergence of new technologies for generating and collecting information this is an issue common to many industries and problem domains. Several diverse information-intensive fields are developing and adopting ontology-based semantic information technology solutions to address issues of information integration, federation and interoperability. Ontology, in this context, refers to the formal specification of the content, structure, and logic of knowledge within a domain of interest. Ontology-based semantic information technologies have the potential to impact nearly every level of safeguards implementation, from information collection and integration, to personnel training and knowledge retention, to planning and analysis. However, substantial challenges remain before the full benefits of semantic technology can be realized. Perhaps the most significant challenge is the development of a nuclear fuel cycle ontology. For safeguards, existing knowledge resources such as the IAEA’s Physical Model and established upper level ontologies can be used as starting points for ontology development, but a concerted effort must be taken by the safeguards community for such an activity to be successful. This paper provides a brief background of ontologies and semantic information technology, demonstrates how these technologies are used in other areas, offers examples of how ontologies can be applied to safeguards, and discusses the challenges of developing and implementing this technology as well as a possible path forward.

  4. A layered semantics for a parallel object-oriented language

    NARCIS (Netherlands)

    P.H.M. America (Pierre); J.J.M.M. Rutten (Jan)

    1990-01-01

    textabstractWe develop a denotational semantics for POOL, a parallel object-oriented programming language. The main contribution of this semantics is an accurate mathematical model of the most important concept in object-oriented programming: the object. This is achieved by structuring the semantics

  5. Linked data for transaction based enterprise interoperability

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.

    2015-01-01

    Interoperability is of major importance in B2B environments. Starting with EDI in the ‘80s, interoperability currently relies heavily on XML-based standards. Although these have had great impact, issues remain to be solved for improving B2B interoperability. These issues include lack of dynamics, cost

  6. Achieving semantic interoperability in multi-agent systems: A dialogue-based approach

    NARCIS (Netherlands)

    Diggelen, J. van

    2007-01-01

    Software agents sharing the same ontology can exchange their knowledge fluently as their knowledge representations are compatible with respect to the concepts regarded as relevant and with respect to the names given to these concepts. However, in open heterogeneous multi-agent systems, this scenario

  7. European Interoperability Assets Register and Quality Framework Implementation.

    Science.gov (United States)

    Moreno-Conde, Alberto; Thienpont, Geert; Lamote, Inge; Coorevits, Pascal; Parra, Carlos; Kalra, Dipak

    2016-01-01

    Interoperability assets is the term applied to refer to any resource that can support the design, implementation and successful adoption of eHealth services that can exchange data meaningfully. Some examples may include functional requirements, specifications, standards, clinical models and term lists, guidance on how standards may be used concurrently, implementation guides, educational resources, and other resources. Unfortunately, these are largely accessible in ad hoc ways and result in scattered fragments of a solution space that urgently need to be brought together. At present, it is well known that new initiatives and projects will reinvent assets of which they were unaware, while those assets which were potentially of great value are forgotten, not maintained and eventually fall into disuse. This research has defined a quality in use model and assessed the suitability of this quality framework based on the feedback and opinion of a representative sample of potential end users. This quality framework covers the following domains of asset development and adoption: (i) Development process, (ii) Maturity level, (iii) Trustworthiness, (iv) Support & skills, (v) Sustainability, (vi) Semantic interoperability, (vii) Cost & effort of adoption, (viii) Maintenance. When participants were asked to evaluate the overall quality in use framework, 70% said they would recommend using the register to their colleagues, 70% felt that it could provide relevant benefits for discovering new assets, and 50% responded that it would support their decision making about the recommended asset to adopt or implement in their organisation. Several European projects have expressed interest in using the register, which will now be sustained and promoted by the European Institute for Innovation through Health Data.

  8. Toward semantic interoperability in home health care: formally representing OASIS items for integration into a concept-oriented terminology.

    Science.gov (United States)

    Choi, Jeungok; Jenkins, Melinda L; Cimino, James J; White, Thomas M; Bakken, Suzanne

    2005-01-01

    The authors aimed to (1) formally represent OASIS-B1 concepts using the Logical Observation Identifiers, Names, and Codes (LOINC) semantic structure; (2) demonstrate integration of OASIS-B1 concepts into a concept-oriented terminology, the Medical Entities Dictionary (MED); (3) examine potential hierarchical structures within LOINC among OASIS-B1 and other nursing terms; and (4) illustrate a Web-based implementation for OASIS-B1 data entry using Dialogix, a software tool with a set of functions that supports complex data entry. Two hundred nine OASIS-B1 items were dissected into the six elements of the LOINC semantic structure and then integrated into the MED hierarchy. Each OASIS-B1 term was matched to LOINC-coded nursing terms, Home Health Care Classification, the Omaha System, and the Sign and Symptom Check-List for Persons with HIV, and the extent of the match was judged based on a scale of 0 (no match) to 4 (exact match). OASIS-B1 terms were implemented as a Web-based survey using Dialogix. Of 209 terms, 204 were successfully dissected into the elements of the LOINC semantic structure and integrated into the MED with minor revisions of MED semantics. One hundred fifty-one OASIS-B1 terms were mapped to one or more of the LOINC-coded nursing terms. The LOINC semantic structure offers a standard way to add home health care data to a comprehensive patient record to facilitate data sharing for monitoring outcomes across sites and to further terminology management, decision support, and accurate information retrieval for evidence-based practice. The cross-mapping results support the possibility of a hierarchical structure of the OASIS-B1 concepts within nursing terminologies in the LOINC database.
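
    The six-element dissection the abstract describes follows LOINC's six naming axes (Component, Property, Time, System, Scale, Method); the sample item and its axis values below are illustrative only, not actual OASIS-B1 codings.

    ```python
    # Sketch of dissecting a survey item into the six LOINC axes and
    # composing a LOINC-style fully specified name. Axis values for the
    # sample item are hypothetical.

    LOINC_AXES = ("component", "property", "time", "system", "scale", "method")

    def dissect(component, prop, time, system, scale, method):
        """Represent an item as a six-element LOINC-style name."""
        parts = dict(zip(LOINC_AXES, (component, prop, time, system, scale, method)))
        parts["fully_specified_name"] = ":".join(parts[a] for a in LOINC_AXES)
        return parts

    item = dissect("Ability to dress upper body", "Find", "Pt",
                   "^Patient", "Ord", "OASIS")
    ```

    Once every item is expressed along the same six axes, items from different instruments become comparable term-by-term, which is what makes the cross-mapping to other nursing terminologies feasible.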

  9. Advancing translational research with the Semantic Web

    Science.gov (United States)

    Ruttenberg, Alan; Clark, Tim; Bug, William; Samwald, Matthias; Bodenreider, Olivier; Chen, Helen; Doherty, Donald; Forsberg, Kerstin; Gao, Yong; Kashyap, Vipul; Kinoshita, June; Luciano, Joanne; Marshall, M Scott; Ogbuji, Chimezie; Rees, Jonathan; Stephens, Susie; Wong, Gwendolyn T; Wu, Elizabeth; Zaccagnini, Davide; Hongsermeier, Tonya; Neumann, Eric; Herman, Ivan; Cheung, Kei-Hoi

    2007-01-01

    Background A fundamental goal of the U.S. National Institute of Health (NIH) "Roadmap" is to strengthen Translational Research, defined as the movement of discoveries in basic research to application at the clinical level. A significant barrier to translational research is the lack of uniformly structured data across related biomedical domains. The Semantic Web is an extension of the current Web that enables navigation and meaningful use of digital resources by automatic processes. It is based on common formats that support aggregation and integration of data drawn from diverse sources. A variety of technologies have been built on this foundation that, together, support identifying, representing, and reasoning across a wide range of biomedical data. The Semantic Web Health Care and Life Sciences Interest Group (HCLSIG), set up within the framework of the World Wide Web Consortium, was launched to explore the application of these technologies in a variety of areas. Subgroups focus on making biomedical data available in RDF, working with biomedical ontologies, prototyping clinical decision support systems, working on drug safety and efficacy communication, and supporting disease researchers navigating and annotating the large amount of potentially relevant literature. Results We present a scenario that shows the value of the information environment the Semantic Web can support for aiding neuroscience researchers. We then report on several projects by members of the HCLSIG, in the process illustrating the range of Semantic Web technologies that have applications in areas of biomedicine. Conclusion Semantic Web technologies present both promise and challenges. Current tools and standards are already adequate to implement components of the bench-to-bedside vision. On the other hand, these technologies are young. Gaps in standards and implementations still exist and adoption is limited by typical problems with early technology, such as the need for a critical mass of

  10. Advancing translational research with the Semantic Web.

    Science.gov (United States)

    Ruttenberg, Alan; Clark, Tim; Bug, William; Samwald, Matthias; Bodenreider, Olivier; Chen, Helen; Doherty, Donald; Forsberg, Kerstin; Gao, Yong; Kashyap, Vipul; Kinoshita, June; Luciano, Joanne; Marshall, M Scott; Ogbuji, Chimezie; Rees, Jonathan; Stephens, Susie; Wong, Gwendolyn T; Wu, Elizabeth; Zaccagnini, Davide; Hongsermeier, Tonya; Neumann, Eric; Herman, Ivan; Cheung, Kei-Hoi

    2007-05-09

    A fundamental goal of the U.S. National Institute of Health (NIH) "Roadmap" is to strengthen Translational Research, defined as the movement of discoveries in basic research to application at the clinical level. A significant barrier to translational research is the lack of uniformly structured data across related biomedical domains. The Semantic Web is an extension of the current Web that enables navigation and meaningful use of digital resources by automatic processes. It is based on common formats that support aggregation and integration of data drawn from diverse sources. A variety of technologies have been built on this foundation that, together, support identifying, representing, and reasoning across a wide range of biomedical data. The Semantic Web Health Care and Life Sciences Interest Group (HCLSIG), set up within the framework of the World Wide Web Consortium, was launched to explore the application of these technologies in a variety of areas. Subgroups focus on making biomedical data available in RDF, working with biomedical ontologies, prototyping clinical decision support systems, working on drug safety and efficacy communication, and supporting disease researchers navigating and annotating the large amount of potentially relevant literature. We present a scenario that shows the value of the information environment the Semantic Web can support for aiding neuroscience researchers. We then report on several projects by members of the HCLSIG, in the process illustrating the range of Semantic Web technologies that have applications in areas of biomedicine. Semantic Web technologies present both promise and challenges. Current tools and standards are already adequate to implement components of the bench-to-bedside vision. On the other hand, these technologies are young. 
Gaps in standards and implementations still exist and adoption is limited by typical problems with early technology, such as the need for a critical mass of practitioners and installed base

  11. Advancing translational research with the Semantic Web

    Directory of Open Access Journals (Sweden)

    Marshall M Scott

    2007-05-01

    Full Text Available Abstract Background A fundamental goal of the U.S. National Institute of Health (NIH) "Roadmap" is to strengthen Translational Research, defined as the movement of discoveries in basic research to application at the clinical level. A significant barrier to translational research is the lack of uniformly structured data across related biomedical domains. The Semantic Web is an extension of the current Web that enables navigation and meaningful use of digital resources by automatic processes. It is based on common formats that support aggregation and integration of data drawn from diverse sources. A variety of technologies have been built on this foundation that, together, support identifying, representing, and reasoning across a wide range of biomedical data. The Semantic Web Health Care and Life Sciences Interest Group (HCLSIG), set up within the framework of the World Wide Web Consortium, was launched to explore the application of these technologies in a variety of areas. Subgroups focus on making biomedical data available in RDF, working with biomedical ontologies, prototyping clinical decision support systems, working on drug safety and efficacy communication, and supporting disease researchers navigating and annotating the large amount of potentially relevant literature. Results We present a scenario that shows the value of the information environment the Semantic Web can support for aiding neuroscience researchers. We then report on several projects by members of the HCLSIG, in the process illustrating the range of Semantic Web technologies that have applications in areas of biomedicine. Conclusion Semantic Web technologies present both promise and challenges. Current tools and standards are already adequate to implement components of the bench-to-bedside vision. On the other hand, these technologies are young. 
Gaps in standards and implementations still exist and adoption is limited by typical problems with early technology, such as the need

  12. Advancing Smart Grid Interoperability and Implementing NIST's Interoperability Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Basso,T.; DeBlasio, R.

    2010-04-01

    The IEEE American National Standards project P2030TM addressing smart grid interoperability and the IEEE 1547 series of standards addressing distributed resources interconnection with the grid have been identified in priority action plans in the Report to NIST on the Smart Grid Interoperability Standards Roadmap. This paper presents the status of the IEEE P2030 development, the IEEE 1547 series of standards publications and drafts, and provides insight on systems integration and grid infrastructure. The P2030 and 1547 series of standards are sponsored by IEEE Standards Coordinating Committee 21.

  13. Interoperability for electronic ID

    OpenAIRE

    Zygadlo, Zuzanna

    2009-01-01

    Electronic Business, including eBanking, eCommerce and eGovernmental services, is today based on a large variety of security solutions, comprising electronic IDs provided by a broad community of Public Key Infrastructure (PKI) vendors. Significant differences in implementations of those solutions introduce a problem of lack of interoperability in electronic business, which have not yet been resolved by standardization and interoperability initiatives based on existing PKI trust models. It i...

  14. Impact of coalition interoperability on PKI

    Science.gov (United States)

    Krall, Edward J.

    2003-07-01

    This paper examines methods for providing PKI interoperability among units of a coalition of armed forces drawn from different nations. The area in question is tactical identity management, for the purposes of confidentiality, integrity and non-repudiation in such a dynamic coalition. The interoperating applications under consideration range from email and other forms of store-and-forward messaging to TLS and IPSEC-protected real-time communications. Six interoperability architectures are examined with advantages and disadvantages of each described in the paper.

  15. Topics in Semantics-based Program Manipulation

    DEFF Research Database (Denmark)

    Grobauer, Bernt

    four articles in the field of semantics-based techniques for program manipulation: three articles are about partial evaluation, a method for program specialization; the fourth article treats an approach to automatic cost analysis. Partial evaluation optimizes programs by specializing them with respect...... article in this dissertation describes how the second Futamura projection can be achieved for type-directed partial evaluation (TDPE), a relatively recent approach to partial evaluation: We derive an ML implementation of the second Futamura projection for TDPE. Due to the differences between ‘traditional...... denotational semantics—allows us to relate various possible semantics to each other both conceptually and formally. We thus are able to explain goal-directed evaluation using an intuitive list-based semantics, while using a continuation semantics for semantics-based compilation through partial evaluation...

  16. Semantic enrichment of medical forms - semi-automated coding of ODM-elements via web services.

    Science.gov (United States)

    Breil, Bernhard; Watermann, Andreas; Haas, Peter; Dziuballe, Philipp; Dugas, Martin

    2012-01-01

    Semantic interoperability is an unsolved problem which occurs while working with medical forms from different information systems or institutions. Standards like ODM or CDA assure structural homogenization, but in order to compare elements from different data models it is necessary to use semantic concepts and codes on the item level of those structures. We developed and implemented a web-based tool which enables a domain expert to perform semi-automated coding of ODM files. For each item it is possible to query web services which return unique concept codes without leaving the context of the document. Although it was not feasible to perform totally automated coding, we implemented a dialog-based method to perform efficient coding of all data elements in the context of the whole document. The proportion of codable items was comparable to results from previous studies.
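
    The semi-automated, dialog-based coding workflow described in this record can be sketched as follows. This is an illustrative sketch only: the in-memory dictionary stands in for a real terminology web service, and the item labels and UMLS-style concept codes are placeholders, not verified mappings.

    ```python
    # Mock terminology lookup standing in for a real coding web service
    # (e.g. a UMLS or SNOMED CT query); all codes here are illustrative.
    TERMINOLOGY = {
        "systolic blood pressure": "C0871470",
        "heart rate": "C0018810",
        "body weight": "C0005910",
    }

    def code_items(item_labels):
        """Split ODM item labels into auto-coded pairs and items needing expert review."""
        coded, needs_review = {}, []
        for label in item_labels:
            concept = TERMINOLOGY.get(label.strip().lower())
            if concept is not None:
                coded[label] = concept          # exact match: coded automatically
            else:
                needs_review.append(label)      # shown to the domain expert in a dialog
        return coded, needs_review

    coded, review = code_items(["Heart Rate", "Body Weight", "Tumor Stage"])
    # "Tumor Stage" has no match in the mock terminology, so it is routed to review.
    ```

    The split between automatically coded items and a review queue is what makes the approach "semi-automated": the expert only handles the items the lookup cannot resolve.
    
    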

  17. IHE cross-enterprise document sharing for imaging: interoperability testing software

    Directory of Open Access Journals (Sweden)

    Renaud Bérubé

    2010-09-01

    Full Text Available Abstract Background With the deployment of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. Results In this paper we describe software that is used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross-Enterprise Document Sharing for Imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the selected design solutions. Conclusions EHR is being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specification ambiguities or to resolve implementation difficulties.

  18. Open Health Tools: Tooling for Interoperable Healthcare

    Directory of Open Access Journals (Sweden)

    Skip McGaughey

    2008-11-01

    Full Text Available The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will "raise the interoperability bar" as a result of having tools that just work. To achieve these lofty goals, careful consideration must be given to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT's impact on these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase "code is king" underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  19. Interoperability and health policy in Spain. The case of digital medical records from the intergovernmental perspective / Interoperabilidad y política sanitaria en España. El caso de la historia clínica digital desde una perspectiva intergubernamental

    Directory of Open Access Journals (Sweden)

    J. Ignacio Criado Grande

    2013-10-01

    Full Text Available During the last years, different interoperability projects have been undertaken around the world. Interoperability in electronic government enables the sharing of information and knowledge, and it is essential to fulfil some of the promises of Information and Communication Technologies in the public sector. This article presents the preliminary results of research on interoperability in health policy in Spain, with regard to the central and regional levels of government, which share the competences for regulation and management of this policy sector. The study also traces the evolution of the technical, semantic, and organizational dimensions of interoperability, together with the governance system, with special attention to relations between central and regional authorities. In so doing, we present the case of the Electronic Records of the National Health System, one of the most developed interoperability projects in Spain. The analysis of this case study sheds light on the new challenges and opportunities for the application of a multi-level governance system fostering ICTs in health policy in Spain.

  20. Maturity Model for Advancing Smart Grid Interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Knight, Mark; Widergren, Steven E.; Mater, J.; Montgomery, Austin

    2013-10-28

    Abstract—Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC) sponsored by the United States Department of Energy is supporting an effort to use concepts from capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.

  1. Grid interoperability: joining grid information systems

    International Nuclear Information System (INIS)

    Flechl, M; Field, L

    2008-01-01

    A grid is defined as being 'coordinated resource sharing and problem solving in dynamic, multi-institutional virtual organizations'. Over recent years a number of grid projects, many of which have a strong regional presence, have emerged to help coordinate institutions and enable grids. Today, we face a situation where a number of grid projects exist, most of which are using slightly different middleware. Grid interoperation is trying to bridge these differences and enable Virtual Organizations to access resources at the institutions independent of their grid project affiliation. Grid interoperation is usually a bilateral activity between two grid infrastructures. Recently within the Open Grid Forum, the Grid Interoperability Now (GIN) Community Group is trying to build upon these bilateral activities. The GIN group is a focal point where all the infrastructures can come together to share ideas and experiences on grid interoperation. It is hoped that each bilateral activity will bring us one step closer to the overall goal of a uniform grid landscape. A fundamental aspect of a grid is the information system, which is used to find available grid services. As different grids use different information systems, interoperation between these systems is crucial for grid interoperability. This paper describes the work carried out to overcome these differences between a number of grid projects and the experiences gained. It focuses on the different techniques used and highlights the important areas for future standardization

  2. Addressing issues in foundational ontology mediation

    CSIR Research Space (South Africa)

    Khan, ZC

    2013-09-01

    Full Text Available An approach to achieving semantic interoperability among heterogeneous systems is to offer infrastructure to assist with linking and integration using a foundational ontology. Due to the creation of multiple foundational ontologies, this also means...

  3. FLTSATCOM interoperability applications

    Science.gov (United States)

    Woolford, Lynn

    A mobile Fleet Satellite Communications (FLTSATCOM) system called the Mobile Operational Control Center (MOCC) was developed which has demonstrated the ability to interoperate with many of the current FLTSATCOM command and control channels. This low-cost system is secure in all its communications, is lightweight, and provides a gateway for other communications formats. The major elements of this system are a personal computer, a protocol microprocessor, and off-the-shelf mobile communication components. It is concluded that, with both FLTSATCOM channel-protocol and data-format interoperability, the MOCC has the ability to provide vital information in or near real time, which significantly improves mission effectiveness.

  4. Key pillars of data interoperability in Earth Sciences - INSPIRE and beyond

    Science.gov (United States)

    Tomas, Robert; Lutz, Michael

    2013-04-01

    The well-known heterogeneity and fragmentation of data models, formats and controlled vocabularies of environmental data prevent potential data users from utilising the wealth of environmental information available today across Europe. The main aim of INSPIRE1 is to improve this situation and give users the possibility to access, use and correctly interpret environmental data. Over the past years a number of INSPIRE technical guidelines (TG) and implementing rules (IR) for interoperability have been developed, involving hundreds of domain experts from across Europe. The data interoperability specifications, which have been developed for all 34 INSPIRE spatial data themes2, are the central component of the TG and IR. Several of these themes are related to the earth sciences, e.g. geology (including hydrogeology, geophysics and geomorphology), mineral and energy resources, soil science, natural hazards, meteorology, oceanography, hydrology and land cover. The following main pillars for data interoperability and harmonisation have been identified during the development of the specifications: Conceptual data models describe the spatial objects and their properties and relationships for the different spatial data themes. To achieve cross-domain harmonization, the data models for all themes are based on a common modelling framework (the INSPIRE Generic Conceptual Model3) and managed in a common UML repository. Harmonised vocabularies (or code lists) are to be used in data exchange in order to overcome interoperability issues caused by heterogeneous free-text and/or multi-lingual content. Since a mapping to a harmonized vocabulary can be difficult, the INSPIRE data models typically allow the provision of more specific terms from local vocabularies in addition to the harmonized terms, utilizing either the extensibility options or additional terminological attributes. Encoding.
Currently, specific XML profiles of the Geography Markup Language (GML) are promoted as the standard
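
    The harmonised-vocabulary pillar described in this record can be sketched as a small mapping step: a local free-text term is mapped to a harmonised code-list value where possible, and otherwise the local term is kept alongside a generic fallback value, mirroring the extensibility options the specifications describe. All code-list URIs and terms below are invented for the example and are not taken from the INSPIRE registry.

    ```python
    # Mock harmonised code list (illustrative URIs, not actual INSPIRE registry content).
    HARMONISED_LITHOLOGY = {
        "granite": "http://example.org/codelist/LithologyValue/granite",
        "sandstone": "http://example.org/codelist/LithologyValue/sandstone",
    }
    FALLBACK = "http://example.org/codelist/LithologyValue/other"

    def harmonise(local_term):
        """Return (harmonised_value, retained_local_term) for one local vocabulary entry."""
        value = HARMONISED_LITHOLOGY.get(local_term.lower())
        if value is not None:
            return value, None                 # clean mapping: no local term needed
        # No harmonised match: use the generic value and keep the local term
        # in an additional terminological attribute.
        return FALLBACK, local_term
    ```

    A data provider can thus always publish a harmonised value for interoperable exchange while preserving the more specific local term for users who need it.
    
    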

  5. Interoperability for Enterprise Systems and Applications

    CERN Document Server

    Jardim-Gonçalves, Ricardo; Popplewell, Keith; Mendonça, João

    2016-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VII will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Furthermore, it shows how knowledge of the meaning within information and the use to which it will be put have to be held in common between enterprises for consistent and efficient inter-enterprise networks. Over 30 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability show how, in a scenario of globalised markets, where the capacity to cooperate with other organizations efficiently is essential in order to remain economically, socially and environmentally cost-effective, the most innovative digitized and networked enterprises ensure that their systems and applications are able to interoperate across heterogeneous collabo...

  6. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Madureira, António; den Hartog, Frank; Goncalves da Silva, Eduardo; Baken, Nico; Zhao, L.; Macaulay, L.

    2009-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  7. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Madureira, A.; Den Hartog, F.; Silva, E.; Baken, N.

    2010-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  8. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Popplewell, Keith; Madureira, António; Harding, Jenny; den Hartog, Frank; Goncalves da Silva, Eduardo; Poler, Raul; Chalmeta, Ricardo; Baken, Nico

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  9. Clever generation of rich SPARQL queries from annotated relational schema: application to Semantic Web Service creation for biological databases.

    Science.gov (United States)

    Wollbrett, Julien; Larmande, Pierre; de Lamotte, Frédéric; Ruiz, Manuel

    2013-04-15

    In recent years, a large amount of "-omics" data have been produced. However, these data are stored in many different species-specific databases that are managed by different institutes and laboratories. Biologists often need to find and assemble data from disparate sources to perform certain analyses. Searching for these data and assembling them is a time-consuming task. The Semantic Web helps to facilitate interoperability across databases. A common approach involves the development of wrapper systems that map a relational database schema onto existing domain ontologies. However, few attempts have been made to automate the creation of such wrappers. We developed a framework, named BioSemantic, for the creation of Semantic Web Services that are applicable to relational biological databases. This framework makes use of both Semantic Web and Web Services technologies and can be divided into two main parts: (i) the generation and semi-automatic annotation of an RDF view; and (ii) the automatic generation of SPARQL queries and their integration into Semantic Web Services backbones. We have used our framework to integrate genomic data from different plant databases. BioSemantic is a framework that was designed to speed integration of relational databases. We present how it can be used to speed the development of Semantic Web Services for existing relational biological databases. Currently, it creates and annotates RDF views that enable the automatic generation of SPARQL queries. Web Services are also created and deployed automatically, and the semantic annotations of our Web Services are added automatically using SAWSDL attributes. BioSemantic is downloadable at http://southgreen.cirad.fr/?q=content/Biosemantic.
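
    The core idea behind generating SPARQL from an annotated relational schema can be sketched as follows: each column is annotated with an ontology property, and a SELECT query is assembled from those annotations. This is a simplified illustration under assumed names; all URIs are hypothetical, and BioSemantic's actual mapping rules (derived from its annotated RDF views) are considerably richer.

    ```python
    def build_sparql(class_uri, column_annotations):
        """Assemble a SPARQL SELECT query from column-to-property annotations.

        column_annotations: {variable_name: property_uri}, one entry per
        annotated column of the relational schema.
        """
        triples = [f"?s a <{class_uri}> ."]
        for var, prop in column_annotations.items():
            triples.append(f"?s <{prop}> ?{var} .")
        variables = " ".join(f"?{v}" for v in column_annotations)
        body = "\n  ".join(triples)
        return f"SELECT {variables}\nWHERE {{\n  {body}\n}}"

    # Hypothetical annotation of a 'gene' table: two columns mapped to
    # invented ontology properties.
    query = build_sparql(
        "http://example.org/onto#Gene",
        {"name": "http://example.org/onto#geneName",
         "chrom": "http://example.org/onto#chromosome"},
    )
    ```

    A wrapper service can then expose such generated queries as Semantic Web Service operations, one per annotated table or view.
    
    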

  10. Standards for open and interoperable digital libraries

    Directory of Open Access Journals (Sweden)

    Luís Fernando Sayão

    2007-12-01

    Full Text Available Interoperability is one of the main issues in creating a networked system of digital libraries. However, interoperability as the way to accomplish data exchange and service collaboration requires the adoption of a set of open standards covering all digital repository processes. The aim of this document is to review the most important standards, protocols and best practices that form the framework for an open and fully interoperable digital library.

  11. A Review of Interoperability Standards in E-health and Imperatives for their Adoption in Africa

    Directory of Open Access Journals (Sweden)

    Funmi Adebesin

    2013-07-01

    Full Text Available The ability of healthcare information systems to share and exchange information (interoperate) is essential to facilitate the quality and effectiveness of healthcare services. Although standardization is considered key to addressing the fragmentation currently challenging the healthcare environment, e-health standardization can be difficult for many reasons, one of which is making sense of the e-health interoperability standards landscape. Specifically aimed at the African health informatics community, this paper provides an overview of e-health interoperability and the significance of standardization in its achievement. We conducted a literature study of e-health standards, their development, and the degree of participation by African countries in the process. We also provide a review of a selection of prominent e-health interoperability standards that have been widely adopted, especially by developed countries, look at some of the factors that affect their adoption in Africa, and provide an overview of ongoing global initiatives to address the identified barriers. Although the paper is specifically aimed at the African community, its findings are equally applicable to many other developing countries.

  12. OlyMPUS - The Ontology-based Metadata Portal for Unified Semantics

    Science.gov (United States)

    Huffer, E.; Gleason, J. L.

    2015-12-01

    The Ontology-based Metadata Portal for Unified Semantics (OlyMPUS), funded by the NASA Earth Science Technology Office Advanced Information Systems Technology program, is an end-to-end system designed to support data consumers and data providers, enabling the latter to register their data sets and provision them with the semantically rich metadata that drives the Ontology-Driven Interactive Search Environment for Earth Sciences (ODISEES). OlyMPUS leverages the semantics and reasoning capabilities of ODISEES to provide data producers with a semi-automated interface for producing the semantically rich metadata needed to support ODISEES' data discovery and access services. It integrates the ODISEES metadata search system with multiple NASA data delivery tools to enable data consumers to create customized data sets for download to their computers, or for NASA Advanced Supercomputing (NAS) facility registered users, directly to NAS storage resources for access by applications running on NAS supercomputers. A core function of NASA's Earth Science Division is research and analysis that uses the full spectrum of data products available in NASA archives. Scientists need to perform complex analyses that identify correlations and non-obvious relationships across all types of Earth System phenomena. Comprehensive analytics are hindered, however, by the fact that many Earth science data products are disparate and hard to synthesize. Variations in how data are collected, processed, gridded, and stored, create challenges for data interoperability and synthesis, which are exacerbated by the sheer volume of available data. Robust, semantically rich metadata can support tools for data discovery and facilitate machine-to-machine transactions with services such as data subsetting, regridding, and reformatting. Such capabilities are critical to enabling the research activities integral to NASA's strategic plans. 
However, as metadata requirements increase and competing standards emerge

  13. Semantics-informed cartography: the case of Piemonte Geological Map

    Science.gov (United States)

    Piana, Fabrizio; Lombardo, Vincenzo; Mimmo, Dario; Giardino, Marco; Fubelli, Giandomenico

    2016-04-01

    In modern digital geological maps, namely those supported by a large geo-database and devoted to dynamic, interactive representation on WMS-WebGIS services, there is a need to make explicit the geological assumptions used in the design and compilation of the Map database, and to define and/or adopt semantic representations and taxonomies, in order to achieve a formal and interoperable representation of geologic knowledge. These approaches are fundamental for the integration and harmonisation of geological information and services across cultural (e.g. different scientific disciplines) and/or physical barriers (e.g. administrative boundaries). Initiatives such as the GeoScience Markup Language (latest version GeoSciML 4.0, 2015, http://www.geosciml.org) and the INSPIRE "Data Specification on Geology" http://inspire.jrc.ec.europa.eu/documents/Data_Specifications/INSPIRE_DataSpecification_GE_v3.0rc3.pdf (an operative simplification of GeoSciML, latest version 3.0 rc3, 2013), as well as the recent terminological shepherding of the Geoscience Terminology Working Group (GTWG), have been promoting the exchange of geologic knowledge. Grounded on these standard vocabularies, schemas and data models, we provide a shared semantic classification of geological data for the study case of the synthetic digital geological map of the Piemonte region (NW Italy), named "GEOPiemonteMap", developed by the CNR Institute of Geosciences and Earth Resources, Torino (CNR IGG TO) and hosted as a dynamic interactive map on the geoportal of the ARPA Piemonte Environmental Agency. The Piemonte Geological Map is grounded on a regional-scale geo-database consisting of some hundreds of GeologicUnits, whose thousands of instances (Mapped Features, polygon geometry) occur widely across the Piemonte region, each one bounded by GeologicStructures (Mapped Features, line geometry). GeologicUnits and GeologicStructures have been spatially

  14. Interoperability does matter

    Directory of Open Access Journals (Sweden)

    Manfred Goepel

    2006-04-01

    Full Text Available In companies, the historically developed IT systems are mostly application islands. They always produce good results if the system's requirements and surroundings are not changed and as long as a system interface is not needed. With the ever-increasing dynamics and globalization of the market, however, these IT islands are certain to collapse. Interoperability (IO) is the order of the day, assuming the integration of users, data, applications and processes. In the following, important IO enablers such as ETL, EAI, and SOA will be examined with regard to their practicability. It will be shown that especially SOA produces a surge of interoperability that could rightly be referred to as an IT evolution.

  15. Requirements for Interoperability in Healthcare Information Systems

    Directory of Open Access Journals (Sweden)

    Rita Noumeir

    2012-01-01

    Full Text Available Interoperability is a requirement for the successful deployment of Electronic Health Records (EHR). EHR improves the quality of healthcare by enabling access to all relevant information at the diagnostic decision moment, regardless of location. It is a system that results from the cooperation of several heterogeneous distributed subsystems that need to successfully exchange information relative to a specific healthcare process. This paper analyzes interoperability impediments in healthcare by first defining them and providing concrete healthcare examples, followed by a discussion of how specifications can be defined and how verification can be conducted to eliminate those impediments and ensure interoperability in healthcare. This paper also analyzes how Integrating the Healthcare Enterprise (IHE) has been successful in enabling interoperability, and identifies some neglected aspects that need attention.

  16. On MDA - SOA based Intercloud Interoperability framework

    Directory of Open Access Journals (Sweden)

    Tahereh Nodehi

    2013-01-01

    Full Text Available Cloud computing has been one of the latest technologies which assures reliable delivery of on-demand computing services over the Internet. Cloud service providers have established geographically distributed data centers and computing resources, which are available online as services. The clouds operated by different service providers working together in collaboration can open up many more spaces for innovative scenarios, with huge amounts of resources provisioned on demand. However, current cloud systems do not support intercloud interoperability. This paper is thus motivated to address intercloud interoperability by analyzing different methodologies that have been applied to resolve various scenarios of interoperability. Model Driven Architecture (MDA) and Service Oriented Architecture (SOA) methods have been used to address interoperability in various scenarios, which also opens up space to address intercloud interoperability by making use of these well-accepted methodologies. The focus of this document is to show that intercloud interoperability can be supported through a model-driven approach and service-oriented systems. Moreover, the current state of the art in intercloud computing and the concept and benefits of MDA and SOA are discussed in the paper. At the same time, this paper also proposes a generic architecture for an MDA-SOA based framework, which can be useful for developing applications that require intercloud interoperability. The paper justifies the usability of the framework with a use-case scenario for dynamic workload migration among heterogeneous clouds.

  17. IoT interoperability : a hub-based approach

    OpenAIRE

    Blackstock, Michael; Lea, Rodger

    2014-01-01

    Interoperability in the Internet of Things is critical for emerging services and applications. In this paper we advocate the use of IoT ‘hubs’ to aggregate things using web protocols, and suggest a staged approach to interoperability. In the context of a UK government funded project involving 8 IoT projects to address cross-domain IoT interoperability, we introduce the HyperCat IoT catalogue specification. We then describe the tools and techniques we developed to adapt an existing data portal...

  18. BIM Interoperability Limitations: Australian and Malaysian Rail Projects

    Directory of Open Access Journals (Sweden)

    Kenley Russell

    2016-01-01

    Full Text Available Building information modelling (BIM) is defined as a process involving the generation and management of digital representations of the physical and functional characteristics of a facility. The purpose of interoperability in integrated or "open" BIM is to facilitate information exchange between different digital systems, models and tools. There has been effort towards data interoperability with the development of open-source standards and object-oriented models, such as industry foundation classes (IFC) for vertical infrastructure. However, the lack of open data standards for information exchange for horizontal infrastructure limits the adoption and effectiveness of integrated BIM. The paper outlines two interoperability issues for the construction of rail infrastructure. The issues are presented in two case study reports, one from Australia and one from Malaysia. Each case study includes: a description of the project, the application of BIM in the project, and a discussion of the promised BIM interoperability solution, plus the identification of the unresolved lack of interoperability for horizontal infrastructure project management. The Moreton Bay Rail project in Australia introduces general software interoperability issues. The Light Rail Extension project in Kuala Lumpur outlines an example of the integration problems related to two different location data structures. The paper highlights how the continuing lack of data interoperability limits the utilisation of integrated BIM for horizontal infrastructure rail projects.

  19. Personalized-Detailed Clinical Model for Data Interoperability Among Clinical Standards

    OpenAIRE

    Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir; Lee, Sungyoung

    2013-01-01

    Objective: Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners to enable provisioning of telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among different heterogeneous healthcare standards with which these are compliant. The relationship between healthcare organization data and different heterogeneous standards is necessary to achieve the goal of data level interoperabi...

  20. Using Semantic Web technologies for the generation of domain-specific templates to support clinical study metadata standards.

    Science.gov (United States)

    Jiang, Guoqian; Evans, Julie; Endle, Cory M; Solbrig, Harold R; Chute, Christopher G

    2016-01-01

    The Biomedical Research Integrated Domain Group (BRIDG) model is a formal domain analysis model for protocol-driven biomedical research, and serves as a semantic foundation for application and message development in the standards developing organizations (SDOs). The increasing sophistication and complexity of the BRIDG model requires new approaches to the management and utilization of the underlying semantics to harmonize domain-specific standards. The objective of this study is to develop and evaluate a Semantic Web-based approach that integrates the BRIDG model with ISO 21090 data types to generate domain-specific templates to support clinical study metadata standards development. We developed a template generation and visualization system based on an open source Resource Description Framework (RDF) store backend, a SmartGWT-based web user interface, and a "mind map" based tool for the visualization of generated domain-specific templates. We also developed a RESTful Web Service informed by the Clinical Information Modeling Initiative (CIMI) reference model for access to the generated domain-specific templates. A preliminary usability study was performed, and all reviewers (n = 3) gave very positive responses to the evaluation questions in terms of the usability and the capability of meeting the system requirements (with an average score of 4.6). Semantic Web technologies provide a scalable infrastructure and have great potential to enable computable semantic interoperability of models in the intersection of health care and clinical research.

  1. Semantic Web repositories for genomics data using the eXframe platform.

    Science.gov (United States)

    Merrill, Emily; Corlosquet, Stéphane; Ciccarese, Paolo; Clark, Tim; Das, Sudeshna

    2014-01-01

    With the advent of inexpensive assay technologies, there has been an unprecedented growth in genomics data as well as in the number of databases in which it is stored. In these databases, sample annotation using ontologies and controlled vocabularies is becoming more common. However, the annotation is rarely available as Linked Data, in a machine-readable format, or for standardized queries using SPARQL. This makes large-scale reuse or integration with other knowledge bases very difficult. To address this challenge, we have developed the second generation of our eXframe platform, a reusable framework for creating online repositories of genomics experiments. This second generation model now publishes Semantic Web data. To accomplish this, we created an experiment model that covers provenance, citations, external links, assays, biomaterials used in the experiment, and the data collected during the process. The elements of our model are mapped to classes and properties from various established biomedical ontologies. Resource Description Framework (RDF) data is automatically produced using these mappings and indexed in an RDF store with a built-in Sparql Protocol and RDF Query Language (SPARQL) endpoint. Using the open-source eXframe software, institutions and laboratories can create Semantic Web repositories of their experiments, integrate them with heterogeneous resources, and make them interoperable with the vast Semantic Web of biomedical knowledge.
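
The core of the approach above — mapping experiment fields to ontology properties and emitting RDF — can be sketched in plain Python. The vocabulary and experiment URIs below are invented placeholders, not the actual eXframe mappings:

```python
# Sketch: map a genomics-experiment record to RDF N-Triples.
# The vocabulary URIs are hypothetical stand-ins for the established
# biomedical ontology terms that a platform like eXframe maps to.

EX = "http://example.org/experiment/"
VOCAB = "http://example.org/vocab#"

MAPPINGS = {                      # record field -> RDF predicate
    "title": VOCAB + "title",
    "assay": VOCAB + "usedAssay",
    "biomaterial": VOCAB + "usedBiomaterial",
}

def to_ntriples(record_id, record):
    """Serialize one experiment record as N-Triples lines."""
    subject = f"<{EX}{record_id}>"
    lines = []
    for field, predicate in MAPPINGS.items():
        if field in record:
            lines.append(f'{subject} <{predicate}> "{record[field]}" .')
    return "\n".join(lines)

experiment = {"title": "Microarray study", "assay": "expression profiling"}
print(to_ntriples("exp42", experiment))
```

A real implementation would use an RDF library and established ontology terms rather than hand-built N-Triples strings, and would index the output in a SPARQL-capable store.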

  2. Agent Based Knowledge Management Solution using Ontology, Semantic Web Services and GIS

    Directory of Open Access Journals (Sweden)

    Andreea DIOSTEANU

    2009-01-01

    Full Text Available The purpose of our research is to develop an agent based knowledge management application framework using a specific type of ontology that is able to facilitate semantic web service search and automatic composition. This solution can later on be used to develop complex solutions for location based services, supply chain management, etc. This application for modeling knowledge highlights the importance of agent interaction that leads to efficient enterprise interoperability. Furthermore, it proposes an "agent communication language" ontology that extends the OWL Lite standard approach and makes it more flexible in retrieving proper data for identifying the agents that can best communicate and negotiate.

  3. PACS/information systems interoperability using Enterprise Communication Framework.

    Science.gov (United States)

    alSafadi, Y; Lord, W P; Mankovich, N J

    1998-06-01

    Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).
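
The flow from a RIS order to a PACS worklist entry through a shared domain model can be illustrated with a toy mapping. The field names here are hypothetical, not the actual ECF Domain Information Model:

```python
# Sketch: a Domain-Information-Model-style translation between a RIS
# order and a PACS worklist entry. Both systems map their local field
# names onto a shared model instead of onto each other directly.
# All field names are invented for illustration.

RIS_TO_DIM = {"patient_name": "subjectName",
              "accession_no": "studyIdentifier",
              "modality": "requestedModality"}

DIM_TO_PACS = {"subjectName": "PatientName",
               "studyIdentifier": "AccessionNumber",
               "requestedModality": "Modality"}

def translate(message, *maps):
    """Re-key a message dict through successive field mappings."""
    for mapping in maps:
        message = {mapping[k]: v for k, v in message.items() if k in mapping}
    return message

ris_order = {"patient_name": "DOE^JANE", "accession_no": "A123",
             "modality": "CT"}
print(translate(ris_order, RIS_TO_DIM, DIM_TO_PACS))
```

The point of routing through the shared model is that adding an n-th system requires one mapping to the model, not n-1 pairwise mappings.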

  4. Clever generation of rich SPARQL queries from annotated relational schema: application to Semantic Web Service creation for biological databases

    Science.gov (United States)

    2013-01-01

    Background In recent years, a large amount of “-omics” data have been produced. However, these data are stored in many different species-specific databases that are managed by different institutes and laboratories. Biologists often need to find and assemble data from disparate sources to perform certain analyses. Searching for these data and assembling them is a time-consuming task. The Semantic Web helps to facilitate interoperability across databases. A common approach involves the development of wrapper systems that map a relational database schema onto existing domain ontologies. However, few attempts have been made to automate the creation of such wrappers. Results We developed a framework, named BioSemantic, for the creation of Semantic Web Services that are applicable to relational biological databases. This framework makes use of both Semantic Web and Web Services technologies and can be divided into two main parts: (i) the generation and semi-automatic annotation of an RDF view; and (ii) the automatic generation of SPARQL queries and their integration into Semantic Web Services backbones. We have used our framework to integrate genomic data from different plant databases. Conclusions BioSemantic is a framework that was designed to speed integration of relational databases. We present how it can be used to speed the development of Semantic Web Services for existing relational biological databases. Currently, it creates and annotates RDF views that enable the automatic generation of SPARQL queries. Web Services are also created and deployed automatically, and the semantic annotations of our Web Services are added automatically using SAWSDL attributes. BioSemantic is downloadable at http://southgreen.cirad.fr/?q=content/Biosemantic. PMID:23586394
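
The automatic SPARQL generation step can be sketched as string assembly from a column-to-property annotation. All URIs are illustrative, not BioSemantic's actual output:

```python
# Sketch: generate a SPARQL SELECT query from a relational column ->
# ontology property annotation, in the spirit of BioSemantic's
# automatic query generation. All URIs are invented placeholders.

def build_sparql(class_uri, column_annotations):
    """column_annotations: {sparql_variable: property_uri}"""
    vars_ = " ".join(f"?{v}" for v in column_annotations)
    patterns = "\n".join(
        f"  ?s <{prop}> ?{var} ." for var, prop in column_annotations.items()
    )
    return (f"SELECT {vars_}\nWHERE {{\n"
            f"  ?s a <{class_uri}> .\n{patterns}\n}}")

query = build_sparql(
    "http://example.org/onto#Gene",
    {"name": "http://example.org/onto#geneName",
     "chrom": "http://example.org/onto#chromosome"},
)
print(query)
```

A generated query like this would then be wrapped behind a Web Service interface so clients never touch the relational schema directly.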

  5. Reference architecture and interoperability model for data mining and fusion in scientific cross-domain infrastructures

    Science.gov (United States)

    Haener, Rainer; Waechter, Joachim; Grellet, Sylvain; Robida, Francois

    2017-04-01

    Interoperability is the key factor in establishing scientific research environments and infrastructures, as well as in bringing together heterogeneous, geographically distributed risk management, monitoring, and early warning systems. Based on developments within the European Plate Observing System (EPOS), a reference architecture has been devised that comprises architectural blueprints and interoperability models regarding the specification of business processes and logic as well as the encoding of data, metadata, and semantics. The architectural blueprint is developed on the basis of the so-called service-oriented architecture (SOA) 2.0 paradigm, which combines the intelligence and proactiveness of event-driven architectures with service-oriented architectures. SOA 2.0 supports analysing (Data Mining) both static and real-time data in order to find correlations of disparate information that do not at first appear to be intuitively obvious: Analysed data (e.g., seismological monitoring) can be enhanced with relationships discovered by associating them (Data Fusion) with other data (e.g., creepmeter monitoring), with digital models of geological structures, or with the simulation of geological processes. The interoperability model describes the information, communication (conversations) and the interactions (choreographies) of all participants involved as well as the processes for registering, providing, and retrieving information. It is based on the principles of functional integration, implemented via dedicated services, communicating via service-oriented and message-driven infrastructures. The services provide their functionality via standardised interfaces: Instead of requesting data directly, users share data via services that are built upon specific adapters. This approach replaces the tight coupling at data level by a flexible dependency on loosely coupled services. The main component of the interoperability model is the comprehensive semantic description of the information
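
The event-driven half of the SOA 2.0 pattern described above can be reduced to a minimal publish/subscribe sketch, with loosely coupled services reacting to topics rather than calling each other directly (the topics and payloads are invented):

```python
# Sketch: a message-driven bus in the SOA 2.0 spirit. Services
# subscribe to topics and react to events; publishers know nothing
# about subscribers. Topic names and payloads are illustrative.

from collections import defaultdict

class MessageBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self._subscribers[topic]:
            handler(event)

bus = MessageBus()
correlations = []

# A hypothetical fusion service correlating seismological events
# with creepmeter readings as they arrive.
bus.subscribe("seismic", lambda e: correlations.append(("seismic", e)))
bus.subscribe("creepmeter", lambda e: correlations.append(("creep", e)))

bus.publish("seismic", {"magnitude": 3.1})
bus.publish("creepmeter", {"displacement_mm": 0.4})
print(correlations)
```

This is the loose coupling the abstract describes: the seismic publisher is unaware of the fusion service, so either side can be replaced independently.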

  6. Convergence of Health Level Seven Version 2 Messages to Semantic Web Technologies for Software-Intensive Systems in Telemedicine Trauma Care.

    Science.gov (United States)

    Menezes, Pedro Monteiro; Cook, Timothy Wayne; Cavalini, Luciana Tricai

    2016-01-01

    To present the technical background and the development of a procedure that enriches the semantics of Health Level Seven version 2 (HL7v2) messages for software-intensive systems in telemedicine trauma care. This study followed a multilevel model-driven approach for the development of semantically interoperable health information systems. The Pre-Hospital Trauma Life Support (PHTLS) ABCDE protocol was adopted as the use case. A prototype application embedded the semantics into an HL7v2 message as an eXtensible Markup Language (XML) file, which was validated against an XML schema that defines constraints on a common reference model. This message was exchanged with a second prototype application, developed on the Mirth middleware, which was also used to parse and validate both the original and the hybrid messages. Both versions of the data instance (one pure XML, one embedded in the HL7v2 message) were equally validated and the RDF-based semantics recovered by the receiving side of the prototype from the shared XML schema. This study demonstrated the semantic enrichment of HL7v2 messages for software-intensive telemedicine systems for trauma care, by validating components of extracts generated in various computing environments. The adoption of the method proposed in this study ensures the compliance of the HL7v2 standard with Semantic Web technologies.
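
For orientation, an HL7v2 message is a set of pipe-delimited segments; a minimal parser can be sketched as below. The message content is invented, and the sketch ignores HL7v2 details such as field repetitions, escape sequences, and the special MSH field numbering:

```python
# Sketch: minimal parsing of an HL7v2 pipe-delimited message into
# segments and fields -- the structure whose semantics the study
# enriches. The message content is invented for illustration, and
# repetitions, escapes, and MSH-1 numbering are deliberately ignored.

def parse_hl7v2(message):
    """Return {segment_id: [fields]} for a simple, repeat-free message."""
    segments = {}
    for line in message.strip().split("\r"):
        fields = line.split("|")
        segments[fields[0]] = fields[1:]
    return segments

msg = ("MSH|^~\\&|TRAUMA_APP|FIELD|HOSPITAL|ED|202401010830||ADT^A01|1|P|2.5"
       "\rPID|1||12345||DOE^JOHN")
parsed = parse_hl7v2(msg)
print(parsed["PID"])
```

The study's contribution sits on top of this plain structure: attaching machine-processable semantics to the otherwise positional fields.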

  7. Enhancing Data Interoperability with Web Services

    Science.gov (United States)

    Shrestha, S. R.; Zimble, D. A.; Wang, W.; Herring, D.; Halpert, M.

    2014-12-01

    In an effort to improve data access and interoperability of climate and weather data, the National Oceanic and Atmospheric Administration's (NOAA) Climate.gov and Climate Prediction Center (CPC) are exploring various platform solutions to enhance a user's ability to locate, preview, and acquire the data. The Climate.gov and CPC data team faces multiple challenges including the various kinds of data and formats, inconsistency of metadata records, variety of data service implementations, very large volumes of data and geographically distributed locations. We have created the Data Access and Interoperability project to design a web-based platform, where interoperability between systems can be leveraged to allow greater data discovery, access, visualization and delivery. In the interoperable data platform, systems can integrate with each other to support the synthesis of climate and weather data. Interoperability is the ability for users to discover the available climate and weather data, preview and interact with the data, and acquire the data in common digital formats through a simple web-based interface. The goal of the interoperable data platform is to leverage existing web services, implement the established standards and integrate with existing solutions across the earth sciences domain instead of creating new technologies. Towards this effort to improve the interoperability of the platform, we are collaborating with ESRI Inc. to provide climate and weather data via web services. In this presentation, we will discuss and demonstrate how to use ArcGIS to author RESTful based scientific web services using open standards. These web services are able to encapsulate the logic required to handle and describe scientific data through a variety of service types including, image, map, feature, geoprocessing, and their respective service methods. Combining these types of services and leveraging well-documented APIs, including the ArcGIS JavaScript API, we can afford to
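
Consuming such a RESTful service typically amounts to composing a parameterised URL. The endpoint and parameter names below are hypothetical, not an actual Climate.gov or ArcGIS service:

```python
# Sketch: composing a request to a RESTful scientific web service of
# the kind described above. The base URL and parameter names are
# invented placeholders, not a real Climate.gov or ArcGIS endpoint.

from urllib.parse import urlencode

def build_query_url(base, layer, bbox, out_format="json"):
    """Build a map/image-service style export request URL."""
    params = {"bbox": ",".join(str(c) for c in bbox),
              "layer": layer,
              "f": out_format}
    return f"{base}?{urlencode(params)}"

url = build_query_url(
    "https://data.example.gov/climate/MapServer/export",
    "precipitation_monthly",
    (-125.0, 24.0, -66.0, 50.0),  # continental-US bounding box
)
print(url)
```

The value of standardising such services is that the same client code can discover, preview, and acquire data from any conforming endpoint.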

  8. A semantic security framework for systems of systems

    NARCIS (Netherlands)

    Trivellato, Daniel; Zannone, Nicola; Glaundrup, Maurice; Skowronek, Jacek; Etalle, Sandro

    2013-01-01

    Systems of systems (SoS) are dynamic coalitions of distributed, autonomous and heterogeneous systems that collaborate to achieve a common goal. While offering several advantages in terms of scalability and flexibility, the SoS paradigm has a strong impact on systems interoperability and on the

  9. Developing data aggregation applications from a community standard semantic resource (Invited)

    Science.gov (United States)

    Leadbetter, A.; Lowry, R. K.

    2013-12-01

    The semantic content of the NERC Vocabulary Server (NVS) has been developed over thirty years. It has been used to mark up metadata and data in a wide range of international projects, including the European Commission (EC) Framework Programme 7 projects SeaDataNet and The Open Service Network for Marine Environmental Data (NETMAR). Within the United States, the National Science Foundation projects Rolling Deck to Repository and Biological & Chemical Data Management Office (BCO-DMO) use concepts from NVS for markup. Further, typed relationships exist between NVS concepts and terms served by the Marine Metadata Interoperability Ontology Registry and Repository. The vast majority of the concepts publicly served from NVS (35% of ~82,000) form the British Oceanographic Data Centre (BODC) Parameter Usage Vocabulary (PUV). The PUV is instantiated on the NVS as a SKOS concept collection. These terms are used to describe the individual channels in data and metadata served by, for example, BODC, SeaDataNet and BCO-DMO. The PUV terms are designed to be very precise and may contain a high level of detail. Some users have reported that the PUV is difficult to navigate due to its size and complexity (a problem CSIRO have begun to address by deploying a SISSVoc interface to the NVS), and it has been difficult to aggregate data as multiple PUV terms can - with full validity - be used to describe the same data channels. Better approaches to data aggregation are required as a use case for the PUV from the EC European Marine Observation and Data Network (EMODnet) Chemistry project. One solution, proposed and demonstrated during the course of the NETMAR project, is to build new SKOS concept collections which formalise the desired aggregations for given applications, and to use typed relationships to state which PUV concepts contribute to a specific aggregation. Development of these new collections requires input from a group of experts in the application domain who can decide which PUV
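
The proposed aggregation collections are plain SKOS; emitting one in Turtle can be sketched as follows, with invented placeholder URIs standing in for real PUV concepts:

```python
# Sketch: emit a SKOS concept collection, in Turtle, that aggregates
# several parameter-usage concepts for one application, in the spirit
# of the NETMAR approach. The collection and concept URIs are
# invented placeholders, not real NVS/PUV identifiers.

def skos_collection(collection_uri, label, member_uris):
    """Return Turtle text for one skos:Collection with its members."""
    members = ",\n    ".join(f"<{m}>" for m in member_uris)
    return (f"@prefix skos: <http://www.w3.org/2004/02/skos/core#> .\n\n"
            f"<{collection_uri}> a skos:Collection ;\n"
            f'  skos:prefLabel "{label}"@en ;\n'
            f"  skos:member\n    {members} .")

turtle = skos_collection(
    "http://example.org/agg/dissolved_oxygen",
    "Dissolved oxygen (all methods)",
    ["http://example.org/puv/DOXY1", "http://example.org/puv/DOXY2"],
)
print(turtle)
```

A data aggregator can then resolve one collection URI to find every precise PUV term that should be folded into the same application-level parameter.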

  10. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    Science.gov (United States)

    2017-10-01

    Award Number: W81XWH-09-1-0705. Title: “Medical Device Plug-and-Play Interoperability Standards and Technology Leadership.” Reporting period: Sept 2016 – 20 Sept 2017. …efficiency through interoperable medical technologies. We played a leadership role on interoperability safety standards (AAMI, AAMI/UL Joint

  11. Interoperability for Entreprise Systems and Applications '12

    CERN Document Server

    Doumeingts, Guy; Katzy, Bernhard; Chalmeta, Ricardo

    2012-01-01

    Within a scenario of globalised markets, where the capacity to efficiently cooperate with other firms starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, it can be seen how the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected organisations or extended enterprises, as well as in mergers and acquisitions. Composed of over 40 papers, Enterprise Interoperability V ranges from academic research through case studies to industrial and administrative experience of interoperability. The international nature of the authorship continues to broaden. Many of the papers have examples and illustrations calculated to deepen understanding and generate new ideas. The I-ESA'12 Co...

  12. Semantic Context Detection Using Audio Event Fusion

    Directory of Open Access Journals (Sweden)

    Cheng Wen-Huang

    2006-01-01

    Full Text Available Semantic-level content analysis is a crucial issue in achieving efficient content retrieval and management. We propose a hierarchical approach that models audio events over a time series in order to accomplish semantic context detection. Two levels of modeling, audio event and semantic context modeling, are devised to bridge the gap between physical audio features and semantic concepts. In this work, hidden Markov models (HMMs) are used to model four representative audio events, that is, gunshot, explosion, engine, and car braking, in action movies. At the semantic context level, generative (ergodic hidden Markov model) and discriminative (support vector machine (SVM)) approaches are investigated to fuse the characteristics and correlations among audio events, which provide cues for detecting gunplay and car-chasing scenes. The experimental results demonstrate the effectiveness of the proposed approaches and provide a preliminary framework for information mining by using audio characteristics.
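
The likelihood computation underlying such HMM-based event models is the forward algorithm, sketched here for a discrete toy model (the states, symbols, and probabilities are illustrative, not the paper's trained models):

```python
# Sketch: the forward algorithm for a discrete HMM, the model family
# used for the audio events above. The two states, two symbols, and
# all probabilities are toy values invented for illustration.

def forward_likelihood(pi, A, B, observations):
    """P(observations) under an HMM (pi: initial, A: transition, B: emission)."""
    n_states = len(pi)
    # Initialise with the first observation.
    alpha = [pi[s] * B[s][observations[0]] for s in range(n_states)]
    # Propagate forward through the remaining observations.
    for obs in observations[1:]:
        alpha = [
            sum(alpha[p] * A[p][s] for p in range(n_states)) * B[s][obs]
            for s in range(n_states)
        ]
    return sum(alpha)

# Two hidden states ("quiet", "action"), two symbols (0: low energy, 1: burst).
pi = [0.8, 0.2]
A = [[0.9, 0.1], [0.3, 0.7]]
B = [[0.95, 0.05], [0.2, 0.8]]
print(forward_likelihood(pi, A, B, [0, 1, 1]))
```

In an event-detection setting, one such model is trained per audio event, and a frame sequence is assigned to the event whose model gives it the highest likelihood.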

  13. How Chinese Semantics Capability Improves Interpretation in Visual Communication

    Science.gov (United States)

    Cheng, Chu-Yu; Ou, Yang-Kun; Kin, Ching-Lung

    2017-01-01

    A visual representation involves delivering messages through visually communicated images. The study assumed that semantic recognition can affect visual interpretation ability, and the result showed that students graduating from a general high school achieve more satisfactory results in semantic recognition and image interpretation tasks than students…

  14. Development of high performance scientific components for interoperability of computing packages

    Energy Technology Data Exchange (ETDEWEB)

    Gulabani, Teena Pratap [Iowa State Univ., Ames, IA (United States)

    2008-01-01

    Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. Chemistry algorithms are difficult and time-consuming to develop; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.
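
The CCA plug-and-play idea rests on components that provide and use typed ports. A toy rendering of that pattern follows, with invented interface and component names rather than the actual CCA specification or package interfaces:

```python
# Sketch: the provides/uses port pattern behind CCA-style plug-and-play
# components. 'EnergyPort', 'ToyQMComponent', and 'Driver' are invented
# names; the real CCA and chemistry-package interfaces differ.

from abc import ABC, abstractmethod

class EnergyPort(ABC):
    """A 'provides' port: any package exposing energies implements it."""
    @abstractmethod
    def total_energy(self, geometry):
        ...

class ToyQMComponent(EnergyPort):
    def total_energy(self, geometry):
        # Stand-in for a real quantum-chemistry calculation.
        return -1.0 * len(geometry)

class Driver:
    """A component 'using' the port, unaware of which package provides it."""
    def __init__(self):
        self._port = None

    def connect(self, port: EnergyPort):
        self._port = port

    def run(self, geometry):
        return self._port.total_energy(geometry)

driver = Driver()
driver.connect(ToyQMComponent())      # the framework wires provides -> uses
print(driver.run([("H", 0.0), ("H", 0.74)]))
```

Because the driver depends only on the port, another package implementing the same port can be swapped in without touching the driver — the interoperability the thesis pursues.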

  15. Electronic Toll Collection Systems and their Interoperability: The State of Art

    Energy Technology Data Exchange (ETDEWEB)

    Heras Molina, J. de la; Gomez Sanchez, J.; Vassallo Magro, J.M.

    2016-07-01

    The European Electronic Toll Service (EETS) was created in 2004 with the aim of ensuring interoperability among the existing electronic toll collection (ETC) systems in Europe. However, the lack of cooperation between groups of stakeholders has prevented this goal from being achieved ten years later. The purpose of this research is to determine the best way to achieve interoperability among the different ETC systems in Europe. Our study develops a review of the six main ETC systems available worldwide: Automatic Number Plate Recognition (ANPR), Dedicated Short-Range Communications (DSRC), Radio Frequency Identification (RFID), Satellite systems (GNSS), Tachograph, and Mobile communications tolling systems. The research also provides some insight on different emerging technologies. By focusing on different operational and strategic aspects offered by each technology, we identify their main strengths, weaknesses, opportunities and threats and make recommendations to improve the current framework. The research concludes that given the diversity of advantages and inconveniences offered by each system, the selection of a certain ETC technology should also take into account its potential to overcome the weaknesses in the current ETC framework. In this line, different policy recommendations are proposed to improve the present ETC strategy at the EU. (Author)

  16. Social Semantics for an Effective Enterprise

    Science.gov (United States)

    Berndt, Sarah; Doane, Mike

    2012-01-01

    An evolution of the Semantic Web, the Social Semantic Web (s2w), facilitates knowledge sharing with "useful information based on human contributions, which gets better as more people participate." The s2w reaches beyond the search box to move us from a collection of hyperlinked facts, to meaningful, real time context. When focused through the lens of Enterprise Search, the Social Semantic Web facilitates the fluid transition of meaningful business information from the source to the user. It is the confluence of human thought and computer processing structured with the iterative application of taxonomies, folksonomies, ontologies, and metadata schemas. The importance and nuances of human interaction are often deemphasized when focusing on automatic generation of semantic markup, which results in dissatisfied users and unrealized return on investment. Users consistently qualify the value of information sets through the act of selection, making them the de facto stakeholders of the Social Semantic Web. Employers are the ultimate beneficiaries of s2w utilization with a better informed, more decisive workforce; one not achieved with an IT miracle technology, but by improved human-computer interactions. Johnson Space Center Taxonomist Sarah Berndt and Mike Doane, principal owner of Term Management, LLC, discuss the planning, development, and maintenance stages for components of a semantic system while emphasizing the necessity of a Social Semantic Web for the Enterprise. Identification of risks and variables associated with layering the successful implementation of a semantic system are also modeled.

  17. Improvements to the Ontology-based Metadata Portal for Unified Semantics (OlyMPUS)

    Science.gov (United States)

    Linsinbigler, M. A.; Gleason, J. L.; Huffer, E.

    2016-12-01

    The Ontology-based Metadata Portal for Unified Semantics (OlyMPUS), funded by the NASA Earth Science Technology Office Advanced Information Systems Technology program, is an end-to-end system designed to support Earth Science data consumers and data providers, enabling the latter to register data sets and provision them with the semantically rich metadata that drives the Ontology-Driven Interactive Search Environment for Earth Sciences (ODISEES). OlyMPUS complements the ODISEES' data discovery system with an intelligent tool to enable data producers to auto-generate semantically enhanced metadata and upload it to the metadata repository that drives ODISEES. Like ODISEES, the OlyMPUS metadata provisioning tool leverages robust semantics, a NoSQL database and query engine, an automated reasoning engine that performs first- and second-order deductive inferencing, and uses a controlled vocabulary to support data interoperability and automated analytics. The ODISEES data discovery portal leverages this metadata to provide a seamless data discovery and access experience for data consumers who are interested in comparing and contrasting the multiple Earth science data products available across NASA data centers. OlyMPUS will support scientists' services and tools for performing complex analyses and identifying correlations and non-obvious relationships across all types of Earth System phenomena using the full spectrum of NASA Earth Science data available. By providing an intelligent discovery portal that supplies users - both human users and machines - with detailed information about data products, their contents and their structure, ODISEES will reduce the level of effort required to identify and prepare large volumes of data for analysis. This poster will explain how OlyMPUS leverages deductive reasoning and other technologies to create an integrated environment for generating and exploiting semantically rich metadata.

  18. Interoperability Guidelines for Lithuanian E-Learning Management Systems

    Directory of Open Access Journals (Sweden)

    Eugenijus Kurilovas

    2011-08-01

    Full Text Available Purpose – the paper aims to analyse e-learning content and repositories along with the problems of learning organisation interoperability. The main objective of the paper is to analyse scientific research results and the newest international experience in the area and to provide interoperability guidelines and recommendations for the implementation of appropriate Lithuanian state programmes. The learning content and repositories recommendations are designed for the implementation of the Lithuanian education portal project as well as the Lithuanian Virtual University (LVU) programme’s information services’ (LABT / eLABa) and e-learning services’ (LieDM) sub-programmes. The whole education institution recommendations are designed for the maintenance and development of the LVU programme’s management services’ (LieMSIS) system. Design/methodology/approach – methods used for the general analysis of the proposed interoperability guidelines (recommendations) were bibliographic research and comparative analysis of Lithuanian and foreign scientific works published in periodicals and of large-scale EU-funded interoperability projects’ deliverables. System analysis and comparative analysis methods were used in order to formulate and analyse systems’ interoperability guidelines and recommendations. The author employed the experimental research method while working in the appropriate EU-funded interoperability projects to form the guidelines (recommendations). In order to summarize the results, the evaluative research method was used. Findings – the international guidelines and recommendations presented in the paper could be suitable for implementation while developing Lithuanian state education information systems such as the Lithuanian education portal, the Lithuanian academic libraries’ (eLABa) system, the Lithuanian distance learning system (LieDM), and the Lithuanian universities’ management system (LieMSIS). Research limitations/implications – the paper

  19. Interoperability Guidelines for Lithuanian E-Learning Management Systems

    Directory of Open Access Journals (Sweden)

    Eugenijus Kurilovas

    2013-08-01

    Full Text Available Purpose – the paper aims to analyse e-learning content and repositories along with the problems of learning organisation interoperability. The main objective of the paper is to analyse scientific research results and the newest international experience in the area and to provide interoperability guidelines and recommendations for the implementation of appropriate Lithuanian state programmes. The learning content and repositories recommendations are designed for the implementation of the Lithuanian education portal project as well as the Lithuanian Virtual University (LVU) programme’s information services’ (LABT / eLABa) and e-learning services’ (LieDM) sub-programmes. The whole education institution recommendations are designed for the maintenance and development of the LVU programme’s management services’ (LieMSIS) system. Design/methodology/approach – methods used for the general analysis of the proposed interoperability guidelines (recommendations) were bibliographic research and comparative analysis of Lithuanian and foreign scientific works published in periodicals and of large-scale EU-funded interoperability projects’ deliverables. System analysis and comparative analysis methods were used in order to formulate and analyse systems’ interoperability guidelines and recommendations. The author employed the experimental research method while working in the appropriate EU-funded interoperability projects to form the guidelines (recommendations). In order to summarize the results, the evaluative research method was used. Findings – the international guidelines and recommendations presented in the paper could be suitable for implementation while developing Lithuanian state education information systems such as the Lithuanian education portal, the Lithuanian academic libraries’ (eLABa) system, the Lithuanian distance learning system (LieDM), and the Lithuanian universities’ management system (LieMSIS). Research limitations/implications – the paper

  20. Interoperability of Web Archives and Digital Libraries

    DEFF Research Database (Denmark)

    Kalb, Hendrik; Lazaridou, Paraskevi; Pinsent, Edward

    2013-01-01

    The interoperability of web archives and digital libraries is crucial to avoid silos of preserved data and content. While various research efforts focus on specific facets of the interoperability challenge, there is a lack of empirical work on the overall situation of actual challenges. We conduct...

  1. Semantic Blogging : Spreading the Semantic Web Meme

    OpenAIRE

    Cayzer, Steve

    2004-01-01

    This paper is about semantic blogging, an application of the semantic web to blogging. The semantic web promises to make the web more useful by endowing metadata with machine processable semantics. Blogging is a lightweight web publishing paradigm which provides a very low barrier to entry, useful syndication and aggregation behaviour, a simple to understand structure and decentralized construction of a rich information network. Semantic blogging builds upon the success and clear network valu...

  2. Interoperable End-to-End Remote Patient Monitoring Platform Based on IEEE 11073 PHD and ZigBee Health Care Profile.

    Science.gov (United States)

    Clarke, Malcolm; de Folter, Joost; Verma, Vivek; Gokalp, Hulya

    2018-05-01

    This paper describes the implementation of an end-to-end remote monitoring platform based on the IEEE 11073 standards for personal health devices (PHD). It provides an overview of the concepts and approaches and describes how the standard has been optimized for small devices with limited processor, memory, and power resources that use short-range wireless technology. It explains aspects of IEEE 11073, including the domain information model, state model, and nomenclature, and how these support its plug-and-play architecture. It shows how these aspects underpin a much larger ecosystem of interoperable devices and systems that includes IHE PCD-01, HL7, and Bluetooth LE medical devices, and describes the relationship to the Continua Guidelines, advocating the adoption of data standards and nomenclature to support semantic interoperability between health and ambient assisted living in future platforms. The paper further describes the adaptations made in order to implement the standard on the ZigBee Health Care Profile and the experiences of implementing an end-to-end platform that has been deployed to frail elderly patients with chronic disease(s) and patients with diabetes.

  3. Cloud portability and interoperability issues and current trends

    CERN Document Server

    Di Martino, Beniamino; Esposito, Antonio

    2015-01-01

    This book offers readers a quick, comprehensive and up-to-date overview of the most important methodologies, technologies, APIs and standards related to the portability and interoperability of cloud applications and services, illustrated by a number of use cases representing a variety of interoperability and portability scenarios. The lack of portability and interoperability between cloud platforms at different service levels is the main issue affecting cloud-based services today. The brokering, negotiation, management, monitoring and reconfiguration of cloud resources are challenging tasks

  4. Biomedical semantics in the Semantic Web.

    Science.gov (United States)

    Splendiani, Andrea; Burger, Albert; Paschke, Adrian; Romano, Paolo; Marshall, M Scott

    2011-03-07

    The Semantic Web offers an ideal platform for representing and linking biomedical information, which is a prerequisite for the development and application of analytical tools to address problems in data-intensive areas such as systems biology and translational medicine. As for any new paradigm, the adoption of the Semantic Web offers opportunities and poses questions and challenges to the life sciences scientific community: which technologies in the Semantic Web stack will be more beneficial for the life sciences? Is biomedical information too complex to benefit from simple interlinked representations? What are the implications of adopting a new paradigm for knowledge representation? What are the incentives for the adoption of the Semantic Web, and who are the facilitators? Is there going to be a Semantic Web revolution in the life sciences? We report here a few reflections on these questions, following discussions at the SWAT4LS (Semantic Web Applications and Tools for Life Sciences) workshop series, of which this Journal of Biomedical Semantics special issue presents selected papers from the 2009 edition, held in Amsterdam on November 20th.

  5. PERSPECTIVES ON INTEROPERABILITY INTEGRATION WITHIN NATO DEFENSE PLANNING PROCESS

    Directory of Open Access Journals (Sweden)

    Florian CIOCAN

    2011-01-01

    Full Text Available Interoperability is not a new area of effort at NATO level. In fact, interoperability, and more specifically standardization, has been a key element of the Alliance’s approach to fielding forces for decades. But as the security and operational environment has been in continuous change, the need to face new threats and the current involvement in challenging operations in Afghanistan and elsewhere, along with the necessity to interoperate at lower and lower levels of command with an increasing number of nations, including non-NATO ISAF partners, NGOs, and other organizations, have made the task even more challenging. In this respect, Interoperability Integration within the NATO Defense Planning Process will facilitate the timely identification, development and delivery of required forces and capabilities that are interoperable and adequately prepared, equipped, trained and supported to undertake the Alliance’s full spectrum of missions.

  6. Semantics-driven modelling of user preferences for information retrieval in the biomedical domain.

    Science.gov (United States)

    Gladun, Anatoly; Rogushina, Julia; Valencia-García, Rafael; Béjar, Rodrigo Martínez

    2013-03-01

    A large amount of biomedical and genomic data are currently available on the Internet. However, data are distributed into heterogeneous biological information sources, with little or even no organization. Semantic technologies provide a consistent and reliable basis with which to confront the challenges involved in the organization, manipulation and visualization of data and knowledge. One of the knowledge representation techniques used in semantic processing is the ontology, which is commonly defined as a formal and explicit specification of a shared conceptualization of a domain of interest. The work presented here introduces a set of interoperable algorithms that can use domain and ontological information to improve information-retrieval processes. This work presents an ontology-based information-retrieval system for the biomedical domain. This system, with which some experiments have been carried out that are described in this paper, is based on the use of domain ontologies for the creation and normalization of lightweight ontologies that represent user preferences in a determined domain in order to improve information-retrieval processes.

  7. Defining inter-cloud architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.; de Laat, C.; Zimmermann, W.; Lee, Y.W.; Demchenko, Y.

    2012-01-01

    This paper presents an on-going research to develop the Inter-Cloud Architecture, which addresses the architectural problems in multi-provider multi-domain heterogeneous cloud based applications integration and interoperability, including integration and interoperability with legacy infrastructure

  8. Defining Inter-Cloud Architecture for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.J.; Laat, C. de

    2012-01-01

    This paper presents on-going research to develop the Inter-Cloud Architecture that should address problems in multi-provider multi-domain heterogeneous Cloud based applications integration and interoperability, including integration and interoperability with legacy infrastructure services. Cloud

  9. Latest developments for the IAGOS database: Interoperability and metadata

    Science.gov (United States)

    Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Schultz, Martin; van Velthoven, Peter; Broetz, Bjoern; Rauthe-Schöch, Armin; Brissebrat, Guillaume

    2014-05-01

    In-service Aircraft for a Global Observing System (IAGOS, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open access policy based on the submission of research requests, which are reviewed by the PIs. Users can access the data through the following web sites: http://www.iagos.fr or http://www.pole-ether.fr as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The database is in continuous development and improvement. In the framework of the IGAS project (IAGOS for GMES/COPERNICUS Atmospheric Service), major achievements will be reached, such as metadata and format standardisation in order to interoperate with international portals and other databases, QA/QC procedures and traceability, CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data integration within the central database, and real-time data transmission. IGAS work package 2 aims at providing the IAGOS data to users in a standardized format including the necessary metadata and information on data processing, data quality and uncertainties. We are currently redefining and standardizing the IAGOS metadata for interoperable use within GMES/Copernicus. The metadata are compliant with the ISO 19115, INSPIRE and NetCDF-CF conventions. IAGOS data will be provided to users in NetCDF or NASA Ames format. We are also implementing interoperability between all the involved IAGOS data services, including the central IAGOS database, the former MOZAIC and CARIBIC databases, the Aircraft Research DLR database and the Jülich WCS web application JOIN (Jülich OWS Interface), which combines model outputs with in situ data for
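    The kind of metadata standardisation the abstract describes can be illustrated with a minimal compliance check: under the NetCDF-CF conventions, data variables carry attributes such as `units` and `standard_name`. The variables, attribute values, and the checker below are invented for illustration and are not the actual IAGOS QA/QC code.

    ```python
    # Sketch of a CF-style metadata check: flag variables missing required
    # attributes. Variable names and attributes are illustrative only.

    REQUIRED_ATTRS = {"units", "standard_name"}

    variables = {
        "ozone": {"units": "ppb", "standard_name": "mole_fraction_of_ozone_in_air"},
        "co":    {"units": "ppb"},  # missing standard_name
    }

    def cf_violations(vars_):
        """Return {variable: sorted missing attributes} for non-compliant variables."""
        return {
            name: sorted(REQUIRED_ATTRS - attrs.keys())
            for name, attrs in vars_.items()
            if REQUIRED_ATTRS - attrs.keys()
        }

    print(cf_violations(variables))  # {'co': ['standard_name']}
    ```

    A real implementation would read the attributes from the NetCDF file itself and validate `standard_name` values against the CF standard-name table.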

  10. Preserved musical semantic memory in semantic dementia.

    Science.gov (United States)

    Weinstein, Jessica; Koenig, Phyllis; Gunawardena, Delani; McMillan, Corey; Bonner, Michael; Grossman, Murray

    2011-02-01

    To understand the scope of semantic impairment in semantic dementia. Case study. Academic medical center. A man with semantic dementia, as demonstrated by clinical, neuropsychological, and imaging studies. Music performance and magnetic resonance imaging results. Despite profoundly impaired semantic memory for words and objects due to left temporal lobe atrophy, this semiprofessional musician was creative and expressive in demonstrating preserved musical knowledge. Long-term representations of words and objects in semantic memory may be dissociated from meaningful knowledge in other domains, such as music.

  11. An EarthCube Roadmap for Cross-Domain Interoperability in the Geosciences: Governance Aspects

    Science.gov (United States)

    Zaslavsky, I.; Couch, A.; Richard, S. M.; Valentine, D. W.; Stocks, K.; Murphy, P.; Lehnert, K. A.

    2012-12-01

    The goal of cross-domain interoperability is to enable reuse of data and models outside the original context in which these data and models are collected and used and to facilitate analysis and modeling of physical processes that are not confined to disciplinary or jurisdictional boundaries. A new research initiative of the U.S. National Science Foundation, called EarthCube, is developing a roadmap to address challenges of interoperability in the earth sciences and create a blueprint for community-guided cyberinfrastructure accessible to a broad range of geoscience researchers and students. Infrastructure readiness for cross-domain interoperability encompasses the capabilities that need to be in place for such secondary or derivative-use of information to be both scientifically sound and technically feasible. In this initial assessment we consider the following four basic infrastructure components that need to be present to enable cross-domain interoperability in the geosciences: metadata catalogs (at the appropriate community defined granularity) that provide standard discovery services over datasets, data access services, models and other resources of the domain; vocabularies that support unambiguous interpretation of domain resources and metadata; services used to access data repositories and other resources including models, visualizations and workflows; and formal information models that define structure and semantics of the information returned on service requests. General standards for these components have been proposed; they form the backbone of large scale integration activities in the geosciences. By utilizing these standards, EarthCube research designs can take advantage of data discovery across disciplines using the commonality in key data characteristics related to shared models of spatial features, time measurements, and observations. 
Data can be discovered via federated catalogs and linked nomenclatures from neighboring domains, while standard data

  12. Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101

    Science.gov (United States)

    2012-07-01

    Some capabilities are not specified directly as interoperability, although they are supported by some interoperability attributes. For example, stair climbing: stair climbing is not something that IOPs need to specify; however, the mobility- and actuation-related interoperable messages can be used to provide stair climbing. Interoperability can also enable management of different poses or modes, one of which may be stair climbing.

  13. Investigation of Automated Terminal Interoperability Test

    OpenAIRE

    Brammer, Niklas

    2008-01-01

    In order to develop and secure the functionality of its cellular communications systems, Ericsson deals with numerous R&D and I&V activities. One important aspect is interoperability with mobile terminals from different vendors on the world market. Therefore Ericsson co-operates with mobile platform and user equipment manufacturers. These companies visit the interoperability developmental testing (IoDT) laboratories in Linköping to test their developmental products and prototypes in o...

  14. Smart Grid Interoperability Maturity Model Beta Version

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Drummond, R.; Giroti, Tony; Houseman, Doug; Knight, Mark; Levinson, Alex; Longcore, Wayne; Lowe, Randy; Mater, J.; Oliver, Terry V.; Slack, Phil; Tolk, Andreas; Montgomery, Austin

    2011-12-02

    The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.

  15. Toward an Interoperability Architecture

    National Research Council Canada - National Science Library

    Buddenberg, Rex

    2001-01-01

    .... The continued burgeoning of the Internet constitutes an existence proof. But a common networking base is insufficient to reach a goal of cross-system interoperability - the large information system...

  16. SPARK: Adapting Keyword Query to Semantic Search

    Science.gov (United States)

    Zhou, Qi; Wang, Chong; Xiong, Miao; Wang, Haofen; Yu, Yong

    Semantic search promises to provide more accurate results than present-day keyword search. However, progress with semantic search has been delayed due to the complexity of its query languages. In this paper, we explore a novel approach of adapting keywords to querying the semantic web: the approach automatically translates keyword queries into formal logic queries so that end users can use familiar keywords to perform semantic search. A prototype system named 'SPARK' has been implemented in light of this approach. Given a keyword query, SPARK outputs a ranked list of SPARQL queries as the translation result. The translation in SPARK consists of three major steps: term mapping, query graph construction and query ranking. Specifically, a probabilistic query ranking model is proposed to select the most likely SPARQL query. In the experiment, SPARK achieved an encouraging translation result.
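    The three translation steps named in the abstract (term mapping, query graph construction, query ranking) can be sketched in miniature. The vocabulary, the `ex:` terms, and the coverage-based score below are invented for illustration; SPARK's actual mapping is ontology-backed and its ranking model is probabilistic.

    ```python
    # Toy keyword-to-SPARQL translation in the style SPARK describes.
    # TERM_MAP, the ex: vocabulary, and the scoring rule are hypothetical.

    TERM_MAP = {  # keyword -> (kind, ontology term)
        "protein":   ("class",    "ex:Protein"),
        "kinase":    ("class",    "ex:Kinase"),
        "interacts": ("property", "ex:interactsWith"),
    }

    def translate(keywords):
        """Return a (SPARQL query string, score) pair for a keyword query."""
        # Step 1: term mapping.
        mapped = [TERM_MAP[k] for k in keywords if k in TERM_MAP]
        classes = [t for kind, t in mapped if kind == "class"]
        props = [t for kind, t in mapped if kind == "property"]

        # Step 2: query graph construction -- one variable per class,
        # joined by the first mapped property.
        patterns = [f"?v{i} a {c} ." for i, c in enumerate(classes)]
        if props and len(classes) >= 2:
            patterns.append(f"?v0 {props[0]} ?v1 .")
        query = "SELECT * WHERE { " + " ".join(patterns) + " }"

        # Step 3: query ranking -- toy score: fraction of keywords mapped.
        score = len(mapped) / len(keywords)
        return query, score

    query, score = translate(["protein", "interacts", "kinase"])
    print(score)   # 1.0 -- every keyword was mapped
    print(query)
    ```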

  17. Do semantic standards lack quality? A survey among 34 semantic standards

    NARCIS (Netherlands)

    Folmer, E.J.A.; Oude Luttighuis, P.H.W.M.; Hillegersberg, J. van

    2011-01-01

    The adoption of standards to improve interoperability in the automotive, aerospace, shipbuilding and other sectors could save billions. While interoperability standards have been created for a number of industries, problems persist, suggesting a lack of quality of the standards themselves. The issue

  18. Do semantic standards lack quality? A survey among 34 semantic standards.

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert; Oude Luttighuis, Paul; van Hillegersberg, Jos

    2011-01-01

    The adoption of standards to improve interoperability in the automotive, aerospace, shipbuilding and other sectors could save billions. While interoperability standards have been created for a number of industries, problems persist, suggesting a lack of quality of the standards themselves. The issue

  19. Interactive test tool for interoperable C-ITS development

    NARCIS (Netherlands)

    Voronov, A.; Englund, C.; Bengtsson, H.H.; Chen, L.; Ploeg, J.; Jongh, J.F.C.M. de; Sluis, H.J.D. van de

    2015-01-01

    This paper presents the architecture of an Interactive Test Tool (ITT) for interoperability testing of Cooperative Intelligent Transport Systems (C-ITS). Cooperative systems are developed by different manufacturers at different locations, which makes interoperability testing a tedious task. Up until

  20. Characterizing semantic mappings adaptation via biomedical KOS evolution: a case study investigating SNOMED CT and ICD.

    Science.gov (United States)

    Dos Reis, Julio Cesar; Pruski, Cédric; Da Silveira, Marcos; Reynaud-Delaître, Chantal

    2013-01-01

    Mappings established between Knowledge Organization Systems (KOS) increase semantic interoperability between biomedical information systems. However, biomedical knowledge is highly dynamic, and changes affecting KOS entities can potentially invalidate part or the totality of existing mappings. Understanding how mappings evolve and what the impacts of KOS evolution on mappings are is therefore crucial for the definition of an automatic approach to keep mappings valid and up-to-date over time. In this article, we study variations of a specific KOS complex change (split) for two biomedical KOS (SNOMED CT and ICD-9-CM) through a rigorous method of investigation for identifying and refining complex changes, and for selecting representative cases. We empirically analyze and explain their influence on the evolution of associated mappings. Results point out the importance of considering various dimensions of the information described in KOS, like the semantic structure of concepts, the set of relevant information used to define the mappings and the change operations interfering with this set of information.
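    The effect of a split change on mappings can be illustrated with a small sketch: when a target concept splits, every mapping pointing at it becomes a set of candidate mappings to the resulting concepts, flagged for review. The concept identifiers and the revision rule below are invented for the example and do not reproduce the paper's method.

    ```python
    # Sketch: propagating a KOS 'split' change into an existing mapping set.
    # Identifiers and the needs-review policy are illustrative assumptions.

    mappings = [
        ("icd:250", "sct:73211009"),   # targets the concept that will split
        ("icd:401", "sct:38341003"),
    ]

    def apply_split(mappings, old, new_concepts):
        """Replace mappings to `old` with candidate mappings to each new
        concept, flagged for manual review; leave other mappings valid."""
        revised = []
        for src, tgt in mappings:
            if tgt == old:
                revised.extend((src, n, "needs-review") for n in new_concepts)
            else:
                revised.append((src, tgt, "valid"))
        return revised

    out = apply_split(mappings, "sct:73211009", ["sct:44054006", "sct:46635009"])
    for row in out:
        print(row)
    ```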

  1. An open repositories network development for medical teaching resources.

    Science.gov (United States)

    Soula, Gérard; Darmoni, Stefan; Le Beux, Pierre; Renard, Jean-Marie; Dahamna, Badisse; Fieschi, Marius

    2010-01-01

    The lack of interoperability between repositories of heterogeneous and geographically widespread data is an obstacle to the diffusion, sharing and reutilization of those data. We present the development of an open repositories network taking into account both the syntactic and semantic interoperability of the different repositories and based on international standards in this field. The network is used by the medical community in France for the diffusion and sharing of digital teaching resources. The syntactic interoperability of the repositories is managed using the OAI-PMH protocol for the exchange of metadata describing the resources. Semantic interoperability is based, on one hand, on the LOM standard for the description of resources and on MESH for the indexing of the latter and, on the other hand, on semantic interoperability management designed to optimize compliance with standards and the quality of the metadata.
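    The syntactic side of this design, OAI-PMH metadata exchange, can be sketched from the harvester's point of view: fetch a `ListRecords` response and extract the Dublin Core fields. The sample response and the record title below are invented; a real harvester would issue an HTTP request such as `…/oai?verb=ListRecords&metadataPrefix=oai_dc` against an actual repository.

    ```python
    # Minimal sketch of parsing an OAI-PMH ListRecords response with the
    # standard oai_dc metadata format. The sample XML is fabricated.

    import xml.etree.ElementTree as ET

    NS = {
        "oai": "http://www.openarchives.org/OAI/2.0/",
        "dc":  "http://purl.org/dc/elements/1.1/",
    }

    SAMPLE_RESPONSE = """<?xml version="1.0"?>
    <OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
      <ListRecords>
        <record>
          <metadata>
            <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                       xmlns:dc="http://purl.org/dc/elements/1.1/">
              <dc:title>Anatomy course, module 1</dc:title>
            </oai_dc:dc>
          </metadata>
        </record>
      </ListRecords>
    </OAI-PMH>"""

    def harvest_titles(xml_text):
        """Extract all Dublin Core titles from a ListRecords response."""
        root = ET.fromstring(xml_text)
        return [t.text for t in root.findall(".//dc:title", NS)]

    print(harvest_titles(SAMPLE_RESPONSE))  # ['Anatomy course, module 1']
    ```

    Semantic interoperability then comes from what the metadata contains (LOM descriptions, MeSH indexing), not from the transport protocol itself.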

  2. Telemedicine system interoperability architecture: concept description and architecture overview.

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard Layne, II

    2004-05-01

    In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.

  3. Formal Semantics and Implementation of BPMN 2.0 Inclusive Gateways

    Science.gov (United States)

    Christiansen, David Raymond; Carbone, Marco; Hildebrandt, Thomas

    We present the first direct formalization of the semantics of inclusive gateways as described in the Business Process Modeling Notation (BPMN) 2.0 Beta 1 specification. The formal semantics is given for a minimal subset of BPMN 2.0 containing just the inclusive and exclusive gateways and the start and stop events. By focusing on this subset we achieve a simple graph model that highlights the particular non-local features of the inclusive gateway semantics. We sketch two ways of implementing the semantics using algorithms based on incrementally updated data structures and also discuss distributed communication-based implementations of the two algorithms.
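    The non-local behaviour of the inclusive join can be sketched as follows: the gateway may fire only when at least one incoming branch carries a token and no token elsewhere in the graph can still reach an empty incoming branch. The graph encoding (tokens sitting on nodes rather than edges), the function names, and the example process are simplifying assumptions for illustration, not the paper's formal semantics.

    ```python
    # Sketch of the inclusive-join enabling rule for BPMN 2.0.
    # Encoding is a toy: a process is a dict of directed edges, and tokens
    # are a set of node positions.

    from collections import deque

    def reachable(edges, start):
        """All nodes reachable from `start` along directed edges."""
        seen, queue = {start}, deque([start])
        while queue:
            for nxt in edges.get(queue.popleft(), []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

    def inclusive_join_enabled(edges, incoming, tokens):
        """incoming: nodes feeding the join; tokens: positions of live tokens."""
        filled = incoming & tokens
        empty = incoming - tokens
        if not filled:
            return False
        # Non-local check: no other token may still reach an empty branch.
        others = tokens - incoming
        return not any(e in reachable(edges, t) for t in others for e in empty)

    edges = {"start": ["a", "b"], "a": ["join"], "b": ["join"]}
    incoming = {"a", "b"}
    print(inclusive_join_enabled(edges, incoming, {"a", "b"}))      # True
    print(inclusive_join_enabled(edges, incoming, {"a", "start"}))  # False: a token may still arrive on b
    ```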

  4. Semantic Advertising

    OpenAIRE

    Zamanzadeh, Ben; Ashish, Naveen; Ramakrishnan, Cartic; Zimmerman, John

    2013-01-01

    We present the concept of Semantic Advertising which we see as the future of online advertising. Semantic Advertising is online advertising powered by semantic technology which essentially enables us to represent and reason with concepts and the meaning of things. This paper aims to 1) Define semantic advertising, 2) Place it in the context of broader and more widely used concepts such as the Semantic Web and Semantic Search, 3) Provide a survey of work in related areas such as context matchi...

  5. Architectures for the Development of the National Interoperability Framework in Romania

    Directory of Open Access Journals (Sweden)

    Codrin-Florentin NISIOIU

    2015-10-01

    Full Text Available The authors of the Digital Agenda consider that Europe does not take full advantage of interoperability. They believe that we need effective interoperability between IT products and services to build a truly Digital Society. The Digital Agenda can only be effective if all the elements and applications are interoperable and based on open standards and platforms. In this context, I propose in this article a specific architecture for developing the Romanian National Interoperability Framework.

  6. Intelligent Discovery for Learning Objects Using Semantic Web Technologies

    Science.gov (United States)

    Hsu, I-Ching

    2012-01-01

    The concept of learning objects has been applied in the e-learning field to promote the accessibility, reusability, and interoperability of learning content. Learning Object Metadata (LOM) was developed to achieve these goals by describing learning objects in order to provide meaningful metadata. Unfortunately, the conventional LOM lacks the…

  7. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation

    Directory of Open Access Journals (Sweden)

    Wilkinson Mark D

    2011-10-01

    Full Text Available Abstract Background The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. 
Finally, we show that, using SADI, data dynamically generated from Web services

  8. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation

    Science.gov (United States)

    2011-01-01

    Background The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. 
Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner

  9. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation.

    Science.gov (United States)

    Wilkinson, Mark D; Vandervalk, Benjamin; McCarthy, Luke

    2011-10-24

    The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner very similar to data housed in
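    The automatic discovery-and-chaining behaviour that SADI enables can be illustrated in miniature: each service declares the OWL class it consumes and the class it produces, so a client can assemble a workflow by type-matching alone. The service names, `ex:` classes, and the greedy planner below are invented; real SADI services exchange RDF and advertise their classes in service descriptions, not Python dicts.

    ```python
    # Toy sketch of type-driven service chaining in the SADI style.
    # Services and class names are hypothetical.

    SERVICES = [
        {"name": "gene2protein",    "consumes": "ex:Gene",    "produces": "ex:Protein"},
        {"name": "protein2pathway", "consumes": "ex:Protein", "produces": "ex:Pathway"},
    ]

    def plan_workflow(start_class, goal_class, services):
        """Greedily chain services whose declared input class matches the
        current data class. No cycle handling -- this is a sketch."""
        chain, current = [], start_class
        while current != goal_class:
            nxt = next((s for s in services if s["consumes"] == current), None)
            if nxt is None:
                raise ValueError(f"no service consumes {current}")
            chain.append(nxt["name"])
            current = nxt["produces"]
        return chain

    print(plan_workflow("ex:Gene", "ex:Pathway", SERVICES))
    # ['gene2protein', 'protein2pathway']
    ```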

  10. Data Modeling Challenges of Advanced Interoperability.

    Science.gov (United States)

    Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka

    2018-01-01

    Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This results in special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations are discussed. Amendments to correctly use those models and to better meet the aforementioned challenges are offered.

  11. Promoting Interoperability: The Case for Discipline-Specific PSAPS

    Science.gov (United States)

    2014-12-01

    multijurisdictional, interoperability is a key factor for success. Responses to 9/11, the Oso mudslides in Washington, the Boston Marathon bombing... Interoperability Continuum. 2. Functional Interoperability: As demonstrated by the 9/11 attacks, the Oso mudslide in Washington, the Boston Marathon bombing, and other large

  12. An Architecture Based on Linked Data Technologies for the Integration and Reuse of OER in MOOCs Context

    Science.gov (United States)

    Piedra, Nelson; Chicaiza, Janneth Alexandra; López, Jorge; Tovar, Edmundo

    2014-01-01

    The Linked Data initiative is considered one of the most effective alternatives for creating global shared information spaces; it has become an interesting approach for discovering and enriching open educational resources data, as well as for achieving semantic interoperability and re-use between multiple OER repositories. The notion of Linked Data…

  13. A web services choreography scenario for interoperating bioinformatics applications

    Directory of Open Access Journals (Sweden)

    Cheung David W

    2004-03-01

    Full Text Available Abstract Background Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: (1) the platforms on which the applications run are heterogeneous, (2) their web interfaces are not machine-friendly, (3) they use non-standard formats for data input and output, (4) they do not exploit standards to define application interfaces and message exchange, and (5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. Results To demonstrate the benefit of using web services over traditional web interfaces, we compare the two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH keywords that correlates to the input, grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard-coded Java application, the Collaxa BPEL Server, and the Taverna Workbench.
The Java program functions as a web services engine and interoperates
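
    The contrast the abstract draws between human-readable HTML and machine-processable XML output can be illustrated with a short sketch. The element names and scores below are illustrative, not the actual HAPI message schema:

```python
import xml.etree.ElementTree as ET

# A document-style XML message such as the workflow described above might
# return: MeSH keywords grouped by category (element names are hypothetical).
xml_response = """
<hapiResult>
  <category name="Diseases">
    <keyword score="0.92">Neoplasms</keyword>
    <keyword score="0.81">Carcinoma</keyword>
  </category>
  <category name="Chemicals and Drugs">
    <keyword score="0.77">Antineoplastic Agents</keyword>
  </category>
</hapiResult>
"""

def keywords_by_category(xml_text):
    """Parse the XML message into a plain dict, something an HTML page
    would only allow through brittle screen-scraping."""
    root = ET.fromstring(xml_text)
    result = {}
    for cat in root.findall("category"):
        result[cat.get("name")] = [(kw.text, float(kw.get("score")))
                                   for kw in cat.findall("keyword")]
    return result

parsed = keywords_by_category(xml_response)
print(parsed["Diseases"][0])  # ('Neoplasms', 0.92)
```

    Because the structure is explicit, a downstream service in the workflow can consume this output without any knowledge of how it would be rendered for humans.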

  14. Semantator: annotating clinical narratives with semantic web ontologies.

    Science.gov (United States)

    Song, Dezhao; Chute, Christopher G; Tao, Cui

    2012-01-01

    To facilitate clinical research, clinical data need to be stored in a machine-processable and understandable way. Manually annotating clinical data is time-consuming. Automatic approaches (e.g., Natural Language Processing systems) have been adopted to convert such data into structured formats; however, the quality of such automatically extracted data may not always be satisfactory. In this paper, we propose Semantator, a semi-automatic tool for document annotation with Semantic Web ontologies. With a loaded free-text document and an ontology, Semantator supports the creation/deletion of ontology instances for any document fragment, linking/disconnecting instances with the properties in the ontology, and also enables automatic annotation by connecting to the NCBO Annotator and cTAKES. By representing annotations in Semantic Web standards, Semantator supports reasoning based upon the underlying semantics of the owl:disjointWith and owl:equivalentClass predicates. We present discussions based on user experiences of using Semantator.
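
    The kind of disjointness reasoning described above can be sketched without an OWL reasoner, using plain tuples as stand-ins for RDF triples. The class names and the disjointness axiom below are illustrative assumptions, not Semantator internals:

```python
# Hypothetical axiom: the ontology declares Medication owl:disjointWith Diagnosis.
DISJOINT = {("Medication", "Diagnosis"), ("Diagnosis", "Medication")}

annotations = [
    ("fragment_1", "rdf:type", "Diagnosis"),
    ("fragment_1", "rdf:type", "Medication"),   # conflicting annotation
    ("fragment_2", "rdf:type", "Medication"),
]

def inconsistent_instances(triples, disjoint_pairs):
    """Return instances annotated with two classes declared disjoint."""
    types = {}
    for subj, pred, obj in triples:
        if pred == "rdf:type":
            types.setdefault(subj, set()).add(obj)
    flagged = set()
    for subj, classes in types.items():
        for a in classes:
            for b in classes:
                if (a, b) in disjoint_pairs:
                    flagged.add(subj)
    return flagged

print(inconsistent_instances(annotations, DISJOINT))  # {'fragment_1'}
```

    A real reasoner would derive such conflicts from the full ontology; the sketch only shows why encoding annotations in a semantic standard makes this check possible at all.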

  15. EarthCube - Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks

    Science.gov (United States)

    Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.

    2014-12-01

    In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. (2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a
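
    The Basic Model Interface mentioned above defines a small set of control and query functions that any framework can call. A simplified sketch follows; ToyModel and the reduced function set are assumptions for illustration, not the full BMI specification:

```python
class ToyModel:
    """A hypothetical native model with its own calling conventions."""
    def __init__(self):
        self.t = 0.0
        self.temperature = 15.0

    def step(self, dt):
        self.t += dt
        self.temperature += 0.1 * dt


class BmiAdapter:
    """Framework-agnostic wrapper: the framework only ever calls these
    methods, never the model's native API (real BMI defines many more,
    e.g. grid metadata and variable units)."""
    def initialize(self, config=None):
        self._model = ToyModel()

    def update(self):
        self._model.step(1.0)

    def get_current_time(self):
        return self._model.t

    def get_value(self, name):
        return getattr(self._model, name)

    def finalize(self):
        self._model = None


bmi = BmiAdapter()
bmi.initialize()
for _ in range(5):
    bmi.update()
print(bmi.get_current_time())                   # 5.0
print(round(bmi.get_value("temperature"), 1))   # 15.5
```

    Because every component exposes the same adapter surface, a coupling framework can advance heterogeneous models in lockstep and exchange their variables without knowing their internals.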

  16. Interoperability of Demand Response Resources Demonstration in NY

    Energy Technology Data Exchange (ETDEWEB)

    Wellington, Andre

    2014-03-31

    The Interoperability of Demand Response Resources Demonstration in NY (Interoperability Project) was awarded to Con Edison in 2009. The objective of the project was to develop and demonstrate methodologies to enhance the ability of customer sited Demand Response resources to integrate more effectively with electric delivery companies and regional transmission organizations.

  17. Nato Multinational Brigade Interoperability: Issues, Mitigating Solutions and is it Time for a Nato Multinational Brigade Doctrine?

    Directory of Open Access Journals (Sweden)

    Schiller Mark

    2016-06-01

    Full Text Available Multinational Brigade Operations involving NATO and its European Partners are the norm in the post-Cold War Era. Commonplace today are Multinational Brigades, composed of staffs and subordinate units representing almost every NATO Country and Partner, participating in training exercises or actual operations in both the European and Southwest Asian Theatres. Leadership challenges are prevalent for the Multinational Brigade Commander and his staff, especially those challenges they face in achieving an effective level of brigade interoperability in order to conduct successful operations in NATO’s present and future operating environments. The purpose of this paper is twofold: to examine the major interoperability obstacles a multinational brigade commander and his staff are likely to encounter during the planning and execution of brigade operations; and, to recommend actions and measures a multinational brigade commander and his staff can implement to facilitate interoperability in a multinational brigade operating environment. Several key interoperability topics considered integral to effective multinational brigade operations will be examined and analysed, including understanding partner unit capabilities and limitations facilitated by an integration plan, appropriate command and support relationships, compatible communications, synchronized intelligence and information collection, establishing effective liaison, and fratricide prevention. The paper's conclusion will urge for a NATO land brigade doctrine, considering doctrine’s critical importance to effective brigade command and control interoperability and the expected missions a land brigade will encounter in future NATO operating environments as part of the NATO Very High Readiness Joint Task Force (VJTF).

  18. OGC and Grid Interoperability in enviroGRIDS Project

    Science.gov (United States)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid-oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures by providing the basic and extended features of both technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues they introduce (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of heterogeneous distributed geospatial data within a distributed environment, enables the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web services interoperability with the Grid environment and focuses on the description and implementation of the most promising one.
    In these use cases we give special attention to issues such as: the relations between computational grid and
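
    The OGC Web services mentioned above are typically invoked through key-value-pair HTTP requests. A minimal sketch of building a WFS GetFeature request follows; the endpoint and layer name are hypothetical:

```python
from urllib.parse import urlencode

def getfeature_url(endpoint, type_name, max_features=10):
    """Assemble a WFS 1.1.0 GetFeature request URL in key-value-pair form."""
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": type_name,        # qualified feature type name
        "maxFeatures": max_features,  # limit the response size
    }
    return endpoint + "?" + urlencode(params)

url = getfeature_url("http://example.org/wfs", "envirogrids:landcover")
print(url)
```

    Because the interface is standardized, the same request shape works against any conformant server, which is what makes bridging such services into a Grid environment tractable.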

  19. The Influence of Information Systems Interoperability on Economic Activity in Poland

    Directory of Open Access Journals (Sweden)

    Ganczar Małgorzata

    2017-12-01

    Full Text Available In the text, I discuss the abilities and challenges of information systems interoperability. The anticipated and expected result of interoperability is to improve the provision of public utility services to citizens and companies by facilitating the provision of such services on the basis of a “single window” principle and by reducing the costs incurred by public administrations, companies, and citizens, owing to the efficiency of the provision of public utility services. In the article, the conceptual framework of interoperability is elaborated upon. Moreover, information systems and public registers for entrepreneurs in Poland exemplify whether interoperability may be applied and, if so, whether it fulfils its targets to the extent of e-Government services for entrepreneurs.

  20. Gazetteer Brokering through Semantic Mediation

    Science.gov (United States)

    Hobona, G.; Bermudez, L. E.; Brackin, R.

    2013-12-01

    A gazetteer is a geographical directory containing some information regarding places. It provides names, locations and other attributes for places, which may include points of interest (e.g. buildings, oilfields and boreholes) and other features. These features can be published via web services conforming to the Gazetteer Application Profile of the Web Feature Service (WFS) standard of the Open Geospatial Consortium (OGC). Against the backdrop of advances in geophysical surveys, there has been a significant increase in the amount of data referenced to locations. Gazetteer services have played a significant role in facilitating access to such data, including through the provision of specialized queries such as text, spatial and fuzzy search. Recent developments in the OGC have led to advances in gazetteers such as support for multilingualism, diacritics, and querying via advanced spatial constraints (e.g. radial search and nearest-neighbor search). A remaining challenge, however, is that gazetteers produced by different organizations have typically been modeled differently. Inconsistencies between gazetteers produced by different organizations may include naming the same feature in different ways, naming the attributes differently, locating the feature in a different location, and providing fewer or more attributes than the other services. The Gazetteer Application Profile of the WFS is a starting point for addressing such inconsistencies by providing a standardized interface based on rules specified in ISO 19112, the international standard for spatial referencing by geographic identifiers. The profile, however, does not provide rules to deal with semantic inconsistencies. The USGS and NGA commissioned research into the potential for a Single Point of Entry Global Gazetteer (SPEGG). The research was conducted by the Cross Community Interoperability thread of the OGC testbed, referred to as OWS-9.
    The testbed prototyped approaches for brokering gazetteers through use of semantic
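
    The attribute-level mismatch described above can be sketched as a small mediation step: two hypothetical gazetteers name the same properties differently, and a broker rewrites both onto one target schema before answering queries. All names below are illustrative:

```python
# Per-source attribute mappings onto a common schema (illustrative).
MAPPINGS = {
    "gazetteer_a": {"placeName": "name", "lat": "latitude", "lon": "longitude"},
    "gazetteer_b": {"toponym": "name", "y": "latitude", "x": "longitude"},
}

def mediate(source, record):
    """Rewrite a source record into the common schema, dropping
    attributes the broker does not know how to map."""
    mapping = MAPPINGS[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

a = mediate("gazetteer_a", {"placeName": "Ben Nevis", "lat": 56.797, "lon": -5.004})
b = mediate("gazetteer_b", {"toponym": "Ben Nevis", "y": 56.797, "x": -5.004})
print(a == b)  # True: both sources now answer queries in the same terms
```

    Real semantic mediation goes further (reconciling feature names, positions and attribute semantics via ontologies), but the schema-rewriting step is the foundation a broker builds on.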

  1. Lexical-semantic Mapping between Chinese and English Controlled Vocabularies in the Domain of Chinese Art

    Directory of Open Access Journals (Sweden)

    Shu-Jiun Chen

    2015-12-01

    Full Text Available This study conducts a lexical-semantic mapping between the Chinese controlled vocabulary developed by the National Palace Museum (NPM-CV) in Taiwan and an English controlled vocabulary, the Art & Architecture Thesaurus (AAT), developed by the Getty Research Institute in the U.S. and primarily based on Western art. The research question is: in mapping a Chinese controlled vocabulary of Chinese art to a Western-centered art thesaurus, what types of relationships can be identified and what are the issues in mapping? The study’s main findings reveal that only one-third of the NPM-CV terms can be mapped as “exact equivalence” to AAT terms, and three-fifths of the NPM-CV terms have hierarchical relationships (narrower to broader) with some AAT terms. Clearly, using AAT alone to index Chinese art collections will lead to insufficient indexing specificity. The study then proposes solutions to improve Chinese-English semantic interoperability for multilingual knowledge organization systems in the domain of Chinese art. [Article content in Chinese]
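
    The mapping relationships the study identifies correspond to SKOS-style match types (exactMatch for "exact equivalence", broadMatch for narrower-to-broader). A sketch with made-up term pairs, not actual NPM-CV/AAT mappings:

```python
# Each mapping: (source term, target term, SKOS-style match type).
# The pairs are illustrative examples, not real vocabulary entries.
mappings = [
    ("青花瓷", "blue-and-white ware", "exactMatch"),
    ("汝窯",   "ceramics",            "broadMatch"),   # narrower-to-broader
    ("篆書",   "seal script",         "exactMatch"),
]

def by_match_type(pairs):
    """Group mappings by relationship type, as a mapping report would."""
    grouped = {}
    for source, target, rel in pairs:
        grouped.setdefault(rel, []).append((source, target))
    return grouped

grouped = by_match_type(mappings)
print(len(grouped["exactMatch"]))  # 2
```

    Recording the match type, rather than forcing every term into an exact equivalence, is what lets an indexing system fall back to a broader AAT term without pretending the concepts are identical.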

  2. Patterns in Standards and Technologies for Economic Information Systems Interoperability

    Directory of Open Access Journals (Sweden)

    Vasile Irimia

    2012-06-01

    Full Text Available This paper presents results from a review of the current standards used for collaboration between economic information systems, including web services and service-oriented architecture, EDI, the ebXML framework, the RosettaNet framework, cXML, xCBL, UBL, BPMN, BPEL, WS-CDL, ASN.1, and others. Standards have a key role in promoting economic information system interoperability, and thus enable collaboration. Analyzing the current standards, technologies and applications used for economic information systems interoperability has revealed a common pattern that runs through all of them. From this pattern we construct a basic model of interoperability against which we relate and judge all standards, technologies and applications for economic information systems interoperability.

  3. The next generation of interoperability agents in healthcare.

    Science.gov (United States)

    Cardoso, Luciana; Marins, Fernando; Portela, Filipe; Santos, Manuel; Abelha, António; Machado, José

    2014-05-16

    Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA.
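
    The agent-monitoring idea described above can be sketched as a heartbeat check: each agent reports a last-seen timestamp, and the monitor flags agents that have gone quiet. Agent names and the timeout are illustrative, not AIDA internals:

```python
import time

def stale_agents(heartbeats, now, timeout=30.0):
    """Return agents whose last heartbeat is older than `timeout` seconds,
    so problems can be anticipated before an interoperability function fails."""
    return sorted(agent for agent, last in heartbeats.items()
                  if now - last > timeout)

now = time.time()
heartbeats = {
    "hl7_listener":  now - 5.0,    # healthy: reported 5 s ago
    "archive_agent": now - 120.0,  # silent for two minutes
}
print(stale_agents(heartbeats, now))  # ['archive_agent']
```

    A real platform would act on this signal (restart the agent, alert an operator), but the detection step is this simple comparison repeated on a schedule.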

  4. The Next Generation of Interoperability Agents in Healthcare

    Directory of Open Access Journals (Sweden)

    Luciana Cardoso

    2014-05-01

    Full Text Available Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA.

  5. Semantic content-based recommendations using semantic graphs.

    Science.gov (United States)

    Guo, Weisen; Kraines, Steven B

    2010-01-01

    Recommender systems (RSs) can be useful for suggesting items that might be of interest to specific users. Most existing content-based recommendation (CBR) systems are designed to recommend items based on text content, and the items in these systems are usually described with keywords. However, similarity evaluations based on keywords suffer from the ambiguity of natural languages. We present a semantic CBR method that uses Semantic Web technologies to recommend items that are semantically more similar to the items that the user prefers. We use semantic graphs to represent the items, and we calculate the similarity scores for each pair of semantic graphs using an inverse graph frequency algorithm. The items having higher similarity scores to the items known to be preferred by the user are recommended.
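
    The inverse-graph-frequency idea can be sketched in the spirit of IDF: items are semantic graphs (sets of triples), and shared triples are weighted by how rare they are across the collection, so common statements count for less. The graphs and weighting below are an illustrative analogue, not the paper's exact algorithm:

```python
import math

# Toy collection: each item is a set of (subject, predicate, object) triples.
graphs = {
    "doc1": {("statins", "treat", "hyperlipidemia"), ("statins", "isa", "drug")},
    "doc2": {("statins", "treat", "hyperlipidemia"),
             ("statins", "reduce", "cholesterol")},
    "doc3": {("aspirin", "isa", "drug")},
}

def igf(triple, graphs):
    """Inverse graph frequency: rarer triples get higher weight."""
    n = sum(1 for g in graphs.values() if triple in g)
    return math.log(len(graphs) / n) if n else 0.0

def similarity(g1, g2, graphs):
    """IGF-weighted overlap of two semantic graphs (Jaccard-style)."""
    shared = graphs[g1] & graphs[g2]
    if not shared:
        return 0.0
    union = graphs[g1] | graphs[g2]
    return (sum(igf(t, graphs) for t in shared) /
            sum(igf(t, graphs) for t in union))

print(similarity("doc1", "doc2", graphs))  # > 0: they share a rare triple
print(similarity("doc1", "doc3", graphs))  # 0.0: no shared triples
```

    Items would then be ranked by their similarity to the graphs of items the user is known to prefer.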

  6. Reactive Leadership: Divining, Developing, and Demonstrating Community Ontologies

    Science.gov (United States)

    Graybeal, J.

    2008-12-01

    The Marine Metadata Interoperability Project (known as MMI, on the web at http://marinemetadata.org) was formed to provide leadership in metadata practices to the marine science community. In 2004 this meant finding and writing about resources and best practices, which until then were all but invisible. In 2008 the scope is far wider, encompassing comprehensive guidance, collaborative community environments, and introduction and demonstration of advanced technologies to an increasingly interested scientific domain. MMI's technical leadership, based on experiences gained in the hydrologic community, emphasized the role ontologies could play in marine science. An early MMI workshop successfully incorporated a large number of community vocabularies, tools to harmonize them in a common ontological format, and the mapping of terms from vocabularies expressed in that format. That 2005 workshop demonstrated the connections to be made among different community vocabularies, and was well regarded by participants, but did not lead to widespread adoption of the tools, technologies, or even the vocabularies. Ontology development efforts for marine sensors and platforms showed intermittent progress, but again were not adopted or pushed toward completion. It is now 2008, and the marine community is increasingly attentive to a wide range of interoperability issues. A large part of the community has at least heard of "semantic interoperability", and many understand its critical role in finding and working with data. Demand for specific solutions, and for workable approaches, is becoming more vocal in the marine community. Yet there is still no encompassing model in place for achieving semantic interoperability, only simple operational registries have been set up for oceanographic community vocabularies, and only a few isolated applications demonstrate how semantic barriers can be overcome. Why has progress been so slow? Are good answers on the horizon? 
And if we build it, will the

  7. Standardized Semantic Markup for Reference Terminologies, Thesauri and Coding Systems: Benefits for distributed E-Health Applications.

    Science.gov (United States)

    Hoelzer, Simon; Schweiger, Ralf K; Liu, Raymond; Rudolf, Dirk; Rieger, Joerg; Dudeck, Joachim

    2005-01-01

    With the introduction of the ICD-10 as the standard for diagnosis, the development of an electronic representation of its complete content, inherent semantics and coding rules is necessary. Our concept refers to current efforts of the CEN/TC 251 to establish a European standard for hierarchical classification systems in healthcare. We have developed an electronic representation of the ICD-10 with the Extensible Markup Language (XML) that facilitates its integration into current information systems or coding software, taking into account different languages and versions. In this context, XML offers a complete framework of related technologies and standard processing tools that help to develop interoperable applications.
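
    A hierarchical classification such as ICD-10 maps naturally onto nested XML elements. A minimal sketch of the idea (the element names and nesting are illustrative, not the CEN/TC 251 schema):

```python
import xml.etree.ElementTree as ET

# Build a tiny fragment of a hierarchical classification in XML.
chapter = ET.Element("chapter", code="IX",
                     title="Diseases of the circulatory system")
block = ET.SubElement(chapter, "block", code="I10-I15",
                      title="Hypertensive diseases")
cat = ET.SubElement(block, "category", code="I10")
cat.text = "Essential (primary) hypertension"

def find_code(root, code):
    """Resolve a code by walking the hierarchy, as coding software would."""
    for elem in root.iter():
        if elem.get("code") == code:
            return elem
    return None

hit = find_code(chapter, "I10")
print(hit.text)  # Essential (primary) hypertension
```

    Standard XML tooling (XPath, XSLT, schema validation) then gives coding software language- and version-aware access to the same tree.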

  8. Semantic similarity measure in biomedical domain leverage web search engine.

    Science.gov (United States)

    Chen, Chi-Huang; Hsieh, Sheau-Ling; Weng, Yung-Ching; Chang, Wen-Yung; Lai, Feipei

    2010-01-01

    Semantic similarity measures play an essential role in Information Retrieval and Natural Language Processing. In this paper we propose a page-count-based semantic similarity measure and apply it in biomedical domains. Previous research on semantic-web-related applications has deployed various semantic similarity measures. Despite the usefulness of these measurements in those applications, measuring the semantic similarity between two terms remains a challenging task. The proposed method exploits page counts returned by a Web search engine. We define various similarity scores for two given terms P and Q, using the page counts obtained by querying P, Q, and P AND Q. Moreover, we propose a novel approach to compute semantic similarity using lexico-syntactic patterns with page counts. These different similarity scores are integrated using support vector machines, to leverage the robustness of semantic similarity measures. Experimental results on two datasets achieve correlation coefficients of 0.798 on the dataset provided by A. Hliaoutakis, 0.705 on the dataset provided by T. Pedersen with physician scores, and 0.496 on the dataset provided by T. Pedersen et al. with expert scores.
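
    Two common page-count-based scores of this kind are WebJaccard and a PMI-style score. A sketch with hypothetical hit counts (the terms, counts, and web size N are assumptions for illustration):

```python
import math

N = 10**10  # assumed number of indexed pages

def web_jaccard(p, q, pq):
    """Jaccard-style overlap from page counts for P, Q, and 'P AND Q'."""
    return 0.0 if pq == 0 else pq / (p + q - pq)

def web_pmi(p, q, pq):
    """Pointwise-mutual-information-style score from the same counts."""
    if pq == 0:
        return 0.0
    return math.log2((pq / N) / ((p / N) * (q / N)))

# Hypothetical counts for "hepatitis", "liver", and their conjunction.
p, q, pq = 12_000_000, 90_000_000, 4_500_000
print(round(web_jaccard(p, q, pq), 4))
print(round(web_pmi(p, q, pq), 2))
```

    In the approach described above, several such scores (plus pattern-based features) are fed to a support vector machine rather than used in isolation.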

  9. Leveraging electronic healthcare record standards and semantic web technologies for the identification of patient cohorts

    Science.gov (United States)

    Fernández-Breis, Jesualdo Tomás; Maldonado, José Alberto; Marcos, Mar; Legaz-García, María del Carmen; Moner, David; Torres-Sospedra, Joaquín; Esteban-Gil, Angel; Martínez-Salvador, Begoña; Robles, Montserrat

    2013-01-01

    Background The secondary use of electronic healthcare records (EHRs) often requires the identification of patient cohorts. In this context, an important problem is the heterogeneity of clinical data sources, which can be overcome with the combined use of standardized information models, virtual health records, and semantic technologies, since each of them contributes to solving aspects related to the semantic interoperability of EHR data. Objective To develop methods allowing for a direct use of EHR data for the identification of patient cohorts leveraging current EHR standards and semantic web technologies. Materials and methods We propose to take advantage of the best features of working with EHR standards and ontologies. Our proposal is based on our previous results and experience working with both technological infrastructures. Our main principle is to perform each activity at the abstraction level with the most appropriate technology available. This means that part of the processing will be performed using archetypes (ie, data level) and the rest using ontologies (ie, knowledge level). Our approach will start working with EHR data in proprietary format, which will be first normalized and elaborated using EHR standards and then transformed into a semantic representation, which will be exploited by automated reasoning. Results We have applied our approach to protocols for colorectal cancer screening. The results comprise the archetypes, ontologies, and datasets developed for the standardization and semantic analysis of EHR data. Anonymized real data have been used and the patients have been successfully classified by the risk of developing colorectal cancer. Conclusions This work provides new insights into how archetypes and ontologies can be effectively combined for EHR-driven phenotyping. The methodological approach can be applied to other problems provided that suitable archetypes, ontologies, and classification rules can be designed. PMID:23934950
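
    The final classification step of such a pipeline can be sketched once the EHR data are normalized to a common representation: a declarative rule assigns each patient to a risk group. The fields and thresholds below are toy assumptions, not the study's clinical rules:

```python
# Normalized patient records (fields are illustrative).
patients = [
    {"id": "p1", "age": 62, "family_history_crc": True,  "prior_polyps": False},
    {"id": "p2", "age": 45, "family_history_crc": False, "prior_polyps": False},
    {"id": "p3", "age": 58, "family_history_crc": False, "prior_polyps": True},
]

def crc_risk(p):
    """Toy colorectal-cancer screening rule; NOT a clinical guideline."""
    if p["family_history_crc"] or p["prior_polyps"]:
        return "high"
    if p["age"] >= 50:
        return "average"
    return "low"

cohort = [p["id"] for p in patients if crc_risk(p) == "high"]
print(cohort)  # ['p1', 'p3']
```

    In the approach described above this step is performed by an ontology reasoner over the semantic representation, which keeps the rules declarative and separate from the data normalization.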

  10. Leveraging electronic healthcare record standards and semantic web technologies for the identification of patient cohorts.

    Science.gov (United States)

    Fernández-Breis, Jesualdo Tomás; Maldonado, José Alberto; Marcos, Mar; Legaz-García, María del Carmen; Moner, David; Torres-Sospedra, Joaquín; Esteban-Gil, Angel; Martínez-Salvador, Begoña; Robles, Montserrat

    2013-12-01

    The secondary use of electronic healthcare records (EHRs) often requires the identification of patient cohorts. In this context, an important problem is the heterogeneity of clinical data sources, which can be overcome with the combined use of standardized information models, virtual health records, and semantic technologies, since each of them contributes to solving aspects related to the semantic interoperability of EHR data. To develop methods allowing for a direct use of EHR data for the identification of patient cohorts leveraging current EHR standards and semantic web technologies. We propose to take advantage of the best features of working with EHR standards and ontologies. Our proposal is based on our previous results and experience working with both technological infrastructures. Our main principle is to perform each activity at the abstraction level with the most appropriate technology available. This means that part of the processing will be performed using archetypes (ie, data level) and the rest using ontologies (ie, knowledge level). Our approach will start working with EHR data in proprietary format, which will be first normalized and elaborated using EHR standards and then transformed into a semantic representation, which will be exploited by automated reasoning. We have applied our approach to protocols for colorectal cancer screening. The results comprise the archetypes, ontologies, and datasets developed for the standardization and semantic analysis of EHR data. Anonymized real data have been used and the patients have been successfully classified by the risk of developing colorectal cancer. This work provides new insights into how archetypes and ontologies can be effectively combined for EHR-driven phenotyping. The methodological approach can be applied to other problems provided that suitable archetypes, ontologies, and classification rules can be designed.

  11. Reference architecture for interoperability testing of Electric Vehicle charging

    NARCIS (Netherlands)

    Lehfuss, F.; Nohrer, M.; Werkmany, E.; Lopezz, J.A.; Zabalaz, E.

    2015-01-01

    This paper presents a reference architecture for interoperability testing of electric vehicles as well as their support equipment with the smart grid and the e-Mobility environment. Pan-European Electric Vehicle (EV)-charging is currently problematic as there are compliance and interoperability

  12. Solving the interoperability challenge of a distributed complex patient guidance system: a data integrator based on HL7's Virtual Medical Record standard.

    Science.gov (United States)

    Marcos, Carlos; González-Ferrer, Arturo; Peleg, Mor; Cavero, Carlos

    2015-05-01

    We show how the HL7 Virtual Medical Record (vMR) standard can be used to design and implement a data integrator (DI) component that collects patient information from heterogeneous sources and stores it into a personal health record, from which it can then retrieve data. Our working hypothesis is that the HL7 vMR standard in its release 1 version can properly capture the semantics needed to drive evidence-based clinical decision support systems. To achieve seamless communication between the personal health record and heterogeneous data consumers, we used a three-pronged approach. First, the choice of the HL7 vMR as a message model for all components, accompanied by the use of medical vocabularies, eases their semantic interoperability. Second, the DI follows a service-oriented approach to provide access to system components. Third, an XML database provides the data layer. Results The DI supports requirements of a guideline-based clinical decision support system implemented in two clinical domains and settings, ensuring reliable and secure access, high performance, and simplicity of integration, while complying with standards for the storage and processing of patient information needed for decision support and analytics. This was tested within the framework of a multinational project (www.mobiguide-project.eu) aimed at developing a ubiquitous patient guidance system (PGS). The vMR model with its extension mechanism is demonstrated to be effective for data integration and communication within a distributed PGS implemented for two clinical domains across different healthcare settings in two nations. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
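
    The core of such a data integrator is normalizing heterogeneous source records into one common patient model before consumers see them. A sketch with two hypothetical sources; the field names are illustrative, not the actual vMR element names:

```python
# Two heterogeneous representations of the same blood-pressure reading.
source_a = {"pid": "123", "sys_bp": "142", "dia_bp": "91"}     # EHR export
source_b = {"patient": "123", "bp": "138/88", "unit": "mmHg"}  # device feed

def from_source_a(rec):
    """Normalize an EHR-export record into the common model."""
    return {"patient_id": rec["pid"],
            "systolic": int(rec["sys_bp"]),
            "diastolic": int(rec["dia_bp"])}

def from_source_b(rec):
    """Normalize a device-feed record (compound 'sys/dia' field)."""
    sys_v, dia_v = rec["bp"].split("/")
    return {"patient_id": rec["patient"],
            "systolic": int(sys_v),
            "diastolic": int(dia_v)}

store = [from_source_a(source_a), from_source_b(source_b)]
# Consumers, e.g. a decision-support rule, now query one uniform model.
high_bp = [r["patient_id"] for r in store if r["systolic"] >= 140]
print(high_bp)  # ['123']
```

    Using a standard message model for the common representation (as the vMR provides) is what lets every consumer rely on the same semantics regardless of where a record originated.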

  13. Arabic web pages clustering and annotation using semantic class features

    Directory of Open Access Journals (Sweden)

    Hanan M. Alghamdi

    2014-12-01

Effectively managing the great amount of data on Arabic web pages and enabling the classification of relevant information are important research problems. Studies on sentiment text mining have been very limited in the Arabic language because they require deep semantic processing. In this paper, we therefore aim to retrieve machine-understandable data with the help of a web content mining technique to detect covert knowledge within these data. We propose an approach that achieves clustering with semantic similarities. The approach integrates k-means document clustering with semantic feature extraction and document vectorization to group Arabic web pages according to semantic similarities and then shows the semantic annotation. Document vectorization helps to transform text documents into a semantic class probability distribution or semantic class density. To reach semantic similarities, the approach extracts the semantic class features and integrates them into the similarity weighting schema. The quality of the clustering result was evaluated using the purity and the mean intra-cluster distance (MICD) measures. We evaluated the proposed approach on a set of common Arabic news web pages and obtained favourable clustering results that are effective in minimizing the MICD, increasing the purity and lowering the runtime.
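The pipeline of the abstract, vectorize documents into semantic-class probabilities, cluster with k-means, and score with purity, can be sketched with the standard library alone. The toy two-class vectors and labels below are invented for illustration; the paper's real features come from Arabic semantic class extraction.

```python
import math
import random

def kmeans(vectors, k, iters=20, seed=0):
    """Plain k-means over semantic-class probability vectors."""
    rnd = random.Random(seed)
    centroids = rnd.sample(vectors, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in vectors:
            i = min(range(k), key=lambda c: math.dist(v, centroids[c]))
            clusters[i].append(v)
        # Recompute each centroid as the mean of its members.
        centroids = [
            tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return clusters

def purity(clusters, label_of):
    """Fraction of documents falling in their cluster's majority class."""
    total = sum(len(c) for c in clusters)
    hit = 0
    for c in clusters:
        counts = {}
        for v in c:
            counts[label_of[v]] = counts.get(label_of[v], 0) + 1
        if counts:
            hit += max(counts.values())
    return hit / total

# Toy "semantic class density" vectors for six documents in two topics.
docs = {
    (0.9, 0.1): "sport", (0.8, 0.2): "sport", (0.85, 0.15): "sport",
    (0.1, 0.9): "econ", (0.2, 0.8): "econ", (0.15, 0.85): "econ",
}
clusters = kmeans(list(docs), k=2)
p = purity(clusters, docs)
print(p)
```

With well-separated semantic-class vectors the two topics fall into distinct clusters, giving purity 1.0.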

  14. An Ontological Solution to Support Interoperability in the Textile Industry

    Science.gov (United States)

    Duque, Arantxa; Campos, Cristina; Jiménez-Ruiz, Ernesto; Chalmeta, Ricardo

Significant developments in information and communication technologies and challenging market conditions have forced enterprises to adapt their way of doing business. In this context, providing mechanisms to guarantee interoperability among heterogeneous organisations has become a critical issue. Even though prolific research has already been conducted in the area of enterprise interoperability, we have found that enterprises still struggle to introduce fully interoperable solutions, especially in terms of the development and application of ontologies. Thus, the aim of this paper is to introduce basic ontology concepts in a simple manner and to explain the advantages of the use of ontologies to improve interoperability. We also present a case study showing the implementation of an application ontology for an enterprise in the textile/clothing sector.

  15. Interoperability of Information Systems Managed and Used by the Local Health Departments.

    Science.gov (United States)

    Shah, Gulzar H; Leider, Jonathon P; Luo, Huabin; Kaur, Ravneet

    2016-01-01

    In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide.

  16. Scientific Digital Libraries, Interoperability, and Ontologies

    Science.gov (United States)

    Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.

    2009-01-01

    Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.

  17. Interoperable and standard e-Health solution over Bluetooth.

    Science.gov (United States)

    Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J

    2010-01-01

    The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards seems to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution. This includes ISO/IEEE11073 standard for the interoperability of the medical devices in the patient environment and EN13606 standard for the interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for its further transfer to the healthcare system.

  18. Semantic web technologies for enterprise 2.0

    CERN Document Server

    Passant, A

    2010-01-01

In this book, we detail different theories, methods and implementations combining Web 2.0 paradigms and Semantic Web technologies in enterprise environments. After introducing those terms, we present the current shortcomings of tools such as blogs and wikis, as well as tagging practices, in an Enterprise 2.0 context. We define the SemSLATES methodology and the global vision of a middleware architecture based on Semantic Web technologies and Linked Data principles (languages, models, tools and protocols) to solve these issues. Then, we detail the various ontologies that we built to achieve this goal.

  19. NetCDF-CF-OPeNDAP: Standards for ocean data interoperability and object lessons for community data standards processes

    Science.gov (United States)

    Hankin, Steven C.; Blower, Jon D.; Carval, Thierry; Casey, Kenneth S.; Donlon, Craig; Lauret, Olivier; Loubrieu, Thomas; Srinivasan, Ashwanth; Trinanes, Joaquin; Godøy, Øystein; Mendelssohn, Roy; Signell, Richard P.; de La Beaujardiere, Jeff; Cornillon, Peter; Blanc, Frederique; Rew, Russ; Harlan, Jack; Hall, Julie; Harrison, D.E.; Stammer, Detlef

    2010-01-01

It is generally recognized that meeting society's emerging environmental science and management needs will require the marine data community to provide simpler, more effective and more interoperable access to its data. There is broad agreement, as well, that data standards are the bedrock upon which interoperability will be built. The path that would bring the marine data community to agree upon and utilize such standards, however, is often elusive. In this paper we examine the trio of standards: 1) netCDF files; 2) the Climate and Forecast (CF) metadata convention; and 3) the OPeNDAP data access protocol. These standards taken together have brought our community a high level of interoperability for "gridded" data such as model outputs, satellite products and climatological analyses, and they are gaining rapid acceptance for ocean observations. We provide an overview of the scope of the contribution that has been made. We then step back from the information technology considerations to examine the community or "social" process by which the successes were achieved. We contrast this with the path by which the World Meteorological Organization (WMO) has advanced the Global Telecommunication System (GTS): netCDF/CF/OPeNDAP exemplifies a "bottom up" standards process, whereas GTS is "top down". Both are tales of success at achieving specific purposes, yet each is hampered by technical limitations. These limitations sometimes lead to controversy over whether alternative technological directions should be pursued. Finally, we draw general conclusions regarding the factors that affect the success of a standards development effort - the likelihood that an IT standard will meet its design goals and will achieve community-wide acceptance. We believe that a higher level of thoughtful awareness by the scientists, program managers and technology experts of the vital role of standards and the merits of alternative standards processes can help us as a community to

  20. Interoperability of Services in an Open Broadband Market : Cases from the Netherlands

    NARCIS (Netherlands)

    Burgmeijer, J.

    2006-01-01

    End-to-end interoperability of broadband services and networks is a condition for an open broadband market. A business model for broadband service interoperability is given. Two cases from the Netherlands, of initiatives from the market to reach interoperability, are presented: E-norm and FIST VoIP.

  1. Interoperability

    DEFF Research Database (Denmark)

    Savin, Andrej

…be limited. Fourth, data protection "by design" would be distinguished from data protection "by default". Fifth, new fundamental rights would be introduced and the old ones clarified. Sixth, new rules on controllers' and processors' duties, on supervisory authorities and on sanctions would be introduced. Finally, the Commission would obtain significant new powers to adopt delegated acts. This appendix explores the impact that the proposed Regulation might have on the interoperability of user-generated services. Since the proposed Regulation is an instrument of high complexity, only those provisions of direct relevance for the project and Work Package 5 will be analysed here.

  2. Restructuring an EHR system and the Medical Markup Language (MML) standard to improve interoperability by archetype technology.

    Science.gov (United States)

    Kobayashi, Shinji; Kume, Naoto; Yoshihara, Hiroyuki

    2015-01-01

In 2001, we developed an EHR system for regional healthcare information exchange and to provide individual patient data to patients. This system was adopted in three regions in Japan. We also developed a Medical Markup Language (MML) standard for inter- and intra-hospital communications. The system was built on a legacy platform, however, and had not been appropriately maintained or updated to meet clinical requirements. To reduce future maintenance costs, we reconstructed the EHR system using archetype technology on the Ruby on Rails platform, and generated MML-equivalent forms from archetypes. The system was deployed as a cloud-based system for preliminary use as a regional EHR. The system can now keep up with new requirements while maintaining semantic interoperability through archetype technology. It is also more flexible than the legacy EHR system.

  3. Towards sustainability: An interoperability outline for a Regional ARC based infrastructure in the WLCG and EGEE infrastructures

    International Nuclear Information System (INIS)

    Field, L; Gronager, M; Johansson, D; Kleist, J

    2010-01-01

Interoperability of grid infrastructures is becoming increasingly important with the emergence of large-scale grid infrastructures based on national and regional initiatives. To achieve interoperability of grid infrastructures, adaptations and bridging of many different systems and services need to be tackled. A grid infrastructure offers services for authentication, authorization, accounting, monitoring and operation, besides the services for handling data and computations. This paper presents an outline of the work done to integrate the Nordic Tier-1 and Tier-2s, which for the compute part are based on the ARC middleware, into the WLCG grid infrastructure co-operated by the EGEE project. In particular, a thorough description of the integration of the compute services is presented.

  4. Benefit quantification of interoperability in coordinate metrology

    DEFF Research Database (Denmark)

    Savio, E.; Carmignato, S.; De Chiffre, Leonardo

    2014-01-01

One of the factors contributing to limited reproducibility of coordinate measurements is the use of different inspection software. Time-consuming efforts for translation of part programmes are sometimes needed, and interoperability of inspection equipment has the potential to reduce these inefficiencies. The paper presents a methodology for an economic evaluation of interoperability benefits with respect to the verification of geometrical product specifications. It requires input data from testing and inspection activities, as well as information on training of personnel and licensing of software.

  5. Interoperabilidad de sistemas de organización del conocimiento: el estado del arte / Interoperability of knowledge organization systems: the state of the art

    Directory of Open Access Journals (Sweden)

    Ana M. Martínez Tamayo

    2011-06-01

The interoperability of knowledge organization systems (KOS) has become very important in recent years, in order to facilitate simultaneous searches of several databases or to merge different databases into one. The new standards for KOS design and development, the American Z39.19:2005 and the British BS 8723-4:2007, include detailed recommendations for interoperability. A new ISO standard on thesauri and interoperability, ISO 25964-1, is also in preparation and will be added to the above. The available technology provides tools for this purpose, such as formats and functional requirements for subject authorities, and the Semantic Web tools RDF/OWL, SKOS Core and XML. On the other hand, it is currently very difficult to design and develop new KOS owing to economic constraints, so interoperability makes it possible to take advantage of existing KOS. This paper reviews the basic concepts, the models and methods recommended by the standards, as well as numerous documented experiences of interoperability between KOS.

  6. Interoperability, Enterprise Architectures, and IT Governance in Government

    OpenAIRE

Scholl, Hans; Kubicek, Herbert; Cimander, Ralf

    2011-01-01

Part 4: Architecture, Security and Interoperability. Government represents a unique, and also uniquely complex, environment for the interoperation of information systems as well as for the integration of workflows and processes across governmental levels and branches. While private-sector organizations by and large have the capacity to implement "enterprise architectures" in a relatively straightforward fashion, for notable reasons governments do not enjoy such luxury. For thi...

  7. Accelerating cancer systems biology research through Semantic Web technology.

    Science.gov (United States)

    Wang, Zhihui; Sagotsky, Jonathan; Taylor, Thomas; Shironoshita, Patrick; Deisboeck, Thomas S

    2013-01-01

Cancer systems biology is an interdisciplinary, rapidly expanding research field in which collaborations are a critical means to advance the field. Yet the prevalent database technologies often isolate data rather than making it easily accessible. The Semantic Web has the potential to help facilitate web-based collaborative cancer research by presenting data in a manner that is self-descriptive, human and machine readable, and easily sharable. We have created a semantically linked online Digital Model Repository (DMR) for storing, managing, executing, annotating, and sharing computational cancer models. Within the DMR, distributed, multidisciplinary, and inter-organizational teams can collaborate on projects, without forfeiting intellectual property. This is achieved by the introduction of a new stakeholder to the collaboration workflow, the institutional licensing officer, part of the Technology Transfer Office. Furthermore, the DMR has achieved silver level compatibility with the National Cancer Institute's caBIG, so users can interact with the DMR not only through a web browser but also through a semantically annotated and secure web service. We also discuss the technology behind the DMR leveraging the Semantic Web, ontologies, and grid computing to provide secure inter-institutional collaboration on cancer modeling projects, online grid-based execution of shared models, and the collaboration workflow protecting researchers' intellectual property.

  8. Modeling and interoperability of heterogeneous genomic big data for integrative processing and querying.

    Science.gov (United States)

    Masseroli, Marco; Kaitoua, Abdulrahman; Pinoli, Pietro; Ceri, Stefano

    2016-12-01

While a huge amount of (epi)genomic data of multiple types is becoming available through Next Generation Sequencing (NGS) technologies, the most important emerging problem is the so-called tertiary analysis, concerned with sense making, e.g., discovering how different (epi)genomic regions and their products interact and cooperate with each other. We propose a paradigm shift in tertiary analysis, based on the use of the Genomic Data Model (GDM), a simple data model which links genomic feature data to their associated experimental, biological and clinical metadata. GDM encompasses all the data formats which have been produced for feature extraction from (epi)genomic datasets. We specifically describe the mapping to GDM of the SAM (Sequence Alignment/Map), VCF (Variant Call Format), NARROWPEAK (for called peaks produced by NGS ChIP-seq or DNase-seq methods), and BED (Browser Extensible Data) formats, but GDM supports as well all the formats describing experimental datasets (e.g., including copy number variations, DNA somatic mutations, or gene expressions) and annotations (e.g., regarding transcription start sites, genes, enhancers or CpG islands). We downloaded and integrated samples of all the above-mentioned data types and formats from multiple sources. The GDM is able to homogeneously describe semantically heterogeneous data and lays the ground for data interoperability, e.g., achieved through the GenoMetric Query Language (GMQL), a high-level, declarative query language for genomic big data. The combined use of the data model and the query language allows comprehensive processing of multiple heterogeneous data, and supports the development of domain-specific data-driven computations and bio-molecular knowledge discovery.
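The core GDM idea, pairing region coordinates with free-form metadata, can be sketched for the BED case as below. The attribute names are illustrative, not the model's normative schema, and the sample metadata values are invented.

```python
def bed_to_gdm(bed_line, metadata):
    """Map one tab-separated BED record onto a GDM-style
    (region, metadata) pair. Field names here are illustrative."""
    chrom, start, stop, *rest = bed_line.strip().split("\t")
    region = {
        "chr": chrom,
        "left": int(start),    # BED coordinates are 0-based, half-open
        "right": int(stop),
        "strand": rest[2] if len(rest) > 2 else "*",
    }
    return region, dict(metadata)

region, meta = bed_to_gdm(
    "chr1\t1000\t1500\tpeak1\t900\t+",
    {"cell_line": "K562", "assay": "ChIP-seq", "antibody_target": "CTCF"},
)
print(region["left"], region["strand"], meta["assay"])
```

Once every format is reduced to such pairs, a query language like GMQL can operate uniformly on regions and metadata regardless of the original file type.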

  9. Enhancing acronym/abbreviation knowledge bases with semantic information.

    Science.gov (United States)

    Torii, Manabu; Liu, Hongfang

    2007-10-11

In the biomedical domain, a terminology knowledge base that associates acronyms/abbreviations (denoted as SFs) with their definitions (denoted as LFs) is highly needed. For the construction of such a terminology knowledge base, we investigate the feasibility of building a system that automatically assigns semantic categories to LFs extracted from text. Given a collection of pairs (SF, LF) derived from text, we i) assess the coverage of LFs and pairs (SF, LF) in the UMLS and justify the need for a semantic category assignment system; and ii) automatically derive name phrases annotated with semantic category and construct a system using machine learning. Utilizing ADAM, an existing collection of (SF, LF) pairs extracted from MEDLINE, our system achieved an f-measure of 87% when assigning eight UMLS-based semantic groups to LFs. The system has been incorporated into a web interface which integrates SF knowledge from multiple SF knowledge bases. Web site: http://gauss.dbb.georgetown.edu/liblab/SFThesurus.
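The category-assignment task can be illustrated with a toy stand-in for the paper's machine-learned classifier: score each UMLS-style semantic group by cue-word overlap with the long form. The cue lists and group names below are invented for illustration, not derived from ADAM or the UMLS.

```python
# Illustrative cue words per semantic group (invented, not from UMLS/ADAM).
CUES = {
    "Chemicals & Drugs": {"acid", "inhibitor", "receptor", "protein"},
    "Disorders": {"syndrome", "disease", "deficiency", "infarction"},
    "Procedures": {"therapy", "imaging", "resection", "assay"},
}

def assign_group(long_form):
    """Assign the semantic group sharing the most cue words with the LF."""
    words = set(long_form.lower().split())
    best = max(CUES, key=lambda g: len(words & CUES[g]))
    return best if words & CUES[best] else None

pairs = [("AMI", "acute myocardial infarction"),
         ("GABA", "gamma-aminobutyric acid")]
labels = [(sf, assign_group(lf)) for sf, lf in pairs]
print(labels)
```

A real system replaces the hand-written cue sets with features learned from annotated name phrases, but the input/output contract is the same: long form in, semantic group out.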

  10. Semantic Web-based digital, field and virtual geological

    Science.gov (United States)

    Babaie, H. A.

    2012-12-01

Digital, field and virtual Semantic Web-based education (SWBE) of geological mapping requires the construction of a set of searchable, reusable, and interoperable digital learning objects (LO) for learners, teachers, and authors. These self-contained units of learning may be text, image, or audio, describing, for example, how to calculate the true dip of a layer from two structural contours or find the apparent dip along a line of section. A collection of multi-media LOs can be integrated, through domain and task ontologies, with mapping-related learning activities and Web services, for example, to search for the description of lithostratigraphic units in an area, or to plot orientation data on a stereonet. Domain ontologies (e.g., GeologicStructure, Lithostratigraphy, Rock) represent knowledge in formal languages (RDF, OWL) by explicitly specifying concepts, relations, and theories involved in geological mapping. These ontologies are used by task ontologies that formalize the semantics of computational tasks (e.g., measuring the true thickness of a formation) and activities (e.g., construction of cross section) for all actors to solve specific problems (making map, instruction, learning support, authoring). A SWBE system for geological mapping should also involve ontologies to formalize teaching strategy (pedagogical styles), learner model (e.g., for student performance, personalization of learning), interface (entry points for activities of all actors), communication (exchange of messages among different components and actors), and educational Web services (for interoperability). In this ontology-based environment, actors interact with the LOs through educational servers, that manage (reuse, edit, delete, store) ontologies, and through tools which communicate with Web services to collect resources and links to other tools. Digital geological mapping involves a location-based, spatial organization of geological elements in a set of GIS thematic layers. Each layer
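The apparent-dip computation mentioned as an example learning object follows the standard structural-geology relation tan(apparent dip) = tan(true dip) x sin(angle between the section line and strike). A small sketch of such a computational task:

```python
import math

def apparent_dip(true_dip_deg, angle_from_strike_deg):
    """Apparent dip along a section line oblique to strike, using
    tan(apparent) = tan(true) * sin(angle between section and strike)."""
    t = math.tan(math.radians(true_dip_deg))
    s = math.sin(math.radians(angle_from_strike_deg))
    return math.degrees(math.atan(t * s))

print(round(apparent_dip(45, 90), 1))  # section normal to strike -> 45.0
print(round(apparent_dip(45, 30), 1))  # oblique section -> 26.6
```

In the architecture described above, such a function would sit behind an educational Web service, with a task ontology formalizing its inputs and outputs.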

  11. Design and Applications of a GeoSemantic Framework for Integration of Data and Model Resources in Hydrologic Systems

    Science.gov (United States)

    Elag, M.; Kumar, P.

    2016-12-01

Hydrologists today have to integrate resources such as data and models, which originate and reside in multiple autonomous and heterogeneous repositories over the Web. Several resource management systems have emerged within geoscience communities for sharing long-tail data, which are collected by individual or small research groups, and long-tail models, which are developed by scientists or small modeling communities. While these systems have increased the availability of resources within geoscience domains, deficiencies remain due to the heterogeneity in the methods used to describe, encode, and publish information about resources over the Web. This heterogeneity limits our ability to access the right information in the right context so that it can be efficiently retrieved and understood without the hydrologist's mediation. A primary challenge of the Web today is the lack of semantic interoperability among the massive number of resources, which already exist and are continually being generated at rapid rates. To address this challenge, we have developed a decentralized GeoSemantic (GS) framework, which provides three sets of micro-web services to support (i) semantic annotation of resources, (ii) semantic alignment between the metadata of two resources, and (iii) semantic mediation among Standard Names. Here we present the design of the framework and demonstrate its application for semantic integration between data and models used in the IML-CZO. First we show how the IML-CZO data are annotated using the Semantic Annotation Services. Then we illustrate how the Resource Alignment Services and Knowledge Integration Services are used to create a semantic workflow between the TopoFlow model, a spatially distributed hydrologic model, and the annotated data. Results of this work are (i) a demonstration of how the GS framework advances the integration of heterogeneous data and models of water-related disciplines by seamless handling of their semantic
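The mediation idea behind service (iii) can be sketched as follows: both a data source and a model map their local variable names to a shared Standard Name, so an alignment service can pair them automatically. The registries and names below are illustrative stand-ins, not the actual CSDMS or CF standard-name tables.

```python
# Local-name -> shared standard-name mappings (illustrative).
DATA_TO_STANDARD = {"precip_mm": "atmosphere_water__precipitation_rate"}
MODEL_TO_STANDARD = {"P": "atmosphere_water__precipitation_rate"}

def align(data_vars, model_vars):
    """Return (data_name, model_name) pairs sharing a standard name."""
    inverse = {MODEL_TO_STANDARD[m]: m
               for m in model_vars if m in MODEL_TO_STANDARD}
    return [(d, inverse[DATA_TO_STANDARD[d]])
            for d in data_vars
            if DATA_TO_STANDARD.get(d) in inverse]

pairs = align(["precip_mm"], ["P"])
print(pairs)
```

Because neither side needs to know the other's vocabulary, new data sources or models join the workflow by publishing one mapping each, rather than pairwise translations.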

  12. Semantic Multimedia

    NARCIS (Netherlands)

    S. Staab; A. Scherp; R. Arndt; R. Troncy (Raphael); M. Grzegorzek; C. Saathoff; S. Schenk; L. Hardman (Lynda)

    2008-01-01

Multimedia constitutes an interesting field of application for Semantic Web and Semantic Web reasoning, as the access and management of multimedia content and context depend strongly on the semantic descriptions of both. At the same time, multimedia resources constitute complex objects,

  13. Evaluation of Enterprise Architecture Interoperability

    National Research Council Canada - National Science Library

    Jamison, Theresa A; Niska, Brice T; Layman, Phillip A; Whitney, Steven P

    2005-01-01

    ...), which describes these architectures. The purpose of this project, suggested by Air Force Space Command, was to examine the value of existing analytical tools in making an interoperability assessment of individual enterprises, as well...

  14. The DFG Viewer for Interoperability in Germany

    Directory of Open Access Journals (Sweden)

    Ralf Goebel

    2010-02-01

This article deals with the DFG Viewer for Interoperability, a free and open source web-based viewer for digitised books, and assesses its relevance for interoperability in Germany. First the specific situation in Germany is described, including the important role of the Deutsche Forschungsgemeinschaft (German Research Foundation). The article then moves on to the overall concept of the viewer and its technical background. It introduces the data formats and standards used, briefly illustrates how the viewer works, and includes a few examples.

  15. SCALEUS: Semantic Web Services Integration for Biomedical Applications.

    Science.gov (United States)

    Sernadela, Pedro; González-Castro, Lorena; Oliveira, José Luís

    2017-04-01

In recent years, we have witnessed an explosion of biological data resulting largely from the demands of life science research. The vast majority of these data are freely available via diverse bioinformatics platforms, including relational databases and conventional keyword search applications. This type of approach has achieved great results in the last few years, but proves unfeasible when information needs to be combined or shared among different and scattered sources. During recent years, many of these data distribution challenges have been solved with the adoption of the semantic web. Despite the evident benefits of this technology, its adoption introduced new challenges related to the migration process from existing systems to the semantic level. To facilitate this transition, we have developed SCALEUS, a semantic web migration tool that can be deployed on top of traditional systems in order to bring knowledge, inference rules, and query federation to the existent data. Targeted at the biomedical domain, this web-based platform offers, in a single package, straightforward data integration and semantic web services that help developers and researchers in the creation of new semantically enhanced information systems. SCALEUS is available as open source at http://bioinformatics-ua.github.io/scaleus/.
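The basic migration step such a tool automates, lifting relational rows into RDF-style triples, can be sketched as below. This is not SCALEUS's actual API; the URI scheme and table names are assumptions for illustration.

```python
def row_to_triples(table, pk_col, row, base="http://example.org/"):
    """Lift one relational row into subject-predicate-object triples.
    The primary key becomes the subject URI; other columns become
    predicates with literal objects. URIs are illustrative."""
    subject = f"<{base}{table}/{row[pk_col]}>"
    return [
        (subject, f"<{base}vocab/{col}>", f'"{val}"')
        for col, val in row.items() if col != pk_col
    ]

triples = row_to_triples(
    "patient", "id", {"id": 42, "gene": "BRCA2", "variant": "c.68-7T>A"}
)
for s, p, o in triples:
    print(s, p, o, ".")
```

Once data are expressed as triples, inference rules and federated SPARQL queries can be layered on top without touching the original relational system.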

  16. On the formal definition of the systems' interoperability capability: an anthropomorphic approach

    Science.gov (United States)

    Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav

    2017-03-01

The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to the interoperability problems. In response to this, the problem of systems' interoperability is revisited by taking into account different aspects of philosophy, psychology, linguistics and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. The capability to interoperate, as a property of the system, is then defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (assumedly, a message from any other system), make an informed decision about this perception and, consequently, articulate a meaningful and useful action or response based on this decision. Although this capability is defined on the basis of the existing interoperability theories, the proposed approach to its definition excludes the assumption of awareness of the co-existence of the two interoperating systems. Thus, it establishes links between research on the interoperability of systems and on intelligent software agents, as one of the systems' digital identities.
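The sense, perceive, decide, act chain that the definition describes can be sketched as a minimal pipeline. The JSON message shape and handler logic are invented for illustration; the paper defines the capability abstractly, not this code.

```python
import json

class InteroperableSystem:
    """Anthropomorphic interoperability sketch: a system that can sense,
    perceive, decide, and act on a stimulus from its environment."""

    def sense(self, raw):          # receive the stimulus
        return raw.strip()

    def perceive(self, stimulus):  # assign structure/meaning to it
        return json.loads(stimulus)

    def decide(self, perception):  # informed decision about the perception
        return "ack" if perception.get("type") == "reading" else "reject"

    def act(self, decision):       # articulate a meaningful response
        return json.dumps({"response": decision})

    def interoperate(self, raw):
        return self.act(self.decide(self.perceive(self.sense(raw))))

reply = InteroperableSystem().interoperate('{"type": "reading", "value": 7}')
print(reply)
```

A failure at any stage, an unparseable stimulus, say, breaks the chain, which mirrors the paper's point that interoperability is a composite capability rather than a single property.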

  17. QTLTableMiner++: semantic mining of QTL tables in scientific articles.

    Science.gov (United States)

    Singh, Gurnoor; Kuzniar, Arnold; van Mulligen, Erik M; Gavai, Anand; Bachem, Christian W; Visser, Richard G F; Finkers, Richard

    2018-05-25

A quantitative trait locus (QTL) is a genomic region that correlates with a phenotype. Most of the experimental information about QTL mapping studies is described in tables of scientific publications. Traditional text mining techniques aim to extract information from unstructured text rather than from tables. We present QTLTableMiner++ (QTM), a table mining tool that extracts and semantically annotates QTL information buried in (heterogeneous) tables of the plant science literature. QTM is a command line tool written in the Java programming language. It takes scientific articles from the Europe PMC repository as input and extracts QTL tables using keyword matching and ontology-based concept identification. The tables are further normalized using rules derived from table properties such as captions, column headers and table footers. Furthermore, table columns are classified into three categories, namely column descriptors, properties and values, based on column headers and the data types of cell entries. Abbreviations found in the tables are expanded using the Schwartz and Hearst algorithm. Finally, the content of QTL tables is semantically enriched with domain-specific ontologies (e.g. Crop Ontology, Plant Ontology and Trait Ontology) using the Apache Solr search platform, and the results are stored in a relational database and a text file. The performance of QTM was assessed by precision and recall based on the information retrieved from two manually annotated corpora of open access articles, i.e. QTL mapping studies in tomato (Solanum lycopersicum) and in potato (S. tuberosum). In summary, QTM detected QTL statements in tomato with 74.53% precision and 92.56% recall, and in potato with 82.82% precision and 98.94% recall. QTM is a unique tool that aids in providing QTL information in machine-readable and semantically interoperable formats.
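The precision and recall figures quoted above follow the usual definitions over retrieved versus gold-standard items; the toy counts below are invented, not the paper's corpora.

```python
def precision_recall(retrieved, relevant):
    """Precision = TP/retrieved, recall = TP/relevant, for two sets of
    extracted items (here, QTL statement identifiers)."""
    tp = len(retrieved & relevant)
    return tp / len(retrieved), tp / len(relevant)

# Toy example: 8 statements extracted, 6 correct, 10 in the gold corpus.
retrieved = set(range(8))
relevant = set(range(2, 12))
p, r = precision_recall(retrieved, relevant)
print(round(p, 2), round(r, 2))
```

With 6 true positives out of 8 retrieved and 10 relevant, this yields precision 0.75 and recall 0.6.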

  18. Semantically supporting data discovery, markup and aggregation in the European Marine Observation and Data Network (EMODnet)

    Science.gov (United States)

    Lowry, Roy; Leadbetter, Adam

    2014-05-01

    The semantic content of the NERC Vocabulary Server (NVS) has been developed over thirty years. It has been used to mark up metadata and data in a wide range of international projects, including the European Commission (EC) Framework Programme 7 projects SeaDataNet and The Open Service Network for Marine Environmental Data (NETMAR). Within the United States, the National Science Foundation projects Rolling Deck to Repository and Biological & Chemical Data Management Office (BCO-DMO) use concepts from NVS for markup. Further, typed relationships link NVS concepts to terms served by the Marine Metadata Interoperability Ontology Registry and Repository. The largest single block of concepts publicly served from NVS (35% of ~82,000) forms the British Oceanographic Data Centre (BODC) Parameter Usage Vocabulary (PUV). The PUV is instantiated on the NVS as a SKOS concept collection. These terms are used to describe the individual channels in data and metadata served by, for example, BODC, SeaDataNet and BCO-DMO. The PUV terms are designed to be very precise and may contain a high level of detail. Some users have reported that the PUV is difficult to navigate due to its size and complexity (a problem CSIRO have begun to address by deploying a SISSVoc interface to the NVS), and it has been difficult to aggregate data because multiple PUV terms can, with full validity, be used to describe the same data channels. Better approaches to data aggregation are required as a use case for the PUV from the EC European Marine Observation and Data Network (EMODnet) Chemistry project. One solution, proposed and demonstrated during the course of the NETMAR project, is to build new SKOS concept collections which formalise the desired aggregations for given applications and use typed relationships to state which PUV concepts contribute to a specific aggregation. Development of these new collections requires input from a group of experts in the application domain who can decide which PUV
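The aggregation approach described (new SKOS-style concept collections whose typed membership states which PUV terms contribute to an aggregation) can be sketched in plain Python. The term identifiers and labels below are hypothetical, not actual P01 codes:

```python
# Hypothetical PUV-style terms: several precise terms can validly
# describe the same measured quantity at different levels of detail.
puv = {
    "SDN:P01::TEMPPR01": "Temperature of the water body",
    "SDN:P01::TEMPST01": "Temperature of the water body by CTD",
    "SDN:P01::PSALST01": "Practical salinity of the water body by CTD",
}

# An application-specific concept collection formalising one
# aggregation: every member maps onto the same broader quantity.
aggregations = {
    "AGG:TEMP": {
        "prefLabel": "Water temperature (aggregated)",
        # typed relationship from aggregation to contributing PUV terms
        "members": {"SDN:P01::TEMPPR01", "SDN:P01::TEMPST01"},
    },
}

def aggregate(channels):
    """Map each data channel's PUV term to its aggregation, if any."""
    index = {term: agg for agg, spec in aggregations.items()
             for term in spec["members"]}
    return {ch: index.get(term) for ch, term in channels.items()}

channels = {"chan1": "SDN:P01::TEMPPR01", "chan2": "SDN:P01::TEMPST01",
            "chan3": "SDN:P01::PSALST01"}
print(aggregate(channels))
# chan1 and chan2 resolve to AGG:TEMP; chan3 has no aggregation
```

In RDF terms, `aggregations` plays the role of a skos:Collection with skos:member links; the point is that aggregation logic lives in curated data, not in per-application code.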

  19. An Interoperable System toward Cardiac Risk Stratification from ECG Monitoring

    Directory of Open Access Journals (Sweden)

    Cristina Soguero-Ruiz

    2018-03-01

    Full Text Available Many indices have been proposed for cardiovascular risk stratification from electrocardiogram signal processing, still with limited use in clinical practice. We created a system integrating the clinical definition of cardiac risk subdomains from ECGs and the use of diverse signal processing techniques. Three subdomains were defined from the joint analysis of the technical and clinical viewpoints. One subdomain was devoted to demographic and clinical data. The other two subdomains were intended to obtain widely defined risk indices from ECG monitoring: a simple-domain (heart rate turbulence (HRT)), and a complex-domain (heart rate variability (HRV)). Data provided by the three subdomains allowed for the generation of alerts with different intensity and nature, as well as for the grouping and scrutinization of patients according to the established processing and risk-thresholding criteria. The implemented system was tested by connecting data from real-world in-hospital electronic health records and ECG monitoring by considering standards for syntactic (HL7 messages) and semantic interoperability (archetypes based on CEN/ISO EN13606 and SNOMED-CT). The system was able to provide risk indices and to generate alerts in the health records to support decision-making. Overall, the system allows for the agile interaction of research and clinical practice in the Holter-ECG-based cardiac risk domain.

  20. An Interoperable System toward Cardiac Risk Stratification from ECG Monitoring

    Science.gov (United States)

    Mora-Jiménez, Inmaculada; Ramos-López, Javier; Quintanilla Fernández, Teresa; García-García, Antonio; Díez-Mazuela, Daniel; García-Alberola, Arcadi

    2018-01-01

    Many indices have been proposed for cardiovascular risk stratification from electrocardiogram signal processing, still with limited use in clinical practice. We created a system integrating the clinical definition of cardiac risk subdomains from ECGs and the use of diverse signal processing techniques. Three subdomains were defined from the joint analysis of the technical and clinical viewpoints. One subdomain was devoted to demographic and clinical data. The other two subdomains were intended to obtain widely defined risk indices from ECG monitoring: a simple-domain (heart rate turbulence (HRT)), and a complex-domain (heart rate variability (HRV)). Data provided by the three subdomains allowed for the generation of alerts with different intensity and nature, as well as for the grouping and scrutinization of patients according to the established processing and risk-thresholding criteria. The implemented system was tested by connecting data from real-world in-hospital electronic health records and ECG monitoring by considering standards for syntactic (HL7 messages) and semantic interoperability (archetypes based on CEN/ISO EN13606 and SNOMED-CT). The system was able to provide risk indices and to generate alerts in the health records to support decision-making. Overall, the system allows for the agile interaction of research and clinical practice in the Holter-ECG-based cardiac risk domain. PMID:29494497

  1. Plugfest 2009: Global Interoperability in Telerobotics and Telemedicine.

    Science.gov (United States)

    King, H Hawkeye; Hannaford, Blake; Kwok, Ka-Wai; Yang, Guang-Zhong; Griffiths, Paul; Okamura, Allison; Farkhatdinov, Ildar; Ryu, Jee-Hwan; Sankaranarayanan, Ganesh; Arikatla, Venkata; Tadano, Kotaro; Kawashima, Kenji; Peer, Angelika; Schauß, Thomas; Buss, Martin; Miller, Levi; Glozman, Daniel; Rosen, Jacob; Low, Thomas

    2010-05-07

    Despite the great diversity of teleoperator designs and applications, their underlying control systems have many similarities. These similarities can be exploited to enable interoperability between heterogeneous systems. We have developed a network data specification, the Interoperable Telerobotics Protocol, that can be used for Internet-based control of a wide range of teleoperators. In this work we test interoperable telerobotics on the global Internet, focusing on the telesurgery application domain. Fourteen globally dispersed telerobotic master and slave systems were connected in thirty trials over one twenty-four-hour period. Users performed common manipulation tasks to demonstrate effective master-slave operation. With twenty-eight (93%) successful, unique connections, the results show a high potential for standardizing telerobotic operation. Furthermore, new paradigms for telesurgical operation and training are presented, including a networked surgery trainer and upper-limb exoskeleton control of micro-manipulators.
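The abstract does not define the Interoperable Telerobotics Protocol's wire format, but the idea of a shared network data specification can be illustrated with a hypothetical fixed-layout command datagram; the fields and layout here are assumptions for illustration only:

```python
import struct

# Hypothetical teleoperation datagram (NOT the actual ITP spec):
# uint32 sequence number, float64 timestamp, then Cartesian position
# (x, y, z) and a gripper angle as float32, all little-endian.
FMT = "<Id4f"

def pack_command(seq, t, x, y, z, grip):
    """Serialize one master-to-slave command into a fixed-size payload."""
    return struct.pack(FMT, seq, t, x, y, z, grip)

def unpack_command(payload):
    """Parse a payload back into a command, regardless of which
    vendor's master produced it (the shared layout is the contract)."""
    seq, t, x, y, z, grip = struct.unpack(FMT, payload)
    return {"seq": seq, "t": t, "pos": (x, y, z), "grip": grip}

msg = pack_command(42, 12.5, 0.1, -0.2, 0.3, 0.5)
state = unpack_command(msg)
```

A fixed, documented layout like this is what lets heterogeneous master and slave implementations interoperate over plain UDP or TCP sockets.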

  2. Varieties of semantic 'access' deficit in Wernicke's aphasia and semantic aphasia.

    Science.gov (United States)

    Thompson, Hannah E; Robson, Holly; Lambon Ralph, Matthew A; Jefferies, Elizabeth

    2015-12-01

    Comprehension deficits are common in stroke aphasia, including in cases with (i) semantic aphasia, characterized by poor executive control of semantic processing across verbal and non-verbal modalities; and (ii) Wernicke's aphasia, associated with poor auditory-verbal comprehension and repetition, plus fluent speech with jargon. However, the varieties of these comprehension problems, and their underlying causes, are not well understood. Both patient groups exhibit some type of semantic 'access' deficit, as opposed to the 'storage' deficits observed in semantic dementia. Nevertheless, existing descriptions suggest that these patients might have different varieties of 'access' impairment: difficulty resolving competition (in semantic aphasia) versus difficulty with the initial activation of concepts from sensory inputs (in Wernicke's aphasia). We used a case series design to compare patients with Wernicke's aphasia and those with semantic aphasia on Warrington's paradigmatic assessment of semantic 'access' deficits. In these verbal and non-verbal matching tasks, a small set of semantically-related items are repeatedly presented over several cycles so that the target on one trial becomes a distractor on another (building up interference and eliciting semantic 'blocking' effects). Patients with Wernicke's aphasia and semantic aphasia were distinguished according to lesion location in the temporal cortex, but in each group, some individuals had additional prefrontal damage. Both of these aspects of lesion variability (one that mapped onto classical 'syndromes' and one that did not) predicted aspects of the semantic 'access' deficit. Both semantic aphasia and Wernicke's aphasia cases showed multimodal semantic impairment, although as expected, the Wernicke's aphasia group showed greater deficits on auditory-verbal than picture judgements. Distribution of damage in the temporal lobe was crucial for predicting the initially 'beneficial' effects of stimulus repetition: cases with

  3. Pemanfaatan Google API Untuk Model Interoperability Web Berbasis PHP Dengan Google Drive

    OpenAIRE

    Sumiari, Ni Kadek

    2015-01-01

    In a website, achieving the interoperability of a system is very important. The use of databases based on MySQL, SQL Server or Oracle is already very common in website-based systems. However, the use of such databases cannot guarantee that the interoperability of the system is achieved. Beyond data-security concerns, implementation of such a system is also quite difficult. One solution for achieving the interoperability of a website-based system is...

  4. Generative Semantics.

    Science.gov (United States)

    King, Margaret

    The first section of this paper deals with the attempts within the framework of transformational grammar to make semantics a systematic part of linguistic description, and outlines the characteristics of the generative semantics position. The second section takes a critical look at generative semantics in its later manifestations, and makes a case…

  5. Establishing Interoperability of a Blog Archive through Linked Open Data

    DEFF Research Database (Denmark)

    Kalb, Hendrik; Lazaridou, Paraskevi; Trier, Matthias

    2013-01-01

    on archived data. However, interoperability among BlogForever archives, as well as with other digital libraries, is necessary in order to avoid silos of data. In this paper, we reveal some of our efforts to establish interoperability through the application of Linked Open Data.

  6. A common type system for clinical natural language processing.

    Science.gov (United States)

    Wu, Stephen T; Kaggal, Vinod C; Dligach, Dmitriy; Masanz, James J; Chen, Pei; Becker, Lee; Chapman, Wendy W; Savova, Guergana K; Liu, Hongfang; Chute, Christopher G

    2013-01-03

    One challenge in reusing clinical data stored in electronic medical records is that these data are heterogeneous. Clinical Natural Language Processing (NLP) plays an important role in transforming information in clinical text to a standard representation that is comparable and interoperable. Information may be processed and shared when a type system specifies the allowable data structures. Therefore, we aim to define a common type system for clinical NLP that enables interoperability between structured and unstructured data generated in different clinical settings. We describe a common type system for clinical NLP that has an end target of deep semantics based on Clinical Element Models (CEMs), thus interoperating with structured data and accommodating diverse NLP approaches. The type system has been implemented in UIMA (Unstructured Information Management Architecture) and is fully functional in a popular open-source clinical NLP system, cTAKES (clinical Text Analysis and Knowledge Extraction System) versions 2.0 and later. We have created a type system that targets deep semantics, thereby allowing for NLP systems to encapsulate knowledge from text and share it alongside heterogeneous clinical data sources. Rather than surface semantics that are typically the end product of NLP algorithms, CEM-based semantics explicitly build in deep clinical semantics as the point of interoperability with more structured data types.
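A common type system of this kind can be illustrated with a toy sketch: if every NLP engine emits the same annotation types, targeting a deep-semantic (CEM-like) template, structured and unstructured sources become comparable. The type names and codes below are illustrative, not cTAKES's actual UIMA types:

```python
from dataclasses import dataclass
from typing import Optional

# A toy "common type system": every NLP engine emits the same
# annotation types, so downstream consumers interoperate.
@dataclass
class OntologyConcept:          # normalization to a coding scheme
    scheme: str                 # e.g. "RxNorm" (illustrative)
    code: str

@dataclass
class MedicationMention:        # deep-semantic, CEM-like template
    begin: int                  # character span in the source note
    end: int
    text: str
    concept: Optional[OntologyConcept] = None
    dosage: Optional[str] = None

def to_cem(m: MedicationMention) -> dict:
    """Serialize a mention into a structured, EHR-comparable record."""
    return {"type": "MedicationStatement",
            "code": (m.concept.scheme, m.concept.code) if m.concept else None,
            "dose": m.dosage,
            "span": (m.begin, m.end)}

m = MedicationMention(10, 17, "aspirin",
                      OntologyConcept("RxNorm", "1191"), "81 mg")
rec = to_cem(m)
```

The key design point is that the shared types, not any one engine's internals, are the contract: any tokenizer or named-entity recognizer that fills these structures interoperates with the rest of the pipeline.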

  7. SCC: Semantic Context Cascade for Efficient Action Detection

    KAUST Repository

    Heilbron, Fabian Caba

    2017-11-09

    Despite the recent advances in large-scale video analysis, action detection remains one of the most challenging unsolved problems in computer vision. This snag is in part due to the large volume of data that needs to be analyzed to detect actions in videos. Existing approaches have mitigated the computational cost, but these methods still lack the rich high-level semantics that would help them localize actions quickly. In this paper, we introduce a Semantic Context Cascade (SCC) model that aims to detect actions in long video sequences. By embracing semantic priors associated with human activities, SCC produces high-quality class-specific action proposals and prunes unrelated activities in a cascade fashion. Experimental results on ActivityNet show that SCC achieves state-of-the-art performance for action detection while operating in real time.
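The cascade idea (use cheap semantic priors to prune proposals before the expensive classifier runs) can be sketched as follows; the priors and scores are toy values, not the learned model from the paper:

```python
# Toy semantic priors: context objects that co-occur with each action.
priors = {"cooking": {"kitchen", "pan"}, "surfing": {"sea", "board"}}

def cascade(proposals, action, expensive_scorer):
    """Stage 1 prunes proposals whose context never co-occurs with the
    queried action; stage 2 runs the costly classifier on the rest."""
    surviving = [p for p in proposals
                 if priors[action] & set(p["context"])]
    return [(p["segment"], expensive_scorer(p)) for p in surviving]

proposals = [
    {"segment": (0, 40),  "context": ["kitchen", "pan"]},
    {"segment": (50, 90), "context": ["sea"]},
]
scored = cascade(proposals, "cooking", lambda p: 0.9)
# only the kitchen segment reaches the expensive classifier
```

The saving is multiplicative: if the prior discards most segments, the expensive stage runs on a small fraction of the video.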

  8. Augmenting interoperability across repositories architectural ideas

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    The aDORe digital repository architecture designed and implemented by the Los Alamos Research Library is fully standards-based and highly modular, with the various components of the architecture interacting in a protocol-driven manner. Although aDORe was designed for use in the context of the Los Alamos Library, its modular and standards-based design has led to interesting insights regarding possible new levels of interoperability in a federation of heterogeneous repositories. The presentation will discuss these insights, and will illustrate that attractive federations of repositories can be built by introducing rather basic interoperability requirements. The presentation will also show that, once these requirements are met, a powerful service framework that overlays the federation can emerge.

  9. The quest for standards in medical imaging

    Energy Technology Data Exchange (ETDEWEB)

    Gibaud, Bernard, E-mail: bernard.gibaud@irisa.fr [INSERM, VisAGeS U746 Unit/Project, Faculty of Medicine, Campus de Villejean, F-35043 Rennes (France); INRIA, VisAGeS U746 Unit/Project, IRISA, Campus de Beaulieu, F-35042 Rennes (France); University of Rennes I-CNRS UMR 6074, IRISA, Campus de Beaulieu, F-35042 Rennes (France)

    2011-05-15

    This article focuses on standards supporting interoperability and system integration in the medical imaging domain. We introduce the basic concepts and actors and we review the most salient achievements in this domain, especially with the DICOM standard, and the definition of IHE integration profiles. We analyze and discuss what was successful, and what could still be more widely adopted by industry. We then sketch out a perspective of what should be done next, based on our vision of new requirements for the next decade. In particular, we discuss the challenges of a more explicit sharing of image and image processing semantics, and we discuss the help that semantic web technologies (and especially ontologies) may bring to achieving this goal.

  10. The quest for standards in medical imaging

    International Nuclear Information System (INIS)

    Gibaud, Bernard

    2011-01-01

    This article focuses on standards supporting interoperability and system integration in the medical imaging domain. We introduce the basic concepts and actors and we review the most salient achievements in this domain, especially with the DICOM standard, and the definition of IHE integration profiles. We analyze and discuss what was successful, and what could still be more widely adopted by industry. We then sketch out a perspective of what should be done next, based on our vision of new requirements for the next decade. In particular, we discuss the challenges of a more explicit sharing of image and image processing semantics, and we discuss the help that semantic web technologies (and especially ontologies) may bring to achieving this goal.

  11. Reflections on the role of open source in health information system interoperability.

    Science.gov (United States)

    Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G

    2007-01-01

    This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.

  12. Towards Semantic Web Services on Large, Multi-Dimensional Coverages

    Science.gov (United States)

    Baumann, P.

    2009-04-01

    Observed and simulated data in the Earth Sciences often come as coverages, the general term for space-time varying phenomena as set forth by standardization bodies like the Open GeoSpatial Consortium (OGC) and ISO. Among such data are 1-D time series, 2-D surface data, 3-D surface data time series as well as x/y/z geophysical and oceanographic data, and 4-D metocean simulation results. With increasing dimensionality the data sizes grow exponentially, up to Petabyte object sizes. Open standards for exploiting coverage archives over the Web are available to a varying extent. The OGC Web Coverage Service (WCS) standard defines basic extraction operations: spatio-temporal and band subsetting, scaling, reprojection, and data format encoding of the result - a simple interoperable interface for coverage access. More processing functionality is available with products like Matlab, Grid-type interfaces, and the OGC Web Processing Service (WPS). However, these often lack properties known as advantageous from databases: declarativeness (describe results rather than the algorithms), safe in evaluation (no request can keep a server busy infinitely), and optimizable (enable the server to rearrange the request so as to produce the same result faster). WPS defines a geo-enabled SOAP interface for remote procedure calls. This allows to webify any program, but does not allow for semantic interoperability: a function is identified only by its function name and parameters while the semantics is encoded in the (only human readable) title and abstract. Hence, another desirable property is missing, namely an explicit semantics which allows for machine-machine communication and reasoning a la Semantic Web. The OGC Web Coverage Processing Service (WCPS) language, which has been adopted as an international standard by OGC in December 2008, defines a flexible interface for the navigation, extraction, and ad-hoc analysis of large, multi-dimensional raster coverages. It is abstract in that it
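The database-style properties listed above (declarative, safe in evaluation, optimizable) can be illustrated with a toy evaluator that accepts a request as data rather than as code: the server sees the whole plan, every request terminates, and the plan could be rearranged before execution. The request format is invented for illustration and is not WCPS syntax:

```python
# A tiny coverage as nested lists: coverage[band][time][y][x].
coverage = {
    "red": [[[10, 20], [30, 40]]],
    "nir": [[[50, 60], [70, 80]]],
}

def evaluate(request):
    """Evaluate a declarative request: subset each band at the given
    axis indices, then apply a whitelisted band operation. Safe (no
    loops, always terminates) and optimizable (the server sees the
    whole plan before running anything)."""
    t, y, x = request["subset"]
    red = coverage["red"][t][y][x]
    nir = coverage["nir"][t][y][x]
    ops = {"ndvi": lambda r, n: (n - r) / (n + r)}
    return ops[request["operation"]](red, nir)

result = evaluate({"operation": "ndvi", "subset": (0, 0, 1)})
# (60 - 20) / (60 + 20) = 0.5
```

Because the request names the desired result instead of shipping arbitrary code, the server can reject unknown operations, bound evaluation cost, and reorder subsetting before computation, exactly the properties a WPS-style remote procedure call cannot guarantee.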

  13. Relationship Structures and Semantic Type Assignments of the UMLS Enriched Semantic Network

    Science.gov (United States)

    Zhang, Li; Halper, Michael; Perl, Yehoshua; Geller, James; Cimino, James J.

    2005-01-01

    Objective: The Enriched Semantic Network (ESN) was introduced as an extension of the Unified Medical Language System (UMLS) Semantic Network (SN). Its multiple subsumption configuration and concomitant multiple inheritance make the ESN's relationship structures and semantic type assignments different from those of the SN. A technique for deriving the relationship structures of the ESN's semantic types and an automated technique for deriving the ESN's semantic type assignments from those of the SN are presented. Design: The technique to derive the ESN's relationship structures finds all newly inherited relationships in the ESN. All such relationships are audited for semantic validity, and the blocking mechanism is used to block invalid relationships. The mapping technique to derive the ESN's semantic type assignments uses current SN semantic type assignments and preserves nonredundant categorizations, while preventing new redundant categorizations. Results: Among the 426 newly inherited relationships, 326 are deemed valid. Seven blockings are applied to avoid inheritance of the 100 invalid relationships. Sixteen semantic types have different relationship structures in the ESN as compared to those in the SN. The mapping of semantic type assignments from the SN to the ESN avoids the generation of 26,950 redundant categorizations. The resulting ESN contains 138 semantic types, 149 IS-A links, 7,303 relationships, and 1,013,876 semantic type assignments. Conclusion: The ESN's multiple inheritance provides more complete relationship structures than in the SN. The ESN's semantic type assignments avoid the existing redundant categorizations appearing in the SN and prevent new ones that might arise due to multiple parents. Compared to the SN, the ESN provides a more accurate unifying semantic abstraction of the UMLS Metathesaurus. PMID:16049233
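The derivation of relationship structures under multiple inheritance with blocking can be sketched as follows; the semantic types, relationships, and block are hypothetical, not the actual SN/ESN content:

```python
# Hypothetical IS-A hierarchy with multiple inheritance (child -> parents).
is_a = {
    "Enzyme": ["Biologically Active Substance", "Organic Chemical"],
    "Organic Chemical": ["Chemical"],
    "Biologically Active Substance": ["Substance"],
    "Chemical": [], "Substance": [],
}
# Relationships introduced locally at each semantic type.
local_rels = {
    "Substance": {("interacts_with", "Organism")},
    "Chemical": {("has_structure", "Molecular Structure")},
}
# Blocks suppress inherited relationships audited as semantically invalid.
blocks = {("Enzyme", ("has_structure", "Molecular Structure"))}

def structure(t):
    """Relationship structure of t: local relationships plus everything
    inherited from all parents, minus audited blocks at t."""
    rels = set(local_rels.get(t, set()))
    for parent in is_a.get(t, []):
        rels |= structure(parent)
    return {r for r in rels if (t, r) not in blocks}
```

With multiple parents a type can inherit relationships along several paths; the blocking mechanism is what keeps an invalid inheritance from leaking into a type's derived structure.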

  14. NCI's national environmental research data collection: metadata management built on standards and preparing for the semantic web

    Science.gov (United States)

    Wang, Jingbo; Bastrakova, Irina; Evans, Ben; Gohar, Kashif; Santana, Fabiana; Wyborn, Lesley

    2015-04-01

    National Computational Infrastructure (NCI) manages national environmental research data collections (10+ PB) as part of its specialized high performance data node of the Research Data Storage Infrastructure (RDSI) program. We manage 40+ data collections using NCI's Data Management Plan (DMP), which is compatible with the ISO 19100 metadata standards. We utilize ISO standards to make sure our metadata is transferable and interoperable for sharing and harvesting. The DMP is used along with metadata from the data itself to create a hierarchy of data collection, dataset and time series catalogues that is then exposed through GeoNetwork for standard discoverability. These catalogues are linked using parent-child relationships. The hierarchical infrastructure of our GeoNetwork catalogue system aims to address both discoverability and in-house administrative use cases. At NCI, we are currently improving the metadata interoperability in our catalogue by linking with standardized community vocabulary services. These emerging vocabulary services are being established to help harmonise data from different national and international scientific communities. One such vocabulary service is currently being established by the Australian National Data Service (ANDS). Data citation is another important aspect of the NCI data infrastructure: it allows tracking of data usage and infrastructure investment, encourages data sharing, and increases trust in research that is reliant on these data collections. We incorporate the standard vocabularies into the data citation metadata so that data citations become machine-readable and semantically friendly for web-search purposes as well. By standardizing our metadata structure across our entire data corpus, we are laying the foundation to enable the application of appropriate semantic mechanisms to enhance discovery and analysis of NCI's national environmental research data information. We expect that this will further
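The parent-child catalogue hierarchy (collection, dataset, time series) can be sketched with plain records linked by a parent identifier, in the spirit of ISO 19115's parentIdentifier; the record identifiers here are invented:

```python
# Toy catalogue records: collection -> dataset -> time series, linked
# by a parent identifier (the role ISO 19115 parentIdentifier plays).
records = [
    {"id": "coll-ocean",  "level": "collection", "parent": None},
    {"id": "ds-sst",      "level": "dataset",    "parent": "coll-ocean"},
    {"id": "ts-sst-2014", "level": "timeseries", "parent": "ds-sst"},
    {"id": "ts-sst-2015", "level": "timeseries", "parent": "ds-sst"},
]

def children(parent_id):
    """Downward navigation: all records directly under a parent."""
    return [r["id"] for r in records if r["parent"] == parent_id]

def lineage(rec_id):
    """Upward navigation: walk parent links to the top-level collection."""
    by_id = {r["id"]: r for r in records}
    chain = [rec_id]
    while by_id[chain[-1]]["parent"]:
        chain.append(by_id[chain[-1]]["parent"])
    return chain

print(lineage("ts-sst-2015"))   # ['ts-sst-2015', 'ds-sst', 'coll-ocean']
```

Both directions matter in practice: harvesters walk downward to enumerate datasets, while a citation or discovery query walks upward to attribute a time series to its collection.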

  15. The Role of Markup for Enabling Interoperability in Health Informatics

    Directory of Open Access Journals (Sweden)

    Steve eMckeever

    2015-05-01

    Full Text Available Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realised on a variety of physiological and health care use cases. The last fifteen years has seen the rise of very cheap digital storage both on and off site. With the advent of the 'Internet of Things' people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to 'in silico' modeling of organs and early stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realised. We conclude with a discussion on future possibilities that big data and further standardizations will enable.

  16. The role of markup for enabling interoperability in health informatics.

    Science.gov (United States)

    McKeever, Steve; Johnson, David

    2015-01-01

    Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realized on a variety of physiological and health care use cases. The last 15 years has seen the rise of very cheap digital storage both on and off site. With the advent of the Internet of Things people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to in silico modeling of organs and early stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realized. We conclude with a discussion on future possibilities that big data and further standardizations will enable.

  17. Trust estimation of the semantic web using semantic web clustering

    Science.gov (United States)

    Shirgahi, Hossein; Mohsenzadeh, Mehran; Haj Seyyed Javadi, Hamid

    2017-05-01

    The development of the semantic web and social networks is undeniable in today's Internet world. The widespread nature of the semantic web makes assessing trust in this field very challenging. In recent years, extensive research has been done to estimate the trust of the semantic web. Since trust of the semantic web is a multidimensional problem, in this paper we used parameters of social network authority, the authority value of page links and semantic authority to assess trust. Due to the large space of the semantic network, we restricted the problem scope to clusters of semantic subnetworks, obtained the trust of each cluster's elements locally, and calculated the trust of outside resources according to their local trusts and the trust of clusters in each other. According to the experimental results, the proposed method achieves an F-score of more than 79%, which is on average about 11.9% higher than the Eigen, Tidal and centralised trust methods. The mean error of the proposed method is 12.936, on average about 9.75% less than the Eigen and Tidal trust methods.
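The cluster-based scheme (local trust inside a cluster, composed with cluster-to-cluster trust for outside resources) can be sketched as follows; the scores and the composition rule are illustrative assumptions, not the paper's exact formulas:

```python
# Local trust computed within each cluster of the semantic subnetwork.
local_trust = {               # cluster -> {resource: local trust score}
    "C1": {"a": 0.9, "b": 0.6},
    "C2": {"c": 0.8},
}
# How much one cluster trusts another, as a whole.
cluster_trust = {("C1", "C2"): 0.7}

def trust(from_cluster, resource):
    """Trust in a resource: local score if the resource is co-located,
    otherwise its home cluster's local score scaled by the
    inter-cluster trust (0.0 if no path exists)."""
    if resource in local_trust[from_cluster]:
        return local_trust[from_cluster][resource]
    for (src, dst), w in cluster_trust.items():
        if src == from_cluster and resource in local_trust[dst]:
            return w * local_trust[dst][resource]
    return 0.0

print(trust("C1", "c"))   # 0.7 * 0.8, i.e. about 0.56
```

Restricting the expensive local computation to clusters is what keeps the approach tractable on a large semantic network: only the small cluster-to-cluster matrix spans the whole graph.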

  18. S3QL: A distributed domain specific language for controlled semantic integration of life sciences data

    Directory of Open Access Journals (Sweden)

    de Lencastre Hermínia

    2011-07-01

    Full Text Available Abstract Background The value and usefulness of data increases when it is explicitly interlinked with related data. This is the core principle of Linked Data. For life sciences researchers, harnessing the power of Linked Data to improve biological discovery is still challenged by a need to keep pace with rapidly evolving domains and requirements for collaboration and control as well as with the reference semantic web ontologies and standards. Knowledge organization systems (KOSs) can provide an abstraction for publishing biological discoveries as Linked Data without complicating transactions with contextual minutia such as provenance and access control. We have previously described the Simple Sloppy Semantic Database (S3DB) as an efficient model for creating knowledge organization systems using Linked Data best practices with explicit distinction between domain and instantiation and support for a permission control mechanism that automatically migrates between the two. In this report we present a domain specific language, the S3DB query language (S3QL), to operate on its underlying core model and facilitate management of Linked Data. Results Reflecting the data driven nature of our approach, S3QL has been implemented as an application programming interface for S3DB systems hosting biomedical data, and its syntax was subsequently generalized beyond the S3DB core model. This achievement is illustrated with the assembly of an S3QL query to manage entities from the Simple Knowledge Organization System. The illustrative use cases include gastrointestinal clinical trials, genomic characterization of cancer by The Cancer Genome Atlas (TCGA) and molecular epidemiology of infectious diseases. Conclusions S3QL was found to provide a convenient mechanism to represent context for interoperation between public and private datasets hosted at biomedical research institutions and linked data formalisms.

  19. An ontology roadmap for crowdsourcing innovation intermediaries

    OpenAIRE

    Silva, Cândida; Ramos, Isabel

    2014-01-01

    Ontologies have proliferated in recent years, essentially justified by the need to achieve a consensus among the multiple representations of reality inside computers, and therefore to accomplish interoperability between machines and systems. Ontologies provide an explicit conceptualization that describes the semantics of the data. Crowdsourcing innovation intermediaries are organizations that mediate the communication and relationship between companies that aspire to solv...

  20. Classification with an edge: Improving semantic image segmentation with boundary detection

    Science.gov (United States)

    Marmanis, D.; Schindler, K.; Wegner, J. D.; Galliani, S.; Datcu, M.; Stilla, U.

    2018-01-01

    We present an end-to-end trainable deep convolutional neural network (DCNN) for semantic segmentation with built-in awareness of semantically meaningful boundaries. Semantic segmentation is a fundamental remote sensing task, and most state-of-the-art methods rely on DCNNs as their workhorse. A major reason for their success is that deep networks learn to accumulate contextual information over very large receptive fields. However, this success comes at a cost, since the associated loss of effective spatial resolution washes out high-frequency details and leads to blurry object boundaries. Here, we propose to counter this effect by combining semantic segmentation with semantically informed edge detection, thus making class boundaries explicit in the model. First, we construct a comparatively simple, memory-efficient model by adding boundary detection to the SEGNET encoder-decoder architecture. Second, we also include boundary detection in FCN-type models and set up a high-end classifier ensemble. We show that boundary detection significantly improves semantic segmentation with CNNs in an end-to-end training scheme. Our best model achieves >90% overall accuracy on the ISPRS Vaihingen benchmark.
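The combination of a per-pixel classification loss with a boundary-detection loss can be sketched in a framework-free toy form; the weighting and toy data are assumptions for illustration, not the paper's DCNN training setup:

```python
import math

def cross_entropy(probs, labels):
    """Per-pixel classification loss: mean negative log-probability
    assigned to each pixel's true class."""
    return -sum(math.log(p[y]) for p, y in zip(probs, labels)) / len(labels)

def boundary_loss(edge_probs, edge_labels):
    """Binary boundary-detection loss on the same pixels."""
    return -sum(math.log(p if e else 1 - p)
                for p, e in zip(edge_probs, edge_labels)) / len(edge_labels)

def joint_loss(probs, labels, edge_probs, edge_labels, w_edge=0.5):
    """Joint objective: segmentation loss plus a weighted edge loss,
    so class boundaries are penalized explicitly and stay sharp."""
    return (cross_entropy(probs, labels)
            + w_edge * boundary_loss(edge_probs, edge_labels))

# Four pixels, two classes; pixel 2 sits on the class boundary.
probs = [[0.9, 0.1], [0.8, 0.2], [0.3, 0.7], [0.1, 0.9]]
labels = [0, 0, 1, 1]
edge_probs = [0.1, 0.2, 0.9, 0.1]
edge_labels = [0, 0, 1, 0]
loss = joint_loss(probs, labels, edge_probs, edge_labels)
```

In the actual networks both terms are computed densely over feature maps and backpropagated end-to-end; the sketch only shows how the edge term enters the shared objective.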

  1. A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World

    Science.gov (United States)

    Wright, D. J.; Sankaran, S.

    2015-12-01

    In the geosciences, interoperability is a fundamental requirement. Members of various standards organizations such as the OGC and ISO-TC 211 have done yeoman service to promote a standards-centric approach to managing the interoperability challenges that organizations face today. Organizations face many specific challenges when adopting interoperability patterns. One approach, that of mandating the use of specific standards, has been reasonably successful. But scientific communities, as with all others, ultimately want their solutions to be widely accepted and used, and to this end there is a crying need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow and deliberative process that sometimes takes a long time to come to fruition, and the results can therefore fall short of user expectations. It seems, therefore, that organizations are left with a series of perceived orthogonal requirements when they want to pursue interoperability: they want a robust but agile solution, a mature approach that also satisfies the latest technology trends, and so on. Sustainable interoperability patterns need to be forward-looking and should choose the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms built on fundamental "open" principles and that align with popular mainstream patterns. We seek to explore data-, metadata- and web-service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability, and appealing developer frameworks that can grow the audience. The path to tread is not new, and the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we all take for granted.

  2. Requirements for and barriers towards interoperable ehealth technology in primary care

    NARCIS (Netherlands)

    Oude Nijeweme-d'Hollosy, Wendeline; van Velsen, Lex Stefan; Huygens, Martine; Hermens, Hermanus J.

    Despite eHealth technology's rapid growth, eHealth applications are rarely embedded within primary care, mostly because systems lack interoperability. This article identifies requirements for, and barriers towards, interoperable eHealth technology from healthcare professionals' perspective -- the

  3. Risk Management Considerations for Interoperable Acquisition

    National Research Council Canada - National Science Library

    Meyers, B. C

    2006-01-01

    .... The state of risk management practice -- the specification of standards and the methodologies to implement them -- is addressed and examined with respect to the needs of system-of-systems interoperability...

  4. Connected Lighting System Interoperability Study Part 1: Application Programming Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Gaidon, Clement [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Poplawski, Michael [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-10-31

    This report, the first in a series of studies that focuses on interoperability as realized by the use of Application Programming Interfaces (APIs), explores the diversity of such interfaces in several connected lighting systems; characterizes the extent of interoperability that they provide; and illustrates challenges, limitations, and tradeoffs that were encountered during this exploration.

  5. DETAILED CLINICAL MODELS AND THEIR RELATION WITH ELECTRONIC HEALTH RECORDS.

    OpenAIRE

    Boscá Tomás, Diego

    2016-01-01

    [EN] The healthcare domain produces and consumes large quantities of people's health data. Although data exchange is the norm rather than the exception, being able to access all patient data is still far from being achieved. Current developments such as personal health records will introduce even more data and complexity to the Electronic Health Record (EHR). Achieving semantic interoperability is one of the biggest challenges to overcome in order to benefit from all the information contained in the ...

  6. Semantic Desktop

    Science.gov (United States)

    Sauermann, Leo; Kiesel, Malte; Schumacher, Kinga; Bernardi, Ansgar

    This contribution shows what the workplace of the future could look like and where the Semantic Web opens up new possibilities. To this end, approaches from the fields of Semantic Web, knowledge representation, desktop applications, and visualization are presented that allow us to reinterpret and reuse a user's existing data. The combination of Semantic Web and desktop computers brings particular advantages, a paradigm known as the Semantic Desktop. The described possibilities for application integration are not limited to the desktop, however, but can equally be used in web applications.

  7. Semantic web for robots : applying semantic web technologies for interoperability between virtual worlds and real robots

    NARCIS (Netherlands)

    Juarez Cordova, A.G.

    2012-01-01

    The topic of this PhD project is in the context of cross-reality, a term that defines mixed reality environments that tunnel dense real-world data acquired through the use of sensor/actuator device networks into virtual worlds. It is part of the ongoing academia and industry efforts to achieve

  8. Metadata behind the Interoperability of Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Miguel Angel Manso Callejo

    2009-05-01

    Full Text Available Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to introduce the notion of context as an explicit representation of changes of a WSN status inferred from metadata elements, which, in turn, leads towards a decision-making process about how to maintain dynamic interoperability. This paper describes the developed context model to represent and reason over different WSN status based on four types of contexts, which have been identified as sensing, node, network and organisational contexts. The reasoning has been addressed by developing contextualising and bridging rules. As a result, we were able to demonstrate how contextualising rules have been used to reason on changes of WSN status as a first step towards maintaining dynamic interoperability.
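The idea of "contextualising rules" that infer a context from metadata elements can be sketched in a few lines. This is a hedged illustration only: the metadata field names, thresholds, and context labels below are invented for the example; only the four context types (sensing, node, network, organisational) come from the abstract.

```python
# Minimal sketch of contextualising rules for a WSN: map raw metadata
# elements to inferred context labels. Field names and thresholds are
# illustrative assumptions, not taken from the paper's context model.

def contextualise(metadata):
    """Infer (context_type, status) pairs from a metadata dictionary."""
    rules = [
        # (predicate over metadata, inferred context)
        (lambda m: m.get("battery_level", 100) < 20, ("node", "low-power")),
        (lambda m: m.get("packet_loss", 0.0) > 0.3, ("network", "degraded")),
        (lambda m: m.get("sampling_rate_hz", 0) > 0, ("sensing", "active")),
    ]
    inferred = [ctx for pred, ctx in rules if pred(metadata)]
    # Fall back to a default organisational context when no rule fires.
    return inferred or [("organisational", "nominal")]

print(contextualise({"battery_level": 12, "sampling_rate_hz": 1}))
```

A decision-making layer for dynamic interoperability could then react to the inferred contexts (e.g., rerouting around a "degraded" network context).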

  9. INTEROPERABLE FRAMEWORK SOLUTION TO ICU HEALTH CARE MONITORING

    Directory of Open Access Journals (Sweden)

    Shola Usha Rani

    2015-03-01

    Full Text Available An interoperable telehealth system provides an independent healthcare solution for better management of health and wellness. It allows people to manage their heart disease, diabetes, etc. by sending health parameters like blood pressure, heart rate, glucose levels, temperature, weight, and respiration from a remote place to a health professional, and to get real-time feedback on their condition. Here, different medical devices are connected to the patient for monitoring. Each kind of device is manufactured by a different vendor, and each device's information and communication requires a different installation and network design. This causes design complexities and network overheads when moving patients for diagnostic examinations. This problem can be solved by interoperability among devices. ISO/IEEE 11073 is an international standard which provides an interoperable hospital information system solution for medical devices. One such integrated environment that requires the integration of medical devices is the ICU (Intensive Care Unit). This paper presents the issues for an ICU monitoring system and a framework solution for it.

  10. Semantic heterogeneity: comparing new semantic web approaches with those of digital libraries

    OpenAIRE

    Krause, Jürgen

    2008-01-01

    To demonstrate that newer developments in the semantic web community, particularly those based on ontologies (simple knowledge organization system and others) mitigate common arguments from the digital library (DL) community against participation in the Semantic web. The approach is a semantic web discussion focusing on the weak structure of the Web and the lack of consideration given to the semantic content during indexing. The points criticised by the semantic web and ontology approaches ar...

  11. Semantic framework for mapping object-oriented model to semantic web languages.

    Science.gov (United States)

    Ježek, Petr; Mouček, Roman

    2015-01-01

    The article deals with and discusses two main approaches to building semantic structures for electrophysiological metadata: the use of conventional data structures, repositories, and programming languages on one hand, and the use of formal representations of ontologies known from knowledge representation, such as description logics or semantic web languages, on the other hand. Although knowledge engineering offers languages supporting richer semantic means of expression and technologically advanced approaches, conventional data structures and repositories are still popular among developers, administrators and users because of their simplicity, overall intelligibility, and lower demands on technical equipment. The choice of conventional data resources and repositories, however, raises the question of how and where to add semantics that cannot be naturally expressed using them. As one possible solution, this semantics can be added into the structures of the programming language that accesses and processes the underlying data. To support this idea we introduced a software prototype that enables its users to add semantically richer expressions into Java object-oriented code. This approach does not burden users with additional demands on the programming environment, since reflective Java annotations were used as an entry point for these expressions. Moreover, the additional semantics need not be written by the programmer directly in the code but can be collected from non-programmers using a graphical user interface. The mapping that allows the transformation of the semantically enriched Java code into the Semantic Web language OWL was proposed and implemented in a library named the Semantic Framework. This approach was validated by the integration of the Semantic Framework into the EEG/ERP Portal and by the subsequent registration of the EEG/ERP Portal in the Neuroscience Information Framework.
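The core mechanism described here, attaching ontology concepts to application code via annotations and then mapping annotated objects to triples, can be sketched briefly. The paper's Semantic Framework uses Java reflective annotations; the analogous idea is shown below in Python, and all IRIs, class names, and the triple format are illustrative assumptions, not taken from the actual library.

```python
# Hedged sketch: attach an ontology IRI to a class via a decorator
# (standing in for a reflective Java annotation), then emit simple
# (subject, predicate, object) triples from annotated instances.
# All IRIs and names are illustrative placeholders.

def owl_class(iri):
    """Class decorator that records which ontology concept a class maps to."""
    def wrap(cls):
        cls.__owl_iri__ = iri
        return cls
    return wrap

@owl_class("http://example.org/onto#EEGExperiment")
class Experiment:
    def __init__(self, name):
        self.name = name

def to_triples(obj, subject):
    """Emit triples for an annotated instance: its type plus its attributes."""
    yield (subject, "rdf:type", obj.__owl_iri__)
    for attr, value in vars(obj).items():
        yield (subject, f"ex:{attr}", repr(value))

for triple in to_triples(Experiment("P300 study"), "_:exp1"):
    print(triple)
```

A real mapping to OWL would of course also handle object properties, datatypes, and class hierarchies; the point here is only that the semantics lives in declarative annotations rather than in the data store itself.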

  12. The semantic structure of gratitude

    Directory of Open Access Journals (Sweden)

    Smirnov, Alexander V.

    2016-06-01

    Full Text Available In the modern social and economic environment of Russia, gratitude might be considered an ambiguous phenomenon. It can have different meanings for a person in different contexts and can manifest itself differently as well (that is, as an expression of sincere feelings or as an element of corruption). In this respect it is topical to investigate the system of meanings and relationships that define the semantic space of gratitude. The goal of the study was the investigation and description of the content and structure of the semantic space of the gratitude phenomenon, as well as the determination of male, female, age, and ethnic peculiarities of the expression of gratitude. The objective was achieved by using a semantic differential designed by the authors to investigate attitudes toward gratitude. This investigation was carried out with the participation of 184 respondents (Russians, Tatars, Ukrainians, and Jews living in the Russian Federation, Belarus, Kazakhstan, Tajikistan, Israel, Australia, Canada, and the United Kingdom and identifying themselves as representatives of one of these nationalities). The structural components of gratitude were singled out by means of exploratory factor analysis of the empirical data from the designed semantic differential. Gender, age, and ethnic differences were differentiated by means of Student's t-test. Gratitude can be represented by material and nonmaterial forms as well as by actions in response to help given. The empirical data allowed us to design an ethnically nonspecified semantic structure of gratitude. During the elaboration of the differential, semantic universals of gratitude, which constitute its psychosemantic content, were distinguished. Peculiarities of attitudes toward gratitude among different age and gender groups were revealed. Differences in the degree of manifestation of components of the psychosemantic structure of gratitude related to ethnic characteristics were not discovered.

  13. 77 FR 19575 - Promoting Interoperability in the 700 MHz Commercial Spectrum; Interoperability of Mobile User...

    Science.gov (United States)

    2012-04-02

    ... efforts and/or through modifications to the Commission's technical rules or other regulatory measures. The... regulatory measures. \\1\\ The Commission has a longstanding interest in promoting the interoperability of... standards for Long-Term Evolution (LTE) wireless broadband technology are developed by the 3rd Generation...

  14. Programming the semantic web

    CERN Document Server

    Segaran, Toby; Taylor, Jamie

    2009-01-01

    With this book, the promise of the Semantic Web -- in which machines can find, share, and combine data on the Web -- is not just a technical possibility but a practical reality. Programming the Semantic Web demonstrates several ways to implement semantic web applications, using current and emerging standards and technologies. You'll learn how to incorporate existing data sources into semantically aware applications and publish rich semantic data. Each chapter walks you through a single piece of semantic technology and explains how you can use it to solve real problems. Whether you're writing

  15. Towards E-Society Policy Interoperability

    Science.gov (United States)

    Iannella, Renato

    The move towards the Policy-Oriented Web is destined to provide support for policy expression and management in the core web layers. One of the most promising areas that can drive this new technology adoption is e-Society communities. With so much user-generated content being shared by these social networks, there is the real danger that the implicit sharing rules that communities have developed over time will be lost in translation in the new digital communities. This will lead to a corresponding loss in confidence in e-Society sites. The Policy-Oriented Web attempts to turn the implicit into the explicit with a common framework for policy language interoperability and awareness. This paper reports on the policy driving factors from the Social Networks experiences using real-world use cases and scenarios. In particular, the key functions of policy-awareness - for privacy, rights, and identity - will be the driving force that enables the e-Society to appreciate new interoperable policy regimes.

  16. Interoperability of Electronic Health Records: A Physician-Driven Redesign.

    Science.gov (United States)

    Miller, Holly; Johns, Lucy

    2018-01-01

    PURPOSE: Electronic health records (EHRs), now used by hundreds of thousands of providers and encouraged by federal policy, have the potential to improve quality and decrease costs in health care. But interoperability, although technically feasible among different EHR systems, is the weak link in the chain of logic. Interoperability is inhibited by poor understanding, by suboptimal implementation, and at times by a disinclination to dilute market share or patient base on the part of vendors or providers, respectively. The intent of this project has been to develop a series of practicable recommendations that, if followed by EHR vendors and users, can promote and enhance interoperability, helping EHRs reach their potential. METHODOLOGY: A group of 11 physicians, one nurse, and one health policy consultant, practicing from California to Massachusetts, has developed a document titled "Feature and Function Recommendations To Optimize Clinician Usability of Direct Interoperability To Enhance Patient Care" that offers recommendations from the clinician point of view. This report introduces some of these recommendations and suggests their implications for policy and the "virtualization" of EHRs. CONCLUSION: Widespread adoption of even a few of these recommendations by designers and vendors would enable a major advance toward the "Triple Aim" of improving the patient experience, improving the health of populations, and reducing per capita costs.

  17. Semantic metrics

    OpenAIRE

    Hu, Bo; Kalfoglou, Yannis; Dupplaw, David; Alani, Harith; Lewis, Paul; Shadbolt, Nigel

    2006-01-01

    In the context of the Semantic Web, many ontology-related operations, e.g. ontology ranking, segmentation, alignment, articulation, reuse, evaluation, can be boiled down to one fundamental operation: computing the similarity and/or dissimilarity among ontological entities, and in some cases among ontologies themselves. In this paper, we review standard metrics for computing distance measures and we propose a series of semantic metrics. We give a formal account of semantic metrics drawn from a...

  18. A common type system for clinical natural language processing

    Directory of Open Access Journals (Sweden)

    Wu Stephen T

    2013-01-01

    Full Text Available Abstract Background One challenge in reusing clinical data stored in electronic medical records is that these data are heterogeneous. Clinical Natural Language Processing (NLP) plays an important role in transforming information in clinical text to a standard representation that is comparable and interoperable. Information may be processed and shared when a type system specifies the allowable data structures. Therefore, we aim to define a common type system for clinical NLP that enables interoperability between structured and unstructured data generated in different clinical settings. Results We describe a common type system for clinical NLP that has an end target of deep semantics based on Clinical Element Models (CEMs), thus interoperating with structured data and accommodating diverse NLP approaches. The type system has been implemented in UIMA (Unstructured Information Management Architecture) and is fully functional in a popular open-source clinical NLP system, cTAKES (clinical Text Analysis and Knowledge Extraction System), versions 2.0 and later. Conclusions We have created a type system that targets deep semantics, thereby allowing NLP systems to encapsulate knowledge from text and share it alongside heterogeneous clinical data sources. Rather than the surface semantics that are typically the end product of NLP algorithms, CEM-based semantics explicitly build in deep clinical semantics as the point of interoperability with more structured data types.
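The "common type system" idea, normalizing both structured records and NLP output to one shared model so downstream consumers interoperate, can be sketched as a single shared data type. This is a hedged illustration: the field names below only loosely echo Clinical Element Models, and the terminology code shown is a placeholder; this is not the actual cTAKES/UIMA type system.

```python
# Sketch of a shared type: an NLP pipeline and a structured-data importer
# both emit the same model, so consumers need only one code path.
# Field names and the RxNorm code are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class MedicationMention:
    text: str                 # surface form found in clinical text ("" if structured)
    code: str                 # normalized terminology code (e.g. an RxNorm CUI)
    begin: int                # character offsets into the source note (-1 if none)
    end: int
    attributes: dict = field(default_factory=dict)  # dose, route, frequency, ...

# From unstructured text (NLP output) and from a structured EHR record:
from_text = MedicationMention("aspirin 81 mg", "RxNorm:243670", 10, 23,
                              {"dose": "81 mg"})
from_ehr = MedicationMention("", "RxNorm:243670", -1, -1, {"dose": "81 mg"})

# Interoperability: both instances compare on the same normalized code.
print(from_text.code == from_ehr.code)
```

The real type system additionally carries UIMA feature structures and CEM constraints, but the interoperability point is the same: one schema, many producers.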

  19. Inquisitive semantics and pragmatics

    NARCIS (Netherlands)

    Groenendijk, J.; Roelofsen, F.; Larrazabal, J.M.; Zubeldia, L.

    2009-01-01

    This paper starts with an informal introduction to inquisitive semantics. After that, we present a formal definition of the semantics, and introduce the basic semantic notions of inquisitiveness and informativeness, in terms of which we define the semantic categories of questions, assertions, and

  20. Interoperable Archetypes With a Three Folded Terminology Governance.

    Science.gov (United States)

    Pederson, Rune; Ellingsen, Gunnar

    2015-01-01

    The use of openEHR archetypes increases the interoperability of clinical terminology, and in doing so improves the availability of clinical terminology for both primary and secondary purposes. Where clinical terminology is employed in the EPR system, research reports conflicting results on the use of structuring and standardization as measurements of success. In order to elucidate this concept, this paper focuses on the effort to establish a national repository for openEHR-based archetypes in Norway, where clinical terminology could be included with a three-folded benefit for interoperability.

  1. Interoperable eHealth Platform for Personalized Smart Services

    DEFF Research Database (Denmark)

    Mihaylov, Mihail Rumenov; Mihovska, Albena Dimitrova; Kyriazakos, Sofoklis

    2015-01-01

    personalized context-aware applications to serve the user's needs. This paper proposes the use of advised sensing, context-aware and cloud-based lifestyle reasoning to design an innovative eHealth platform that supports highly personalized smart services to primary users. The architecture of the platform has...... been designed in accordance with the interoperability requirements and standards as proposed by ITU-T and Continua Alliance. In particular, we define the interface dependencies and functional requirements needed, to allow eCare and eHealth vendors to manufacture interoperable sensors, ambient and home...

  2. Personal semantics: at the crossroads of semantic and episodic memory.

    Science.gov (United States)

    Renoult, Louis; Davidson, Patrick S R; Palombo, Daniela J; Moscovitch, Morris; Levine, Brian

    2012-11-01

    Declarative memory is usually described as consisting of two systems: semantic and episodic memory. Between these two poles, however, may lie a third entity: personal semantics (PS). PS concerns knowledge of one's past. Although typically assumed to be an aspect of semantic memory, it is essentially absent from existing models of knowledge. Furthermore, like episodic memory (EM), PS is idiosyncratically personal (i.e., not culturally-shared). We show that, depending on how it is operationalized, the neural correlates of PS can look more similar to semantic memory, more similar to EM, or dissimilar to both. We consider three different perspectives to better integrate PS into existing models of declarative memory and suggest experimental strategies for disentangling PS from semantic and episodic memory. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Employing Semantic Technologies for the Orchestration of Government Services

    Science.gov (United States)

    Sabol, Tomáš; Furdík, Karol; Mach, Marián

    The main aim of eGovernment is to provide efficient, secure, inclusive services for citizens and businesses. The necessity to integrate services and information resources, to increase accessibility, to reduce the administrative burden on citizens and enterprises -- these are only a few reasons why the eGovernment paradigm has shifted from the supply-driven approach toward connected governance, emphasizing the concept of interoperability (Archmann and Nielsen 2008). On the EU level, interoperability is explicitly addressed as one of the four main challenges, including in the i2010 strategy (i2010 2005). The Commission's Communication (Interoperability for Pan-European eGovernment Services 2006) strongly emphasizes the necessity of interoperable eGovernment services based on standards, open specifications, and open interfaces. The Pan-European interoperability initiatives, such as the European Interoperability Framework (2004) and IDABC, as well as many projects supported by the European Commission within the IST Program and the Competitiveness and Innovation Program (CIP), illustrate the importance of interoperability on the EU level.

  4. Ocean Data Interoperability Platform (ODIP): using regional data systems for global ocean research

    Science.gov (United States)

    Schaap, D.; Thijsse, P.; Glaves, H.

    2017-12-01

    Ocean acidification, loss of coral reefs, and sustainable exploitation of the marine environment are just a few of the challenges researchers around the world are currently attempting to understand and address. However, studies of these ecosystem-level challenges are impossible unless researchers can discover and re-use the large volumes of interoperable multidisciplinary data that are currently only accessible through regional and global data systems that serve discrete, and often discipline-specific, user communities. The plethora of marine data systems currently in existence are also using different standards, technologies and best practices, making re-use of the data problematic for those engaged in interdisciplinary marine research. The Ocean Data Interoperability Platform (ODIP) is responding to this growing demand for discoverable, accessible and reusable data by establishing the foundations for a common global framework for marine data management. But creation of such an infrastructure is a major undertaking, and one that needs to be achieved in part by establishing different levels of interoperability across existing regional and global marine e-infrastructures. Workshops organised by ODIP II facilitate dialogue between selected regional and global marine data systems in an effort to identify potential solutions that integrate these marine e-infrastructures. The outcomes of these discussions have formed the basis for a number of prototype development tasks that aim to demonstrate effective sharing of data across multiple data systems, and allow users to access data from more than one system through a single access point. The ODIP II project is currently developing four prototype solutions that are establishing interoperability between selected regional marine data management infrastructures in Europe, the USA, Canada and Australia, and with the global POGO, IODE Ocean Data Portal (ODP) and GEOSS systems.
The potential impact of implementing these solutions for

  5. SSWAP: A Simple Semantic Web Architecture and Protocol for semantic web services.

    Science.gov (United States)

    Gessler, Damian D G; Schiltz, Gary S; May, Greg D; Avraham, Shulamit; Town, Christopher D; Grant, David; Nelson, Rex T

    2009-09-23

    SSWAP (Simple Semantic Web Architecture and Protocol; pronounced "swap") is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous disparate data and services on the web. SSWAP was developed as a hybrid semantic web services technology to overcome limitations found in both pure web service technologies and pure semantic web technologies. There are currently over 2400 resources published in SSWAP. Approximately two dozen are custom-written services for QTL (Quantitative Trait Loci) and mapping data for legumes and grasses (grains). The remainder are wrappers to Nucleic Acids Research Database and Web Server entries. As an architecture, SSWAP establishes how clients (users of data, services, and ontologies), providers (suppliers of data, services, and ontologies), and discovery servers (semantic search engines) interact to allow for the description, querying, discovery, invocation, and response of semantic web services. As a protocol, SSWAP provides the vocabulary and semantics to allow clients, providers, and discovery servers to engage in semantic web services. The protocol is based on the W3C-sanctioned first-order description logic language OWL DL. As an open source platform, a discovery server running at http://sswap.info (as in to "swap info") uses the description logic reasoner Pellet to integrate semantic resources. The platform hosts an interactive guide to the protocol at http://sswap.info/protocol.jsp, developer tools at http://sswap.info/developer.jsp, and a portal to third-party ontologies at http://sswapmeet.sswap.info (a "swap meet"). SSWAP addresses the three basic requirements of a semantic web services architecture (i.e., a common syntax, shared semantic, and semantic discovery) while addressing three technology limitations common in distributed service systems: i.e., i) the fatal mutability of traditional interfaces, ii) the rigidity and fragility of static subsumption hierarchies, and iii) the

  6. Inter-operability

    International Nuclear Information System (INIS)

    Plaziat, J.F.; Moulin, P.; Van Beurden, R.; Ballet, E.

    2005-01-01

    Building an internal gas market implies establishing harmonized rules for cross-border trading between operators. To that effect, the European association EASEE-gas is developing standards and procedures, commonly called 'inter-operability'. Set up in 2002, the Association brings together all segments of the gas industry: producers, transporters, distributors, traders and shippers, suppliers, consumers and service providers. This workshop presents the latest status on issues such as barriers to gas trade in Europe, rules and procedures under preparation by EASEE-gas, and the implementation schedule of these rules by operators. This article gathers 5 presentations on this topic given at the gas conference.

  7. Academic Research Library as Broker in Addressing Interoperability Challenges for the Geosciences

    Science.gov (United States)

    Smith, P., II

    2015-12-01

    Data capture is an important process in the research lifecycle. Complete descriptive and representative information about the data or database is necessary during data collection, whether in the field or in the research lab. The National Science Foundation's (NSF) Public Access Plan (2015) mandates that federally funded projects make their research data more openly available. Developing, implementing, and integrating metadata workflows into the research process of the data lifecycle facilitates improved data access while also addressing interoperability challenges for the geosciences, such as data description and representation. Lack of metadata or data curation can contribute to (1) semantic, (2) ontology, and (3) data integration issues within and across disciplinary domains and projects. Some researchers of EarthCube-funded projects have identified these issues as gaps. These gaps can contribute to interoperability issues in data access, discovery, and integration between domain-specific and general data repositories. Academic research libraries have expertise in providing long-term discovery and access through the use of metadata standards and the provision of access to research data, datasets, and publications via institutional repositories. Metadata crosswalks, open archival information systems (OAIS), trusted repositories, the Data Seal of Approval, persistent URLs, and the linking of data, objects, resources, and publications in institutional repositories and digital content management systems are common components in the library discipline. These components contribute to a library perspective on data access and discovery that can benefit the geosciences. The USGS Community for Data Integration (CDI) has developed the Science Support Framework (SSF) for data management and integration within its community of practice for contribution to improved understanding of the Earth's physical and biological systems.
The USGS CDI SSF can be used as a reference model to map to Earth

  8. Examining the Relationship between Electronic Health Record Interoperability and Quality Management

    Science.gov (United States)

    Purcell, Bernice M.

    2013-01-01

    A lack of interoperability impairs data quality among health care providers' electronic health record (EHR) systems. The problem is whether the International Organization for Standardization (ISO) 9000 principles relate to the problem of interoperability in implementation of EHR systems. The purpose of the nonexperimental quantitative research…

  9. SemanticOrganizer: A Customizable Semantic Repository for Distributed NASA Project Teams

    Science.gov (United States)

    Keller, Richard M.; Berrios, Daniel C.; Carvalho, Robert E.; Hall, David R.; Rich, Stephen J.; Sturken, Ian B.; Swanson, Keith J.; Wolfe, Shawn R.

    2004-01-01

    SemanticOrganizer is a collaborative knowledge management system designed to support distributed NASA projects, including diverse teams of scientists, engineers, and accident investigators. The system provides a customizable, semantically structured information repository that stores work products relevant to multiple projects of differing types. SemanticOrganizer is one of the earliest and largest semantic web applications deployed at NASA to date, and has been used in diverse contexts ranging from the investigation of Space Shuttle Columbia's accident to the search for life on other planets. Although the underlying repository employs a single unified ontology, access control and ontology customization mechanisms make the repository contents appear different for each project team. This paper describes SemanticOrganizer, its customization facilities, and a sampling of its applications. The paper also summarizes some key lessons learned from building and fielding a successful semantic web application across a wide-ranging set of domains with diverse users.

  10. Varieties of semantic ‘access’ deficit in Wernicke’s aphasia and semantic aphasia

    Science.gov (United States)

    Robson, Holly; Lambon Ralph, Matthew A.; Jefferies, Elizabeth

    2015-01-01

    Comprehension deficits are common in stroke aphasia, including in cases with (i) semantic aphasia, characterized by poor executive control of semantic processing across verbal and non-verbal modalities; and (ii) Wernicke’s aphasia, associated with poor auditory–verbal comprehension and repetition, plus fluent speech with jargon. However, the varieties of these comprehension problems, and their underlying causes, are not well understood. Both patient groups exhibit some type of semantic ‘access’ deficit, as opposed to the ‘storage’ deficits observed in semantic dementia. Nevertheless, existing descriptions suggest that these patients might have different varieties of ‘access’ impairment—related to difficulty resolving competition (in semantic aphasia) versus initial activation of concepts from sensory inputs (in Wernicke’s aphasia). We used a case series design to compare patients with Wernicke’s aphasia and those with semantic aphasia on Warrington’s paradigmatic assessment of semantic ‘access’ deficits. In these verbal and non-verbal matching tasks, a small set of semantically-related items are repeatedly presented over several cycles so that the target on one trial becomes a distractor on another (building up interference and eliciting semantic ‘blocking’ effects). Patients with Wernicke’s aphasia and semantic aphasia were distinguished according to lesion location in the temporal cortex, but in each group, some individuals had additional prefrontal damage. Both of these aspects of lesion variability—one that mapped onto classical ‘syndromes’ and one that did not—predicted aspects of the semantic ‘access’ deficit. Both semantic aphasia and Wernicke’s aphasia cases showed multimodal semantic impairment, although as expected, the Wernicke’s aphasia group showed greater deficits on auditory-verbal than picture judgements. Distribution of damage in the temporal lobe was crucial for predicting the initially

  11. Integrating Semantic Information in Metadata Descriptions for a Geoscience-wide Resource Inventory.

    Science.gov (United States)

    Zaslavsky, I.; Richard, S. M.; Gupta, A.; Valentine, D.; Whitenack, T.; Ozyurt, I. B.; Grethe, J. S.; Schachne, A.

    2016-12-01

    Integrating semantic information into legacy metadata catalogs is a challenging issue and so far has been mostly done on a limited scale. We present experience of CINERGI (Community Inventory of Earthcube Resources for Geoscience Interoperability), an NSF Earthcube Building Block project, in creating a large cross-disciplinary catalog of geoscience information resources to enable cross-domain discovery. The project developed a pipeline for automatically augmenting resource metadata, in particular generating keywords that describe metadata documents harvested from multiple geoscience information repositories or contributed by geoscientists through various channels including surveys and domain resource inventories. The pipeline examines available metadata descriptions using text parsing, vocabulary management and semantic annotation and graph navigation services of GeoSciGraph. GeoSciGraph, in turn, relies on a large cross-domain ontology of geoscience terms, which bridges several independently developed ontologies or taxonomies including SWEET, ENVO, YAGO, GeoSciML, GCMD, SWO, and CHEBI. The ontology content enables automatic extraction of keywords reflecting science domains, equipment used, geospatial features, measured properties, methods, processes, etc. We specifically focus on issues of cross-domain geoscience ontology creation, resolving several types of semantic conflicts among component ontologies or vocabularies, and constructing and managing facets for improved data discovery and navigation. The ontology and keyword generation rules are iteratively improved as pipeline results are presented to data managers for selective manual curation via a CINERGI Annotator user interface. We present lessons learned from applying CINERGI metadata augmentation pipeline to a number of federal agency and academic data registries, in the context of several use cases that require data discovery and integration across multiple earth science data catalogs of varying quality

  12. Measures for interoperability of phenotypic data: minimum information requirements and formatting.

    Science.gov (United States)

    Ćwiek-Kupczyńska, Hanna; Altmann, Thomas; Arend, Daniel; Arnaud, Elizabeth; Chen, Dijun; Cornut, Guillaume; Fiorani, Fabio; Frohmberg, Wojciech; Junker, Astrid; Klukas, Christian; Lange, Matthias; Mazurek, Cezary; Nafissi, Anahita; Neveu, Pascal; van Oeveren, Jan; Pommier, Cyril; Poorter, Hendrik; Rocca-Serra, Philippe; Sansone, Susanna-Assunta; Scholz, Uwe; van Schriek, Marco; Seren, Ümit; Usadel, Björn; Weise, Stephan; Kersey, Paul; Krajewski, Paweł

    2016-01-01

    Plant phenotypic data shrouds a wealth of information which, when accurately analysed and linked to other data types, brings to light the knowledge about the mechanisms of life. As phenotyping is a field of research comprising manifold, diverse and time-consuming experiments, the findings can be fostered by reusing and combining existing datasets. Their correct interpretation, and thus replicability, comparability and interoperability, is possible provided that the collected observations are equipped with an adequate set of metadata. So far there have been no common standards governing phenotypic data description, which has hampered data exchange and reuse. In this paper we propose guidelines for proper handling of the information about plant phenotyping experiments, in terms of both the recommended content of the description and its formatting. We provide a document called "Minimum Information About a Plant Phenotyping Experiment", which specifies what information about each experiment should be given, and a Phenotyping Configuration for the ISA-Tab format, which allows this information to be practically organised within a dataset. We provide examples of ISA-Tab-formatted phenotypic data, and a general description of a few systems where the recommendations have been implemented. Acceptance of the rules described in this paper by the plant phenotyping community will help to achieve findable, accessible, interoperable and reusable data.
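
    As a rough illustration of the ISA-Tab idea, the following sketch arranges hypothetical phenotyping observations in a tab-delimited study table. The column names are illustrative only and should not be taken as the normative MIAPPE/ISA-Tab Phenotyping Configuration.

```python
import csv, io

# Hypothetical phenotyping observations in an ISA-Tab-style
# tab-delimited study table; column names are illustrative.
rows = [
    ["Source Name", "Characteristics[Organism]", "Factor Value[Watering]", "Sample Name"],
    ["plant_001", "Arabidopsis thaliana", "well-watered", "sample_001"],
    ["plant_002", "Arabidopsis thaliana", "drought", "sample_002"],
]

buf = io.StringIO()
csv.writer(buf, delimiter="\t", lineterminator="\n").writerows(rows)
study_tab = buf.getvalue()
print(study_tab)
```

    A real ISA-Tab dataset additionally pairs such study/assay tables with an investigation file describing the experiment as a whole.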

  13. A Guide to Understanding Emerging Interoperability Technologies

    National Research Council Canada - National Science Library

    Bollinger, Terry

    2000-01-01

    .... Over time, individual interoperability problems tend to disappear as the resources involved literally become part of one system through integration and standardization, but the overall problem...

  14. Evaluation of Interoperability Protocols in Repositories of Electronic Theses and Dissertations

    Science.gov (United States)

    Hakimjavadi, Hesamedin; Masrek, Mohamad Noorman

    2013-01-01

    Purpose: The purpose of this study is to evaluate the status of eight interoperability protocols within repositories of electronic theses and dissertations (ETDs) as an introduction to further studies on feasibility of deploying these protocols in upcoming areas of interoperability. Design/methodology/approach: Three surveys of 266 ETD…

  15. An Interoperability Framework and Capability Profiling for Manufacturing Software

    Science.gov (United States)

    Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.

    ISO/TC184/SC5/WG4 is working on ISO 16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology which were proposed and developed through this international standardization activity. Within the context of manufacturing application, a manufacturing software unit is considered to be capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.
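
    The capability-profile matching described above can be sketched as a subset test between the functions a software unit offers and those an application requires. The class and function names below are assumptions for illustration, not the normative ISO 16100 schema.

```python
from dataclasses import dataclass, field

# Illustrative sketch of capability profiling: a software unit
# advertises the manufacturing functions it supports, and a
# requirement profile is matched by subset comparison.
@dataclass
class CapabilityProfile:
    unit_name: str
    functions: set = field(default_factory=set)

def satisfies(unit: CapabilityProfile, required: set) -> bool:
    """A unit satisfies a requirement profile if it offers every required function."""
    return required <= unit.functions

scheduler = CapabilityProfile("SchedulerA", {"job_sequencing", "resource_allocation"})
print(satisfies(scheduler, {"job_sequencing"}))        # True
print(satisfies(scheduler, {"quality_inspection"}))    # False
```

    Real profiles also carry attributes and versioning information, so matching in practice is richer than a plain subset check.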

  16. Getting connected: Both associative and semantic links structure semantic memory for newly learned persons.

    Science.gov (United States)

    Wiese, Holger; Schweinberger, Stefan R

    2015-01-01

    The present study examined whether semantic memory for newly learned people is structured by visual co-occurrence, shared semantics, or both. Participants were trained with pairs of simultaneously presented (i.e., co-occurring) preexperimentally unfamiliar faces, which either did or did not share additionally provided semantic information (occupation, place of living, etc.). Semantic information could also be shared between faces that did not co-occur. A subsequent priming experiment revealed faster responses for both co-occurrence/no shared semantics and no co-occurrence/shared semantics conditions, than for an unrelated condition. Strikingly, priming was strongest in the co-occurrence/shared semantics condition, suggesting additive effects of these factors. Additional analysis of event-related brain potentials yielded priming in the N400 component only for combined effects of visual co-occurrence and shared semantics, with more positive amplitudes in this than in the unrelated condition. Overall, these findings suggest that both semantic relatedness and visual co-occurrence are important when novel information is integrated into person-related semantic memory.

  17. Word-embeddings Italian semantic spaces: A semantic model for psycholinguistic research

    Directory of Open Access Journals (Sweden)

    Marelli Marco

    2017-01-01

    Distributional semantics has long been a source of successful models in psycholinguistics, permitting semantic estimates for a large number of words to be obtained automatically and quickly. However, resources in this respect remain scarce or of limited accessibility for languages other than English. The present paper describes WEISS (Word-Embeddings Italian Semantic Space), a distributional semantic model based on Italian. WEISS includes models of semantic representations that are trained adopting state-of-the-art word-embedding methods, applying neural networks to induce distributed representations for lexical meanings. The resource is evaluated against two test sets, demonstrating that WEISS obtains a better performance with respect to a baseline encoding word associations. Moreover, an extensive qualitative analysis of the WEISS output provides examples of the model's potential in capturing several semantic phenomena. Two variants of WEISS are released and made easily accessible via the web through the SNAUT graphic interface.
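
    A minimal sketch of how such a distributional model yields semantic estimates: cosine similarity between word vectors. The three-dimensional vectors below are invented toy values; real WEISS embeddings have hundreds of dimensions and are learned from corpora.

```python
import math

# Toy distributional-semantics sketch: cosine similarity between
# word-embedding vectors. Vectors are invented for illustration.
def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

vectors = {
    "cane":  [0.9, 0.1, 0.0],   # "dog"
    "gatto": [0.8, 0.2, 0.1],   # "cat"
    "pane":  [0.0, 0.1, 0.9],   # "bread"
}

print(cosine(vectors["cane"], vectors["gatto"]))  # high: related animals
print(cosine(vectors["cane"], vectors["pane"]))   # low: unrelated
```

    Similarity scores like these are what a resource such as WEISS exposes as semantic estimates for psycholinguistic stimuli selection.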

  18. Development of a Ground Water Data Portal for Interoperable Data Exchange within the U.S. National Ground Water Monitoring Network and Beyond

    Science.gov (United States)

    Booth, N. L.; Brodaric, B.; Lucido, J. M.; Kuo, I.; Boisvert, E.; Cunningham, W. L.

    2011-12-01

    using the OGC Sensor Observation Service (SOS) standard. Ground Water Markup Language (GWML) encodes well log, lithology and construction information and is exchanged using the OGC Web Feature Service (WFS) standard. Within the NGWMN Data Portal, data exchange between distributed data provider repositories is achieved through the use of these web services and a central mediation hub, which performs both format (syntactic) and nomenclature (semantic) mediation, conforming heterogeneous inputs into common standards-based outputs. Through these common standards, interoperability between the U.S. NGWMN and Canada's Groundwater Information Network (GIN) is achieved, advancing a ground water virtual observatory across North America.
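
    The mediation-hub idea can be sketched as conforming each provider record to a common output through nomenclature (semantic) mapping and unit/format (syntactic) conversion. The vocabulary table and field names below are illustrative assumptions, not the actual NGWMN mediation rules.

```python
# Minimal sketch of central mediation: heterogeneous provider records
# are conformed to a common output. Vocabulary and field names are invented.
LITHOLOGY_SYNONYMS = {
    "sand & gravel": "sand and gravel",
    "ss": "sandstone",
    "lmst": "limestone",
}

def mediate(provider_record: dict) -> dict:
    lith = provider_record.get("lith", "").strip().lower()
    return {
        "lithology": LITHOLOGY_SYNONYMS.get(lith, lith),            # semantic mediation
        "depth_m": round(provider_record["depth_ft"] * 0.3048, 2),  # syntactic/unit mediation
    }

print(mediate({"lith": "SS", "depth_ft": 120.0}))
```

    In the real portal this conformance step produces standards-based outputs such as WaterML and GWML rather than plain dictionaries.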

  19. Secure and interoperable communication infrastructures for PPDR organisations

    Science.gov (United States)

    Müller, Wilmuth; Marques, Hugo; Pereira, Luis; Rodriguez, Jonathan; Brouwer, Frank; Bouwers, Bert; Politis, Ilias; Lykourgiotis, Asimakis; Ladas, Alexandros; Adigun, Olayinka; Jelenc, David

    2016-05-01

    The growing number of events affecting public safety and security (PS&S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on agencies and organisations responsible for PS&S. In order to respond timely and adequately to such events, Public Protection and Disaster Relief (PPDR) organisations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies such as TETRA, TETRAPOL or P25 do not currently provide broadband capability, nor are such technologies expected to be upgraded in the future. This presents a major limitation in supporting new services and information flows. Furthermore, there is no known standard that addresses interoperability of these technologies. In this contribution, the design of a next-generation communication infrastructure for PPDR organisations is presented which fulfils the requirements of secure and seamless end-to-end communication and interoperable information exchange within the deployed communication networks. Based on the Enterprise Architecture of PPDR organisations, a next-generation PPDR network that is backward compatible with legacy communication technologies is designed and implemented, capable of providing security, privacy, seamless mobility, QoS and reliability support for mission-critical Private Mobile Radio (PMR) voice and broadband data services. The designed solution provides a robust, reliable, and secure mobile broadband communications system for a wide variety of PMR applications and services on PPDR broadband networks, including the ability of inter-system, inter-agency and cross-border operations, with emphasis on interoperability between users in PMR and LTE.

  20. Semantics of Kinship Terms in Tamil from the Semantic Typology Point of View

    Directory of Open Access Journals (Sweden)

    Анна Александровна Смирнитская

    2016-12-01

    In this article the author examines the lexical-semantic group “kinship terms” in Tamil, applying the attainments of modern semantic typology and the theory of semantic derivation. The kinship terms describing the nuclear and extended family are explored. A “semantic shift” relation between two different meanings is established if such a relation is realized by synchronous polysemy in one lexeme, semantic derivation, diachronic semantic change, cognates or some other means. The starting point of the study is the typological data from the DatSemShift catalogue of semantic shifts in the languages of the world, developed by a group of researchers under the guidance of Anna A. Zalizniak at the Institute of Linguistics, RAS. We verify the presence of the semantic shifts described in the database in Tamil. Also, we propose new semantic shifts specific only to this language. We confirm the presence of semantic relations of the studied type among the meanings with English labels: father - parents, girl - daughter, to deliver (a child - parents, - child, old woman - wife, owner - wife and others. The data also allow the assumption that the same relation exists between the meanings: old - grandfather, earth - mother, son - courage, unripe - son and others. The meanings of this field are the sources of semantic movements to abstract notions, the lexicon of possession, forms of address and others; in addition, many inner semantic relations inside this field are revealed. The meanings covering the nuclear part of the kinship system participate in universal semantic shifts described in the DatSemShift catalogue, while the meanings from collateral branches of this bifurcative kinship system (uncle, aunt) turn out to be incomparable with kinship terms from Indo-European lineal systems. Their meanings can be included in the DatSemShift catalogue only with an indication of system specifics. The information about semantic shifts can be useful for

  1. Semantic Role Labeling

    CERN Document Server

    Palmer, Martha; Xue, Nianwen

    2011-01-01

    This book is aimed at providing an overview of several aspects of semantic role labeling. Chapter 1 begins with linguistic background on the definition of semantic roles and the controversies surrounding them. Chapter 2 describes how the theories have led to structured lexicons such as FrameNet, VerbNet and the PropBank Frame Files that in turn provide the basis for large scale semantic annotation of corpora. This data has facilitated the development of automatic semantic role labeling systems based on supervised machine learning techniques. Chapter 3 presents the general principles of applyin

  2. Representations for Semantic Learning Webs: Semantic Web Technology in Learning Support

    Science.gov (United States)

    Dzbor, M.; Stutt, A.; Motta, E.; Collins, T.

    2007-01-01

    Recent work on applying semantic technologies to learning has concentrated on providing novel means of accessing and making use of learning objects. However, this is unnecessarily limiting: semantic technologies will make it possible to develop a range of educational Semantic Web services, such as interpretation, structure-visualization, support…

  3. Semantic similarity-based alignment between clinical archetypes and SNOMED CT: an application to observations.

    Science.gov (United States)

    Meizoso García, María; Iglesias Allones, José Luis; Martínez Hernández, Diego; Taboada Iglesias, María Jesús

    2012-08-01

    One of the main challenges of eHealth is semantic interoperability of health systems. But this will only be possible if the capture, representation and access of patient data is standardized. Clinical data models, such as OpenEHR Archetypes, define data structures that are agreed by experts to ensure the accuracy of health information. In addition, they provide an option to normalize clinical data by binding terms used in the model definition to standard medical vocabularies. Nevertheless, the effort needed to establish the association between archetype terms and standard terminology concepts is considerable. Therefore, the purpose of this study is to provide an automated approach to bind OpenEHR archetype terms to the external terminology SNOMED CT, with the capability to do so at a semantic level. This research uses lexical techniques and external terminological tools in combination with context-based techniques, which use information about structural and semantic proximity to identify similarities between terms and thus find alignments between them. The proposed approach exploits both the structural context of archetypes and the terminology context, in which concepts are logically defined through their relationships (hierarchical and definitional) to other concepts. A set of 25 OBSERVATION archetypes with 477 bound terms was used to test the method. Of these, 342 terms (74.6%) were linked with 96.1% precision, 71.7% recall and 1.23 SNOMED CT concepts on average for each mapping. It has been detected that about one third of the archetype clinical information is grouped logically. Context-based techniques take advantage of this to increase the recall and to validate 30.4% of the bindings produced by lexical techniques. This research shows that it is possible to automatically map archetype terms to a standard terminology with high precision and recall, with the help of appropriate contextual and semantic information from both models. Moreover, the
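
    The lexical step of such archetype-to-terminology binding can be sketched as ranking candidate concept labels by string similarity to an archetype term. The concept labels and the 0.8 threshold below are illustrative assumptions; the method described above additionally exploits structural and semantic context, which a purely lexical sketch omits.

```python
from difflib import SequenceMatcher

# Hedged sketch of the lexical matching step: rank candidate terminology
# concepts by string similarity to an archetype term. Labels and the
# threshold are illustrative.
def best_matches(archetype_term, concepts, threshold=0.8):
    scored = [
        (label, SequenceMatcher(None, archetype_term.lower(), label.lower()).ratio())
        for label in concepts
    ]
    return sorted([(l, round(s, 2)) for l, s in scored if s >= threshold],
                  key=lambda p: -p[1])

candidates = ["Body temperature", "Body weight", "Temperature of vehicle"]
print(best_matches("body temperature", candidates))
```

    Context-based validation would then check whether a candidate's position in the terminology hierarchy is consistent with the archetype's structure.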

  4. Intercloud Architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.J.; Laat, C. de

    2012-01-01

    This paper presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses problems in multi-provider multi-domain heterogeneous cloud based infrastructure services and applications integration and interoperability. The paper refers to existing standards in Cloud

  6. An E-government Interoperability Platform Supporting Personal Data Protection Regulations

    OpenAIRE

    González, Laura; Echevarría, Andrés; Morales, Dahiana; Ruggia, Raúl

    2016-01-01

    Public agencies are increasingly required to collaborate with each other in order to provide high-quality e-government services. This collaboration is usually based on the service-oriented approach and supported by interoperability platforms. Such platforms are specialized middleware-based infrastructures enabling the provision, discovery and invocation of interoperable software services. In turn, given that personal data handled by governments are often very sensitive, most governments have ...

  7. Interoperation of World-Wide Production e-Science Infrastructures

    CERN Document Server

    Riedel, M; Soddemann, T; Field, L; Navarro, JP; Casey, J; Litmaath, M; Baud, J; Koblitz, B; Catlett, C; Skow, D; Wang, S; Saeki, Y; Sato, H; Matsuoka, S; Geddes, N

    Many production Grid and e-Science infrastructures have begun to offer services to end-users during the past several years with an increasing number of scientific applications that require access to a wide variety of resources and services in multiple Grids. Therefore, the Grid Interoperation Now—Community Group of the Open Grid Forum—organizes and manages interoperation efforts among those production Grid infrastructures to reach the goal of a world-wide Grid vision on a technical level in the near future. This contribution highlights fundamental approaches of the group and discusses open standards in the context of production e-Science infrastructures.

  8. WebGIS based on semantic grid model and web services

    Science.gov (United States)

    Zhang, WangFei; Yue, CaiRong; Gao, JianGuo

    2009-10-01

    As the meeting point of network technology and GIS technology, WebGIS has developed rapidly in recent years. Under the constraints of the Web and the characteristics of GIS, traditional WebGIS has some prominent development problems: for example, it cannot achieve interoperability of heterogeneous spatial databases, nor cross-platform data access. With the appearance of Web Services and Grid technology, the field of WebGIS has changed greatly. A Web Service provides an interface which gives sites the ability to share data and intercommunicate. The goal of Grid technology is to turn the Internet into one large supercomputer with which computing resources, storage resources, data resources, information resources, knowledge resources and expert resources can be efficiently shared. But for WebGIS, this only achieves the physical connection of data and information, which is far from enough. Because of different understandings of the world, different professional regulations, different policies and different habits, experts in different fields reach different conclusions when they observe the same geographic phenomenon, and semantic heterogeneity arises; hence, there are large differences in the same concept across fields. If we use WebGIS without considering this semantic heterogeneity, we will answer users' questions wrongly or not at all. To solve this problem, this paper puts forward and evaluates an effective method of combining semantic grid and Web Services technology to develop WebGIS. In this paper, we studied methods to construct an ontology and to combine Grid technology with Web Services, and, with a detailed analysis of computing characteristics and application models in data distribution, we designed the WebGIS query system driven by
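
    The role of an ontology in bridging semantic heterogeneity can be sketched as synonym expansion applied to a user's query term before it is dispatched to heterogeneous data sources. The concept table below is a toy assumption, not an actual geoscience ontology.

```python
# Toy sketch of ontology-backed query expansion for a WebGIS query
# system: a term is expanded to all synonyms of its concept so that
# sources using different vocabularies can still be matched.
ONTOLOGY = {
    "river": {"river", "stream", "watercourse"},
    "lake": {"lake", "reservoir"},
}

def expand_query(term):
    """Return every synonym of the concept the term belongs to,
    or the term itself if it is unknown to the ontology."""
    for concept, synonyms in ONTOLOGY.items():
        if term.lower() in synonyms:
            return synonyms
    return {term.lower()}

print(sorted(expand_query("Stream")))
```

    A full semantic-grid implementation would draw these synonym sets from a formally defined ontology rather than a hand-written dictionary.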

  9. UML 2 Semantics and Applications

    CERN Document Server

    Lano, Kevin

    2009-01-01

    A coherent and integrated account of the leading UML 2 semantics work and the practical applications of UML semantics development With contributions from leading experts in the field, the book begins with an introduction to UML and goes on to offer in-depth and up-to-date coverage of: The role of semantics Considerations and rationale for a UML system model Definition of the UML system model UML descriptive semantics Axiomatic semantics of UML class diagrams The object constraint language Axiomatic semantics of state machines A coalgebraic semantic framework for reasoning about interaction des

  10. Epimenides: Interoperability Reasoning for Digital Preservation

    NARCIS (Netherlands)

    Kargakis, Yannis; Tzitzikas, Yannis; van Horik, M.P.M.

    2014-01-01

    This paper presents Epimenides, a system that implements a novel interoperability dependency reasoning approach for assisting digital preservation activities. A distinctive feature is that it can model also converters and emulators, and the adopted modelling approach enables the automatic reasoning

  11. Semantic memory in object use.

    Science.gov (United States)

    Silveri, Maria Caterina; Ciccarelli, Nicoletta

    2009-10-01

    We studied five patients with semantic memory disorders, four with semantic dementia and one with herpes simplex virus encephalitis, to investigate the involvement of semantic conceptual knowledge in object use. Comparisons between patients who had semantic deficits of different severity, as well as the follow-up, showed that the ability to use objects was largely preserved when the deficit was mild but progressively decayed as the deficit became more severe. Naming was generally more impaired than object use. Production tasks (pantomime execution and actual object use) and comprehension tasks (pantomime recognition and action recognition) as well as functional knowledge about objects were impaired when the semantic deficit was severe. Semantic and unrelated errors were produced during object use, but actions were always fluent and patients performed normally on a novel tools task in which the semantic demand was minimal. Patients with severe semantic deficits scored borderline on ideational apraxia tasks. Our data indicate that functional semantic knowledge is crucial for using objects in a conventional way and suggest that non-semantic factors, mainly non-declarative components of memory, might compensate to some extent for semantic disorders and guarantee some residual ability to use very common objects independently of semantic knowledge.

  12. Intellectual Capital and Predefined Headings in Swedish Health Care Sector

    Directory of Open Access Journals (Sweden)

    Terner Annika

    2017-01-01

    The heavily decentralized Swedish health care sector is facing massive challenges, e.g. to even out differences in health care performance. Intellectual capital can partly be used to explain these differences. In the research field it is difficult to find contributions regarding the study of intellectual capital management in the health care sector, and there is also a lack of studies on semantic interoperability. It is semantic interoperability that allows the right information to be available to the right people at the right time, across products and organizations. Structured and standardized headings can be a tool to enable semantic interoperability. The aim of this article is to argue for predefined headings as intellectual capital and as the base for a nationally shared and standardized terminology in the health care sector. The study shows that there is a lack of national management of predefined headings deployed in both electronic health records and national quality registries. This lack causes multiple documentation, which is time-consuming and impacts health professionals’ workloads, data quality and, partly, the performance of health care. We argue that predefined headings can be a base for semantic interoperability and that there is a need for the management of predefined headings on a national level.

  13. Forcing Interoperability: An Intentionally Fractured Approach

    Science.gov (United States)

    Gallaher, D. W.; Brodzik, M.; Scambos, T.; Stroeve, J.

    2008-12-01

    The NSIDC is attempting to rebuild a significant portion of its public-facing cyberinfrastructure to better meet the needs expressed by the cryospheric community. The project initially addresses a specific science need - understanding Greenland's contribution to global sea level rise through comparison and analysis of variables such as temperature, albedo, melt, ice velocity and surface elevation. This project will ultimately be expanded to cover most of NSIDC's cryospheric data. Like many organizations, we need to provide users with data discovery interfaces, collaboration tools and mapping services. Complicating this effort is the need to reduce the volume of raw data delivered to the user. Data growth, especially with time-series data, will overwhelm our software, processors and network like never before. We need to provide the users the ability to perform first-level analysis directly on our site. In order to accomplish this, the users should be free to modify the behavior of these tools as well as incorporate their own tools and analysis to meet their needs. Rather than building one monolithic project to build this system, we have chosen to build three semi-independent systems. One team is building a data discovery and web-based distribution system, the second is building an advanced analysis and workflow system and the third is building a customized web mapping service. These systems will use the same underlying data structures and services but will employ different technologies and teams to build their objectives, schedules and user interfaces. Obviously, we are adding complexity and risk to the overall project; however, this may be the best method to achieve interoperability, because the development teams will be required to build off each other's work. The teams will be forced to design with other users in mind, as opposed to building interoperability as an afterthought, which is a tendency in monolithic systems. All three teams will take advantage of preexisting

  14. The structure of semantic person memory: evidence from semantic priming in person recognition.

    Science.gov (United States)

    Wiese, Holger

    2011-11-01

    This paper reviews research on the structure of semantic person memory as examined with semantic priming. In this experimental paradigm, a familiarity decision on a target face or written name is usually faster when it is preceded by a related as compared to an unrelated prime. This effect has been shown to be relatively short lived and susceptible to interfering items. Moreover, semantic priming can cross stimulus domains, such that a written name can prime a target face and vice versa. However, it remains controversial whether representations of people are stored in associative networks based on co-occurrence, or in more abstract semantic categories. In line with prominent cognitive models of face recognition, which explain semantic priming by shared semantic information between prime and target, recent research demonstrated that priming could be obtained from purely categorically related, non-associated prime/target pairs. Although strategic processes, such as expectancy and retrospective matching likely contribute, there is also evidence for a non-strategic contribution to priming, presumably related to spreading activation. Finally, a semantic priming effect has been demonstrated in the N400 event-related potential (ERP) component, which may reflect facilitated access to semantic information. It is concluded that categorical relatedness is one organizing principle of semantic person memory. ©2011 The British Psychological Society.

  15. Meinongian Semantics and Artificial Intelligence

    Directory of Open Access Journals (Sweden)

    William J. Rapaport

    2013-12-01

    Full Text Available This essay describes computational semantic networks for a philosophical audience and surveys several approaches to semantic-network semantics. In particular, propositional semantic networks (exemplified by SNePS) are discussed; it is argued that only a fully intensional, Meinongian semantics is appropriate for them; and several Meinongian systems are presented.

  16. Geospatial Semantics and the Semantic Web

    CERN Document Server

    Ashish, Naveen

    2011-01-01

    Geographic and geospatial information and services, especially on the open Web, have become abundant in the last several years with the proliferation of online maps, geo-coding services, geospatial Web services and geospatially enabled applications. The need for geospatial reasoning has significantly increased in many everyday applications including personal digital assistants, Web search applications, location-aware mobile services, specialized systems for emergency response, medical triaging, intelligence analysis and more. Geospatial Semantics and the Semantic Web: Foundation

  17. A Semantic Web management model for integrative biomedical informatics.

    Directory of Open Access Journals (Sweden)

    Helena F Deus

    2008-08-01

    Full Text Available Data, data everywhere. The diversity and magnitude of the data generated in the Life Sciences defies automated articulation among complementary efforts. The additional need in this field for managing property and access permissions compounds the difficulty very significantly. This is particularly the case when the integration involves multiple domains and disciplines, even more so when it includes clinical and high throughput molecular data. The emergence of Semantic Web technologies brings the promise of meaningful interoperation between data and analysis resources. In this report we identify a core model for biomedical Knowledge Engineering applications and demonstrate how this new technology can be used to weave a management model where multiple intertwined data structures can be hosted and managed by multiple authorities in a distributed management infrastructure. Specifically, the demonstration is performed by linking data sources associated with the Lung Cancer SPORE awarded to The University of Texas MD Anderson Cancer Center in Houston and the Southwestern Medical Center at Dallas. A software prototype, available with open source at www.s3db.org, was developed and its proposed design has been made publicly available as an open source instrument for shared, distributed data management. The Semantic Web technologies have the potential to address the need for distributed and evolvable representations that are critical for systems biology and translational biomedical research. As this technology is incorporated into application development we can expect that both general purpose productivity software and domain specific software installed on our personal computers will become increasingly integrated with the relevant remote resources. In this scenario, the acquisition of a new dataset should automatically trigger the delegation of its analysis.

  18. Semantic Web based Self-management for a Pervasive Service Middleware

    DEFF Research Database (Denmark)

    Zhang, Weishan; Hansen, Klaus Marius

    2008-01-01

    Self-management is one of the challenges for realizing ambient intelligence in pervasive computing. In this paper, we propose and present a semantic Web based self-management approach for a pervasive service middleware where dynamic context information is encoded in a set of self-management context ontologies. The proposed approach is justified by the characteristics of pervasive computing and by the open world assumption and reasoning potentials of the semantic Web and its rule language. To enable real-time self-management, application level and network level state reporting is employed in our approach. State changes trigger execution of self-management rules for adaptation, monitoring, diagnosis, and so on. Evaluations of self-diagnosis in terms of extensibility, performance, and scalability show that the semantic Web based self-management approach is effective to achieve the self-diagnosis goals...
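The rule-triggered adaptation loop described in this record can be illustrated with a minimal sketch, assuming a simple context store of reported states; the rule, device, and diagnosis names below are illustrative, not taken from the middleware:

```python
# Hypothetical sketch: state reports update a context store, and each
# change triggers evaluation of self-management rules.

class SelfManager:
    def __init__(self):
        self.context = {}   # (device, property) -> value
        self.rules = []     # (predicate over context, action) pairs

    def add_rule(self, predicate, action):
        self.rules.append((predicate, action))

    def report_state(self, device, prop, value):
        """State reporting: a change triggers rule evaluation."""
        self.context[(device, prop)] = value
        for predicate, action in self.rules:
            if predicate(self.context):
                action(self.context)

# Example rule: diagnose a node as unreachable when heartbeats stop.
alerts = []
manager = SelfManager()
manager.add_rule(
    lambda ctx: ctx.get(("node1", "heartbeat")) == "missed",
    lambda ctx: alerts.append("diagnose: node1 unreachable"),
)
manager.report_state("node1", "heartbeat", "missed")
print(alerts)  # ['diagnose: node1 unreachable']
```

A real implementation would express the rules in a semantic Web rule language over the context ontologies rather than as Python lambdas.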

  19. ROMIE, une approche d'alignement d'ontologies à base d'instances

    OpenAIRE

    Elbyed , Abdeltif

    2009-01-01

    System interoperability is an important issue, widely recognized in information technology intensive organizations and in the research community of information systems. The wide adoption of the World Wide Web to access and distribute information further stresses the need for system interoperability. Initiatives such as the Semantic Web facilitate the localization and integration of data in a more intelligent way through the use of ontologies. The Semantic Web offers a compelling vis...

  20. Verbal and non-verbal semantic impairment: From fluent primary progressive aphasia to semantic dementia

    Directory of Open Access Journals (Sweden)

    Mirna Lie Hosogi Senaha

    Full Text Available Abstract Selective disturbances of semantic memory have attracted the interest of many investigators and the question of the existence of single or multiple semantic systems remains a very controversial theme in the literature. Objectives: To discuss the question of multiple semantic systems based on a longitudinal study of a patient who presented semantic dementia from fluent primary progressive aphasia. Methods: A 66 year-old woman with selective impairment of semantic memory was examined on two occasions, undergoing neuropsychological and language evaluations, the results of which were compared to those of three paired control individuals. Results: In the first evaluation, physical examination was normal and the score on the Mini-Mental State Examination was 26. Language evaluation revealed fluent speech, anomia, disturbance in word comprehension, preservation of the syntactic and phonological aspects of the language, besides surface dyslexia and dysgraphia. Autobiographical and episodic memories were relatively preserved. In semantic memory tests, the following dissociation was found: disturbance of verbal semantic memory with preservation of non-verbal semantic memory. Magnetic resonance of the brain revealed marked atrophy of the left anterior temporal lobe. After 14 months, the difficulties in verbal semantic memory had become more severe and the semantic disturbance, limited initially to the linguistic sphere, had worsened to involve non-verbal domains. Conclusions: Given the dissociation found in the first examination, we believe there is sufficient clinical evidence to refute the existence of a unitary semantic system.

  1. Montague semantics

    NARCIS (Netherlands)

    Janssen, T.M.V.

    2012-01-01

    Montague semantics is a theory of natural language semantics and of its relation with syntax. It was originally developed by the logician Richard Montague (1930-1971) and subsequently modified and extended by linguists, philosophers, and logicians. The most important features of the theory are its

  2. Using a logical information model-driven design process in healthcare.

    Science.gov (United States)

    Cheong, Yu Chye; Bird, Linda; Tun, Nwe Ni; Brooks, Colleen

    2011-01-01

    A hybrid standards-based approach has been adopted in Singapore to develop a Logical Information Model (LIM) for healthcare information exchange. The Singapore LIM uses a combination of international standards, including ISO13606-1 (a reference model for electronic health record communication), ISO21090 (healthcare datatypes), SNOMED CT (healthcare terminology) and HL7 v2 (healthcare messaging). This logic-based design approach also incorporates mechanisms for achieving bi-directional semantic interoperability.

  3. Semantic Interoperability in Distributed Planning

    National Research Council Canada - National Science Library

    Staskevich, Gennady R; Hudack, Jeffrey W; Lawton, James; Carozzoni, Joseph A

    2008-01-01

    .... A key challenge for this transformation to Globally-Linked Air and Space Operations Centers is developing the ability to collaboratively plan and execute operations with multiple cooperating command centers...

  4. From Data to Semantic Information

    Directory of Open Access Journals (Sweden)

    Luciano Floridi

    2003-06-01

    Full Text Available Abstract: There is no consensus yet on the definition of semantic information. This paper contributes to the current debate by criticising and revising the Standard Definition of semantic Information (SDI as meaningful data, in favour of the Dretske-Grice approach: meaningful and well-formed data constitute semantic information only if they also qualify as contingently truthful. After a brief introduction, SDI is criticised for providing necessary but insufficient conditions for the definition of semantic information. SDI is incorrect because truth-values do not supervene on semantic information, and misinformation (that is, false semantic information is not a type of semantic information, but pseudo-information, that is not semantic information at all. This is shown by arguing that none of the reasons for interpreting misinformation as a type of semantic information is convincing, whilst there are compelling reasons to treat it as pseudo-information. As a consequence, SDI is revised to include a necessary truth-condition. The last section summarises the main results of the paper and indicates the important implications of the revised definition for the analysis of the deflationary theories of truth, the standard definition of knowledge and the classic, quantitative theory of semantic information.

  5. Semantic Web meets Integrative Biology: a survey.

    Science.gov (United States)

    Chen, Huajun; Yu, Tong; Chen, Jake Y

    2013-01-01

    Integrative Biology (IB) uses experimental or computational quantitative technologies to characterize biological systems at the molecular, cellular, tissue and population levels. IB typically involves the integration of the data, knowledge and capabilities across disciplinary boundaries in order to solve complex problems. We identify a series of bioinformatics problems posed by interdisciplinary integration: (i) data integration that interconnects structured data across related biomedical domains; (ii) ontology integration that brings jargon, terminologies and taxonomies from various disciplines into a unified network of ontologies; (iii) knowledge integration that integrates disparate knowledge elements from multiple sources; (iv) service integration that builds applications out of services provided by different vendors. We argue that IB can benefit significantly from the integration solutions enabled by Semantic Web (SW) technologies. The SW enables scientists to share content beyond the boundaries of applications and websites, resulting in a web of data that is meaningful and understandable to computers. In this review, we provide insight into how SW technologies can be used to build open, standardized and interoperable solutions for interdisciplinary integration on a global basis. We present a rich set of case studies in systems biology, integrative neuroscience, bio-pharmaceutics and translational medicine, to highlight the technical features and benefits of SW applications in IB.
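The data-integration idea in this survey, a web of data that machines can traverse across disciplinary boundaries, can be sketched with plain subject-predicate-object triples. All identifiers below are illustrative; a production system would use RDF with real URIs:

```python
# Facts from two disciplinary sources, expressed as triples and merged
# into one queryable graph. Integration is just a union of triple sets.

clinical = {
    ("patient:42", "diagnosedWith", "disease:NSCLC"),
}
molecular = {
    ("gene:EGFR", "mutatedIn", "disease:NSCLC"),
}

graph = clinical | molecular

def query(graph, s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return {(a, b, c) for (a, b, c) in graph
            if s in (None, a) and p in (None, b) and o in (None, c)}

# Cross-domain question: which genes are relevant to patient 42's diagnosis?
disease = next(o for _, _, o in query(graph, s="patient:42", p="diagnosedWith"))
genes = {s for s, _, _ in query(graph, p="mutatedIn", o=disease)}
print(genes)  # {'gene:EGFR'}
```

The point of the sketch is that neither source needs to know about the other: the shared identifier `disease:NSCLC` is what makes the join possible.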

  6. Foundations of semantic web technologies

    CERN Document Server

    Hitzler, Pascal; Rudolph, Sebastian

    2009-01-01

    The Quest for Semantics: Building Models; Calculating with Knowledge; Exchanging Information; Semantic Web Technologies. RESOURCE DESCRIPTION FRAMEWORK (RDF): Simple Ontologies in RDF and RDF Schema; Introduction to RDF; Syntax for RDF; Advanced Features; Simple Ontologies in RDF Schema; Encoding of Special Data Structures; An Example. RDF Formal Semantics: Why Semantics?; Model-Theoretic Semantics for RDF(S); Syntactic Reasoning with Deduction Rules; The Semantic Limits of RDF(S). WEB ONTOLOGY LANGUAGE (OWL): Ontologies in OWL; OWL Syntax and Intuitive Semantics; OWL Species; The Forthcoming OWL 2 Standard; OWL Formal Sem
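The "Syntactic Reasoning with Deduction Rules" topic listed above can be illustrated with the standard RDFS type-inheritance rule (commonly labeled rdfs9). This is a generic sketch, not code from the book; the triples are illustrative:

```python
# rdfs9: if C subClassOf D and x type C, then infer x type D.
# The closure applies the rule repeatedly until nothing new appears.

triples = {
    ("Cat", "subClassOf", "Animal"),
    ("felix", "type", "Cat"),
}

def rdfs9_closure(triples):
    inferred = set(triples)
    changed = True
    while changed:
        changed = False
        new = {(x, "type", d)
               for (x, t, c) in inferred if t == "type"
               for (c2, s, d) in inferred if s == "subClassOf" and c2 == c}
        if not new <= inferred:
            inferred |= new
            changed = True
    return inferred

print(("felix", "type", "Animal") in rdfs9_closure(triples))  # True
```

This deduction-rule style of reasoning is sound and complete only up to the semantic limits of RDF(S) that the book discusses; richer entailments require OWL.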

  7. A fusion network for semantic segmentation using RGB-D data

    Science.gov (United States)

    Yuan, Jiahui; Zhang, Kun; Xia, Yifan; Qi, Lin; Dong, Junyu

    2018-04-01

    Semantic scene parsing is important in many intelligent fields, including perceptual robotics. For the past few years, pixel-wise prediction tasks like semantic segmentation with RGB images have been extensively studied and have reached remarkable parsing levels, thanks to convolutional neural networks (CNNs) and large scene datasets. With the development of stereo cameras and RGB-D sensors, it is expected that additional depth information will help improve accuracy. In this paper, we propose a semantic segmentation framework incorporating RGB and complementary depth information. Motivated by the success of fully convolutional networks (FCN) in the semantic segmentation field, we design a fully convolutional network consisting of two branches which extract features from both RGB and depth data simultaneously and fuse them as the network goes deeper. Instead of aggregating multiple models, our goal is to utilize RGB data and depth data more effectively in a single model. We evaluate our approach on the NYU-Depth V2 dataset, which consists of 1449 cluttered indoor scenes, and achieve results competitive with state-of-the-art methods.
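The fuse-as-the-network-goes-deeper idea can be shown with a toy sketch: the two branches produce equally shaped feature maps that are merged element-wise before the next stage. The "convolution" below is a stand-in scalar scaling, not a learned layer, and all shapes are illustrative:

```python
# Toy two-branch fusion: RGB and depth features are summed element-wise.

def conv_stub(feature_map, weight):
    """Stand-in for a convolutional stage: scale every activation."""
    return [[v * weight for v in row] for row in feature_map]

def fuse(rgb_feats, depth_feats):
    """Element-wise sum fusion of two equally shaped feature maps."""
    return [[r + d for r, d in zip(rr, dr)]
            for rr, dr in zip(rgb_feats, depth_feats)]

rgb = [[1.0, 2.0], [3.0, 4.0]]      # features from the RGB branch
depth = [[0.5, 0.5], [0.5, 0.5]]    # features from the depth branch

fused = fuse(conv_stub(rgb, 2.0), conv_stub(depth, 2.0))
print(fused)  # [[3.0, 5.0], [7.0, 9.0]]
```

In a real FCN the fusion could happen at several depths, with the fused map feeding the next shared stage.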

  8. Future Interoperability of Camp Protection Systems (FICAPS)

    Science.gov (United States)

    Caron, Sylvie; Gündisch, Rainer; Marchand, Alain; Stahl, Karl-Hermann

    2013-05-01

    The FICAPS Project has been established as a Project of the European Defence Agency based on an initiative of Germany and France. The goal of this project was to derive guidelines which, when properly implemented in future developments, improve Camp Protection Systems (CPS) by enabling and improving interoperability between camp protection systems and their equipment across the different nations involved in multinational missions. These guidelines shall allow for: • Real-time information exchange between equipment and systems of different suppliers and nations (even via SatCom), • Quick and easy replacement of equipment (even of different nations) at run-time in the field by means of plug-and-play capability, thus lowering the operational and logistic costs and making the system highly available, • Enhancement of system capabilities (open and modular systems) by adding new equipment with new capabilities (just plug in; automatic adjustment of the HMI, Human-Machine Interface) without costly and time-consuming validation and test on the system level (validation and test can be done on the equipment level). Four scenarios have been identified to summarize the interoperability requirements from an operational viewpoint. To prove the definitions given in the Guideline Document, a French and a German demonstration system, based on existing national assets, were realized. Demonstrations showing the capabilities given by the defined interoperability requirements with respect to the operational scenarios were performed. Demonstrations included remote control of a CPS by another CPS, remote sensor control (Electro-Optic/InfraRed, EO/IR) and remote effector control. This capability can be applied to extend the protection area or to protect distant infrastructural assets. Demonstrations have been performed; the required interoperability functionality was shown successfully. Even if the focus of the FICAPS project was on camp protection, the solution found is also appropriate for other

  9. 76 FR 4102 - Smart Grid Interoperability Standards; Supplemental Notice of Technical Conference

    Science.gov (United States)

    2011-01-24

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Supplemental Notice of Technical Conference January 13, 2011. On December 21, 2010, the Federal Energy Regulatory Commission announced that a Technical Conference on Smart Grid Interoperability Standards will be held on Monday...

  10. An Interoperable Cartographic Database

    OpenAIRE

    Slobodanka Ključanin; Zdravko Galić

    2007-01-01

    The concept of producing a prototype of interoperable cartographic database is explored in this paper, including the possibilities of integration of different geospatial data into the database management system and their visualization on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relation database, spatial analysis, definition and visualization of the database content in the form of a map on t...

  11. Processing biological literature with customizable Web services supporting interoperable formats.

    Science.gov (United States)

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. © The Author(s) 2014. Published by Oxford University Press.

  12. 76 FR 66040 - NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft...

    Science.gov (United States)

    2011-10-25

    ...-01] NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft... draft version of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0... Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Release 2.0) (Draft) for public review and...

  13. Intercloud Architecture Framework for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.J.

    2013-01-01

    This report presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses interoperability and integration issues in multi-provider multi-domain heterogeneous Cloud based infrastructure services and applications provisioning, including integration and

  14. Recognizable or Not: Towards Image Semantic Quality Assessment for Compression

    Science.gov (United States)

    Liu, Dong; Wang, Dandan; Li, Houqiang

    2017-12-01

    Traditionally, image compression was optimized for the pixel-wise fidelity or the perceptual quality of the compressed images given a bit-rate budget. But recently, compressed images are more and more utilized for automatic semantic analysis tasks such as recognition and retrieval. For these tasks, we argue that the optimization target of compression is no longer perceptual quality, but the utility of the compressed images in the given automatic semantic analysis task. Accordingly, we propose to evaluate the quality of the compressed images neither at pixel level nor at perceptual level, but at semantic level. In this paper, we make preliminary efforts towards image semantic quality assessment (ISQA), focusing on the task of optical character recognition (OCR) from compressed images. We propose a full-reference ISQA measure by comparing the features extracted from text regions of original and compressed images. We then propose to integrate the ISQA measure into an image compression scheme. Experimental results show that our proposed ISQA measure is much better than PSNR and SSIM in evaluating the semantic quality of compressed images; accordingly, adopting our ISQA measure to optimize compression for OCR leads to significant bit-rate saving compared to using PSNR or SSIM. Moreover, we perform a subjective test of text recognition from compressed images, and observe that our ISQA measure has high consistency with subjective recognizability. Our work explores new dimensions in image quality assessment, and demonstrates a promising direction to achieve higher compression ratios for specific semantic analysis tasks.
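A full-reference semantic quality measure of this kind can be sketched as comparing feature vectors extracted from the text regions of the original and compressed images. The feature extractor below (pixel mean and variance) is a deliberately trivial stand-in for the OCR-oriented features the paper uses, and the tiny "regions" are illustrative:

```python
# Full-reference scoring: extract features from both versions of a text
# region and compare them with cosine similarity (1.0 = identical).
import math

def features(region):
    flat = [v for row in region for v in row]
    mean = sum(flat) / len(flat)
    var = sum((v - mean) ** 2 for v in flat) / len(flat)
    return [mean, var]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

original   = [[10, 200], [10, 200]]   # crisp text edges
compressed = [[40, 170], [40, 170]]   # edges blurred by compression

score = cosine(features(original), features(compressed))
print(round(score, 3))
```

Because only the feature comparison matters, a task-appropriate extractor (e.g. features from an OCR front end) can be swapped in without changing the scoring step.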

  15. Basic semantics of product sounds

    NARCIS (Netherlands)

    Özcan Vieira, E.; Van Egmond, R.

    2012-01-01

    Product experience is a result of sensory and semantic experiences with product properties. In this paper, we focus on the semantic attributes of product sounds and explore the basic components for product sound related semantics using a semantic differential paradigm and factor analysis. With two

  16. Retrieval from semantic memory.

    NARCIS (Netherlands)

    Noordman-Vonk, Wietske

    1977-01-01

    The present study has been concerned with the retrieval of semantic information. Retrieving semantic information is a fundamental process in almost any kind of cognitive behavior. The introduction presented the main experimental paradigms and results found in the literature on semantic memory as

  17. Towards Universal Semantic Tagging

    NARCIS (Netherlands)

    Abzianidze, Lasha; Bos, Johan

    2017-01-01

    The paper proposes the task of universal semantic tagging---tagging word tokens with language-neutral, semantically informative tags. We argue that the task, with its independent nature, contributes to better semantic analysis for wide-coverage multilingual text. We present the initial version of

  18. Process-oriented semantic web search

    CERN Document Server

    Tran, DT

    2011-01-01

    The book is composed of two main parts. The first part is a general study of Semantic Web Search. The second part specifically focuses on the use of semantics throughout the search process, compiling a big picture of Process-oriented Semantic Web Search from different pieces of work that target specific aspects of the process.In particular, this book provides a rigorous account of the concepts and technologies proposed for searching resources and semantic data on the Semantic Web. To collate the various approaches and to better understand what the notion of Semantic Web Search entails, this bo

  19. Considering the role of semantic memory in episodic future thinking: evidence from semantic dementia.

    Science.gov (United States)

    Irish, Muireann; Addis, Donna Rose; Hodges, John R; Piguet, Olivier

    2012-07-01

    Semantic dementia is a progressive neurodegenerative condition characterized by the profound and amodal loss of semantic memory in the context of relatively preserved episodic memory. In contrast, patients with Alzheimer's disease typically display impairments in episodic memory, but with semantic deficits of a much lesser magnitude than in semantic dementia. Our understanding of episodic memory retrieval in these cohorts has greatly increased over the last decade, however, we know relatively little regarding the ability of these patients to imagine and describe possible future events, and whether episodic future thinking is mediated by divergent neural substrates contingent on dementia subtype. Here, we explored episodic future thinking in patients with semantic dementia (n=11) and Alzheimer's disease (n=11), in comparison with healthy control participants (n=10). Participants completed a battery of tests designed to probe episodic and semantic thinking across past and future conditions, as well as standardized tests of episodic and semantic memory. Further, all participants underwent magnetic resonance imaging. Despite their relatively intact episodic retrieval for recent past events, the semantic dementia cohort showed significant impairments for episodic future thinking. In contrast, the group with Alzheimer's disease showed parallel deficits across past and future episodic conditions. Voxel-based morphometry analyses confirmed that atrophy in the left inferior temporal gyrus and bilateral temporal poles, regions strongly implicated in semantic memory, correlated significantly with deficits in episodic future thinking in semantic dementia. Conversely, episodic future thinking performance in Alzheimer's disease correlated with atrophy in regions associated with episodic memory, namely the posterior cingulate, parahippocampal gyrus and frontal pole. These distinct neuroanatomical substrates contingent on dementia group were further qualified by correlational

  20. High Performance Descriptive Semantic Analysis of Semantic Graph Databases

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan; Feo, John T.; Haglin, David J.; Mackey, Greg E.; Mizell, David W.

    2011-06-02

    As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high performance computational resources to bear on this task. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.
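One of the basic analyses listed here, connected components, can be sketched by treating each RDF triple as an undirected edge between its subject and object and running union-find over the nodes. The triples below are illustrative:

```python
# Connected components of a triple set via union-find with path halving.

def components(triples):
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for s, _p, o in triples:
        union(s, o)

    groups = {}
    for node in parent:
        groups.setdefault(find(node), set()).add(node)
    return list(groups.values())

triples = [
    ("a", "knows", "b"),
    ("b", "knows", "c"),
    ("x", "cites", "y"),
]
print(sorted(sorted(g) for g in components(triples)))
# [['a', 'b', 'c'], ['x', 'y']]
```

At billion-triple scale the same idea is run with parallel union-find or label propagation, which is where hardware like the multi-threaded Cray XMT comes in.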

  1. On the feasibility of interoperable schemes in hand biometrics.

    Science.gov (United States)

    Morales, Aythami; González, Ester; Ferrer, Miguel A

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.

  2. Interoperable and accessible census and survey data from IPUMS.

    Science.gov (United States)

    Kugler, Tracy A; Fitch, Catherine A

    2018-02-27

    The first version of the Integrated Public Use Microdata Series (IPUMS) was released to users in 1993, and since that time IPUMS has come to stand for interoperable and accessible census and survey data. Initially created to harmonize U.S. census microdata over time, IPUMS now includes microdata from the U.S. and international censuses and from surveys on health, employment, and other topics. IPUMS also provides geo-spatial data, aggregate population data, and environmental data. IPUMS supports ten data products, each disseminating an integrated data collection with a set of tools that make complex data easy to find, access, and use. Key features are record-level integration to create interoperable datasets, user-friendly interfaces, and comprehensive metadata and documentation. The IPUMS philosophy aligns closely with the FAIR principles of findability, accessibility, interoperability, and re-usability. IPUMS data have catalyzed knowledge generation across a wide range of social science and other disciplines, as evidenced by the large volume of publications and other products created by the vast IPUMS user community.

  3. On the Feasibility of Interoperable Schemes in Hand Biometrics

    Science.gov (United States)

    Morales, Aythami; González, Ester; Ferrer, Miguel A.

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors. PMID:22438714

  4. Semantic role labeling for protein transport predicates

    Directory of Open Access Journals (Sweden)

    Martin James H

    2008-06-01

Full Text Available Background: Automatic semantic role labeling (SRL) is a natural language processing (NLP) technique that maps sentences to semantic representations. This technique has been widely studied in recent years, but mostly with data in newswire domains. Here, we report on an SRL model for identifying the semantic roles of biomedical predicates describing protein transport in GeneRIFs – manually curated sentences focusing on gene functions. To avoid the computational cost of syntactic parsing, and because the boundaries of our protein transport roles often did not match up with syntactic phrase boundaries, we approached this problem with a word-chunking paradigm and trained support vector machine classifiers to classify words as being at the beginning, inside or outside of a protein transport role. Results: We collected a set of 837 GeneRIFs describing movements of proteins between cellular components, whose predicates were annotated for the semantic roles AGENT, PATIENT, ORIGIN and DESTINATION. We trained these models with the features of previous word-chunking models, features adapted from phrase-chunking models, and features derived from an analysis of our data. Our models were able to label protein transport semantic roles with 87.6% precision and 79.0% recall when using manually annotated protein boundaries, and 87.0% precision and 74.5% recall when using automatically identified ones. Conclusion: We successfully adapted the word-chunking classification paradigm to semantic role labeling, applying it to a new domain with predicates completely absent from any previous studies. By combining the traditional word and phrasal role labeling features with biomedical features like protein boundaries and MEDPOST part-of-speech tags, we were able to address the challenges posed by the new domain data and subsequently build robust models that achieved F-measures as high as 83.1. This system for extracting protein transport information from Gene
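The word-chunking approach described above can be sketched in a few lines: each token gets a begin/inside/outside (BIO) tag for a transport role, and contiguous B-/I- runs are decoded into labeled spans. The sentence, tags, and helper below are illustrative only and are not drawn from the paper's GeneRIF corpus.

```python
def decode_bio(tokens, tags):
    """Group BIO-tagged tokens into labeled role spans (e.g. AGENT, DESTINATION)."""
    spans, current = [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                spans.append(current)
            current = (tag[2:], [tok])          # start a new span
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[1].append(tok)              # continue the open span
        else:                                   # "O" or inconsistent I- tag
            if current:
                spans.append(current)
            current = None
    if current:
        spans.append(current)
    return [(label, " ".join(words)) for label, words in spans]

# Invented example sentence with hypothetical role tags:
tokens = ["Rab5", "transports", "EEA1", "to", "the", "early", "endosome"]
tags   = ["B-AGENT", "O", "B-PATIENT", "O",
          "B-DESTINATION", "I-DESTINATION", "I-DESTINATION"]
print(decode_bio(tokens, tags))
```
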

  5. Interoperability as a quality label for portable & wearable health monitoring systems.

    Science.gov (United States)

    Chronaki, Catherine E; Chiarugi, Franco

    2005-01-01

    Advances in ICT promising universal access to high quality care, reduction of medical errors, and containment of health care costs, have renewed interest in electronic health records (EHR) standards and resulted in comprehensive EHR adoption programs in many European states. Health cards, and in particular the European health insurance card, present an opportunity for instant cross-border access to emergency health data including allergies, medication, even a reference ECG. At the same time, research and development in miniaturized medical devices and wearable medical sensors promise continuous health monitoring in a comfortable, flexible, and fashionable way. These trends call for the seamless integration of medical devices and intelligent wearables into an active EHR exploiting the vast information available to increase medical knowledge and establish personal wellness profiles. In a mobile connected world with empowered health consumers and fading barriers between health and healthcare, interoperability has a strong impact on consumer trust. As a result, current interoperability initiatives are extending the traditional standardization process to embrace implementation, validation, and conformance testing. In this paper, starting from the OpenECG initiative, which promotes the consistent implementation of interoperability standards in electrocardiography and supports a worldwide community with data sets, open source tools, specifications, and online conformance testing, we discuss EHR interoperability as a quality label for personalized health monitoring systems. Such a quality label would support big players and small enterprises in creating interoperable eHealth products, while opening the way for pervasive healthcare and the take-up of the eHealth market.

  6. The Episodic/Semantic Memory Distinction as an Heuristic in the Study of Instructional Effects on Cognitive Structure.

    Science.gov (United States)

    Konold, Clifford E.; Bates, John A.

    1982-01-01

    Significant correlations between measures of cognitive structure and performance were found using a procedure distinguishing between episodic and semantic memory as an heuristic with achievement test items. The design increased the likelihood of indications of semantic memory. Higher-order and lower-order cognitive processes are discussed.…

  7. Common Patterns with End-to-end Interoperability for Data Access

    Science.gov (United States)

    Gallagher, J.; Potter, N.; Jones, M. B.

    2010-12-01

    file transfers. These options affect seamlessness in that they represent tradeoffs in new development (required for the first option) with cumbersome extra user actions (required by the last option). While the middle option, adding new functionality to an existing library (netCDF), is very appealing because practice has shown that it can be very effective over a wide range of clients, it's very hard to build these libraries because correctly writing a new implementation of an existing API that preserves the original's exact semantics can be a daunting task. In the example discussed here, we developed a new module for Kepler using OPeNDAP's Java API. This provided a way to leverage internal optimizations for data organization in Kepler and we felt that outweighed the additional cost of new development and the need for users to learn how to use a new Kepler module. While common storage formats and open standards play an important role in data access, our work with the Kepler workflow system reinforces the experience that matching the data models of the data server (source) and user client (sink) and choosing the most appropriate integration strategy are critical to achieving interoperability.

  8. An Interoperable Cartographic Database

    Directory of Open Access Journals (Sweden)

    Slobodanka Ključanin

    2007-05-01

Full Text Available The concept of producing a prototype of an interoperable cartographic database is explored in this paper, including the possibilities of integrating different geospatial data into a database management system and visualizing them on the Internet. The implementation includes vectorization of the content of a single map page, creation of the cartographic database in an object-relational database, spatial analysis, and definition and visualization of the database content in the form of a map on the Internet.

  9. Real-time object detection and semantic segmentation for autonomous driving

    Science.gov (United States)

    Li, Baojun; Liu, Shun; Xu, Weichao; Qiu, Wei

    2018-02-01

In this paper, we propose a Highly Coupled Network (HCNet) for joint object detection and semantic segmentation. Our method is faster and performs better than previous approaches, whose decoder networks for the different tasks are independent. Besides, we present a multi-scale loss architecture to learn better representations for objects of different scales, without extra time in the inference phase. Experimental results show that our method achieves state-of-the-art results on the KITTI datasets. Moreover, it can run at 35 FPS on a GPU and thus is a practical solution to object detection and semantic segmentation for autonomous driving.

  10. A Defense of Semantic Minimalism

    Science.gov (United States)

    Kim, Su

    2012-01-01

    Semantic Minimalism is a position about the semantic content of declarative sentences, i.e., the content that is determined entirely by syntax. It is defined by the following two points: "Point 1": The semantic content is a complete/truth-conditional proposition. "Point 2": The semantic content is useful to a theory of…

  11. Equipping the Enterprise Interoperability Problem Solver

    NARCIS (Netherlands)

    Oude Luttighuis, Paul; Folmer, Erwin Johan Albert; Charalabidis, Yannis

    2010-01-01

    The maturity of the enterprise interoperability field does not match the importance attached to it by many, both in the public as well as the private community. A host of models, paradigms, designs, standards, methods, and instruments seems to be available, but many of them are only used in rather

  12. An E-government Interoperability Platform Supporting Personal Data Protection Regulations

    Directory of Open Access Journals (Sweden)

    Laura González

    2016-08-01

Full Text Available Public agencies are increasingly required to collaborate with each other in order to provide high-quality e-government services. This collaboration is usually based on the service-oriented approach and supported by interoperability platforms. Such platforms are specialized middleware-based infrastructures enabling the provision, discovery and invocation of interoperable software services. In turn, given that personal data handled by governments are often very sensitive, most governments have developed some sort of legislation focusing on data protection. This paper proposes solutions for monitoring and enforcing data protection laws within an E-government Interoperability Platform. In particular, the proposal addresses requirements posed by the Uruguayan Data Protection Law and the Uruguayan E-government Platform, although it can also be applied in similar scenarios. The solutions are based on well-known integration mechanisms (e.g. Enterprise Service Bus) as well as recognized security standards (e.g. eXtensible Access Control Markup Language) and were completely prototyped leveraging the SwitchYard ESB product.
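As a rough illustration of the kind of data-protection rule such a platform might enforce, the sketch below builds a heavily simplified XACML Rule element with Python's standard library. The rule id, resource category, and the empty Target are hypothetical placeholders; a real XACML 3.0 policy carries considerably more structure (AnyOf/AllOf matchers, obligations, and so on).

```python
import xml.etree.ElementTree as ET

# XACML 3.0 core namespace (OASIS).
XACML_NS = "urn:oasis:names:tc:xacml:3.0:core:schema:wd-17"

def deny_without_consent_rule(resource_category: str) -> bytes:
    """Build a deliberately simplified XACML <Rule> that denies access to a
    personal-data category; consent checks would live in the (omitted) Target."""
    rule = ET.Element(f"{{{XACML_NS}}}Rule",
                      {"RuleId": f"deny-{resource_category}", "Effect": "Deny"})
    ET.SubElement(rule, f"{{{XACML_NS}}}Description").text = (
        f"Deny access to '{resource_category}' without explicit citizen consent.")
    ET.SubElement(rule, f"{{{XACML_NS}}}Target")  # real rules add AnyOf/AllOf here
    return ET.tostring(rule)

print(deny_without_consent_rule("health-record").decode())
```
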

  13. Subliminal semantic priming in speech.

    Directory of Open Access Journals (Sweden)

    Jérôme Daltrozzo

Full Text Available Numerous studies have reported subliminal repetition and semantic priming in the visual modality. We transferred this paradigm to the auditory modality. Prime awareness was manipulated by a reduction of sound intensity level. Uncategorized prime words (according to a post-test) were followed by semantically related, unrelated, or repeated target words (presented without intensity reduction) and participants performed a lexical decision task (LDT). Participants with slower reaction times in the LDT showed semantic priming (faster reaction times for semantically related compared to unrelated targets) and negative repetition priming (slower reaction times for repeated compared to semantically related targets). This is the first report of semantic priming in the auditory modality without conscious categorization of the prime.

  14. Open semantic analysis: The case of word level semantics in Danish

    DEFF Research Database (Denmark)

    Nielsen, Finn Årup; Hansen, Lars Kai

    2017-01-01

The present research is motivated by the need for accessible and efficient tools for automated semantic analysis in Danish. We are interested in tools that are completely open, so they can be used by a critical public, in public administration, non-governmental organizations and businesses. We describe data-driven models for Danish semantic relatedness, word intrusion and sentiment prediction. Open Danish corpora were assembled and unsupervised learning implemented for explicit semantic analysis and with Gensim’s Word2vec model. We evaluate the performance of the two models on three different annotated word datasets. We test the semantic representations’ alignment with single word sentiment using supervised learning. We find that logistic regression and large random forests perform well with Word2vec features.
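Semantic relatedness between word embeddings, such as those a Word2vec model produces, is conventionally scored with cosine similarity. The toy three-dimensional vectors below are invented stand-ins for learned Danish embeddings, used only to show the computation.

```python
import math

def cosine(u, v):
    """Cosine similarity between two word vectors: the standard relatedness
    score for embeddings such as those trained with Gensim's Word2vec."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-d vectors standing in for learned Danish word embeddings.
vec = {
    "hund": [0.9, 0.1, 0.0],   # "dog"
    "kat":  [0.8, 0.2, 0.1],   # "cat"
    "bil":  [0.0, 0.1, 0.9],   # "car"
}
# "dog" should be closer to "cat" than to "car" in any sensible embedding.
print(cosine(vec["hund"], vec["kat"]), cosine(vec["hund"], vec["bil"]))
```
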

  15. Terminology representation guidelines for biomedical ontologies in the semantic web notations.

    Science.gov (United States)

    Tao, Cui; Pathak, Jyotishman; Solbrig, Harold R; Wei, Wei-Qi; Chute, Christopher G

    2013-02-01

Terminologies and ontologies are increasingly prevalent in healthcare and biomedicine. However, they suffer from inconsistent renderings, distribution formats, and syntax that make applications through common terminology services challenging. To address the problem, one could posit a shared representation syntax, associated schema, and tags. We identified a set of commonly used elements in biomedical ontologies and terminologies based on our experience with the Common Terminology Services 2 (CTS2) Specification as well as the Lexical Grid (LexGrid) project. We propose guidelines for precisely such a shared terminology model, and recommend tags assembled from SKOS, OWL, Dublin Core, RDF Schema, and DCMI meta-terms. We divide these guidelines into lexical information (e.g. synonyms and definitions) and semantic information (e.g. hierarchies). The latter we distinguish for use by informal terminologies vs. formal ontologies. We then evaluate the guidelines with a spectrum of widely used terminologies and ontologies to examine how the lexical guidelines are implemented, and whether our proposed guidelines would enhance interoperability.
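A minimal sketch of the lexical/semantic split such guidelines propose: one concept rendered as Turtle, with SKOS labels and a definition carrying the lexical information and skos:broader carrying the semantic hierarchy. The URIs and the concept itself are hypothetical examples, not taken from the paper.

```python
def concept_to_turtle(uri, pref_label, alt_labels, definition, broader=None):
    """Render one terminology concept as Turtle. Labels and definition carry
    lexical information; skos:broader carries the semantic hierarchy."""
    lines = [
        "@prefix skos: <http://www.w3.org/2004/02/skos/core#> .",
        f"<{uri}> a skos:Concept ;",
        f'    skos:prefLabel "{pref_label}"@en ;',
    ]
    for alt in alt_labels:
        lines.append(f'    skos:altLabel "{alt}"@en ;')
    lines.append(f'    skos:definition "{definition}"@en' +
                 (" ;" if broader else " ."))
    if broader:
        lines.append(f"    skos:broader <{broader}> .")
    return "\n".join(lines)

# Hypothetical concept and URIs, for illustration only.
print(concept_to_turtle(
    "http://example.org/term/myocardial-infarction",
    "Myocardial infarction",
    ["Heart attack", "MI"],
    "Necrosis of heart muscle caused by ischemia.",
    broader="http://example.org/term/heart-disease"))
```
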

  16. Food product tracing technology capabilities and interoperability.

    Science.gov (United States)

    Bhatt, Tejas; Zhang, Jianrong Janet

    2013-12-01

    Despite the best efforts of food safety and food defense professionals, contaminated food continues to enter the food supply. It is imperative that contaminated food be removed from the supply chain as quickly as possible to protect public health and stabilize markets. To solve this problem, scores of technology companies purport to have the most effective, economical product tracing system. This study sought to compare and contrast the effectiveness of these systems at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. It also determined if these systems can work together to better secure the food supply (their interoperability). Institute of Food Technologists (IFT) hypothesized that when technology providers are given a full set of supply-chain data, even for a multi-ingredient product, their systems will generally be able to trace a contaminated product forward and backward through the supply chain. However, when provided with only a portion of supply-chain data, even for a product with a straightforward supply chain, it was expected that interoperability of the systems will be lacking and that there will be difficulty collaborating to identify sources and/or recipients of potentially contaminated product. IFT provided supply-chain data for one complex product to 9 product tracing technology providers, and then compared and contrasted their effectiveness at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. A vertically integrated foodservice restaurant agreed to work with IFT to secure data from its supply chain for both a multi-ingredient and a simpler product. Potential multi-ingredient products considered included canned tuna, supreme pizza, and beef tacos. IFT ensured that all supply-chain data collected did not include any proprietary information or information that would otherwise

  17. Applied Semantic Web Technologies

    CERN Document Server

    Sugumaran, Vijayan

    2011-01-01

    The rapid advancement of semantic web technologies, along with the fact that they are at various levels of maturity, has left many practitioners confused about the current state of these technologies. Focusing on the most mature technologies, Applied Semantic Web Technologies integrates theory with case studies to illustrate the history, current state, and future direction of the semantic web. It maintains an emphasis on real-world applications and examines the technical and practical issues related to the use of semantic technologies in intelligent information management. The book starts with

  18. Heterogeneous but “Standard” Coding Systems for Adverse Events: Issues in Achieving Interoperability between Apples and Oranges

    Science.gov (United States)

    Richesson, Rachel L.; Fung, Kin Wah; Krischer, Jeffrey P.

    2008-01-01

    Monitoring adverse events (AEs) is an important part of clinical research and a crucial target for data standards. The representation of adverse events themselves requires the use of controlled vocabularies with thousands of needed clinical concepts. Several data standards for adverse events currently exist, each with a strong user base. The structure and features of these current adverse event data standards (including terminologies and classifications) are different, so comparisons and evaluations are not straightforward, nor are strategies for their harmonization. Three different data standards - the Medical Dictionary for Regulatory Activities (MedDRA) and the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT) terminologies, and Common Terminology Criteria for Adverse Events (CTCAE) classification - are explored as candidate representations for AEs. This paper describes the structural features of each coding system, their content and relationship to the Unified Medical Language System (UMLS), and unsettled issues for future interoperability of these standards. PMID:18406213

  19. Effects of Semantic Web Based Learning on Pre-Service Teachers' ICT Learning Achievement and Satisfaction

    Science.gov (United States)

    Karalar, Halit; Korucu, Agah Tugrul

    2016-01-01

Although the Semantic Web offers many opportunities for learners, its effects in the classroom are not well known. This study therefore explains how learning objects, defined using the terminology of a developed ontology and kept in an object repository, should be presented to learners with the aim of…

  20. Building a transnational biosurveillance network using semantic web technologies: requirements, design, and preliminary evaluation.

    Science.gov (United States)

    Teodoro, Douglas; Pasche, Emilie; Gobeill, Julien; Emonet, Stéphane; Ruch, Patrick; Lovis, Christian

    2012-05-29

Antimicrobial resistance has reached globally alarming levels and is becoming a major public health threat. Lack of efficacious antimicrobial resistance surveillance systems was identified as one of the causes of increasing resistance, due to the lag time between new resistances and alerts to care providers. Several initiatives to track drug resistance evolution have been developed. However, no effective real-time and source-independent antimicrobial resistance monitoring system is available publicly. To design and implement an architecture that can provide real-time and source-independent antimicrobial resistance monitoring to support transnational resistance surveillance. In particular, we investigated the use of a Semantic Web-based model to foster integration and interoperability of interinstitutional and cross-border microbiology laboratory databases. Following the agile software development methodology, we derived the main requirements needed for effective antimicrobial resistance monitoring, from which we proposed a decentralized monitoring architecture based on the Semantic Web stack. The architecture uses an ontology-driven approach to promote the integration of a network of sentinel hospitals or laboratories. Local databases are wrapped into semantic data repositories that automatically expose local computing-formalized laboratory information in the Web. A central source mediator, based on local reasoning, coordinates the access to the semantic end points. On the user side, a user-friendly Web interface provides access and graphical visualization to the integrated views. We designed and implemented the online Antimicrobial Resistance Trend Monitoring System (ARTEMIS) in a pilot network of seven European health care institutions sharing 70+ million triples of information about drug resistance and consumption. Evaluation of the computing performance of the mediator demonstrated that, on average, query response time was a few seconds (mean 4.3, SD 0.1 × 10
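The mediator pattern described above can be caricatured in a few lines: each endpoint exposes its laboratory data as triples, and a central function runs one triple pattern over all of them and merges the results, much as a SPARQL federation would. The endpoint names, predicates, and data below are invented.

```python
# Each "endpoint" exposes its lab data as (subject, predicate, object) triples.
endpoint_a = {("hospA:isolate1", "ex:organism", "E. coli"),
              ("hospA:isolate1", "ex:resistantTo", "ampicillin")}
endpoint_b = {("hospB:isolate7", "ex:organism", "E. coli"),
              ("hospB:isolate7", "ex:resistantTo", "ciprofloxacin")}

def federated_match(pattern, *graphs):
    """A toy 'mediator': evaluate one triple pattern (None = wildcard) against
    every endpoint graph and merge the results, as a SPARQL federation would."""
    s, p, o = pattern
    return [t for g in graphs for t in g
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# All resistance facts, regardless of which hospital reported them:
hits = federated_match((None, "ex:resistantTo", None), endpoint_a, endpoint_b)
print(hits)
```
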

  1. Interoperability And Value Added To Earth Observation Data

    Science.gov (United States)

    Gasperi, J.

    2012-04-01

    Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as Web Map Service (WMS) to request maps on the Internet, Web Feature Service (WFS) to exchange vectors or Catalog Service for the Web (CSW) to search for geospatialized data have been widely adopted in the Geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disasters management.
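As a concrete illustration of the OGC services mentioned above, the sketch below composes a WMS 1.3.0 GetMap request URL from its standard query parameters. The endpoint and layer name are hypothetical.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layers, bbox, size,
                   fmt="image/png", crs="EPSG:4326"):
    """Compose an OGC WMS 1.3.0 GetMap request URL from its standard parameters."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),  # axis order per CRS in 1.3.0
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": fmt,
    }
    return f"{base_url}?{urlencode(params)}"

# Hypothetical endpoint and layer, for illustration only.
url = wms_getmap_url("https://example.org/wms", ["flood_extent"],
                     bbox=(43.0, 1.0, 44.0, 2.0), size=(800, 600))
print(url)
```
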

  2. MAKOCI: A WEB PORTAL FOR INTEGRATING AND SHARING GEOGRAPHIC INFORMATION SERVICES

    Directory of Open Access Journals (Sweden)

    Chin-Te Jung

    2013-01-01

Full Text Available The lack of integration and communication among various geographic information services (GI services) has resulted in much duplicated collection of earth observation data and in challenges of semantic interoperability. This paper proposes an ontology-based multi-agent platform, called MAKOCI (multi-agent knowledge oriented cyberinfrastructure), which acts as a one-stop shop for GI services to manage, publish, share, and discover them semantically. Through ontologies, formal meanings of concepts are defined to annotate and discover GI services on a conceptual level for semantic interoperability. With the assistance of multiple agents, the processes in MAKOCI can be divided into various modules that communicate based on the same semantics defined in ontologies. A prototype was implemented to test MAKOCI. Finally, we conclude with the advantages and disadvantages of MAKOCI and point out several future works.

  3. Effects of Semantic Ambiguity Detection Training on Reading Comprehension Achievement of English Learners with Learning Difficulties

    Science.gov (United States)

    Jozwik, Sara L.; Douglas, Karen H.

    2016-01-01

    This study examined how explicit instruction in semantic ambiguity detection affected the reading comprehension and metalinguistic awareness of five English learners (ELs) with learning difficulties (e.g., attention deficit/hyperactivity disorder, specific learning disability). A multiple probe across participants design (Gast & Ledford, 2010)…

  4. Adventures in semantic publishing: exemplar semantic enhancements of a research article.

    Directory of Open Access Journals (Sweden)

    David Shotton

    2009-04-01

Full Text Available Scientific innovation depends on finding, integrating, and re-using the products of previous research. Here we explore how recent developments in Web technology, particularly those related to the publication of data and metadata, might assist that process by providing semantic enhancements to journal articles within the mainstream process of scholarly journal publishing. We exemplify this by describing semantic enhancements we have made to a recent biomedical research article taken from PLoS Neglected Tropical Diseases, providing enrichment to its content and increased access to datasets within it. These semantic enhancements include provision of live DOIs and hyperlinks; semantic markup of textual terms, with links to relevant third-party information resources; interactive figures; a re-orderable reference list; a document summary containing a study summary, a tag cloud, and a citation analysis; and two novel types of semantic enrichment: the first, a Supporting Claims Tooltip to permit "Citations in Context", and the second, Tag Trees that bring together semantically related terms. In addition, we have published downloadable spreadsheets containing data from within tables and figures, have enriched these with provenance information, and have demonstrated various types of data fusion (mashups) with results from other research articles and with Google Maps. We have also published machine-readable RDF metadata both about the article and about the references it cites, for which we developed a Citation Typing Ontology, CiTO (http://purl.org/net/cito/). The enhanced article, which is available at http://dx.doi.org/10.1371/journal.pntd.0000228.x001, presents a compelling existence proof of the possibilities of semantic publication. We hope the showcase of examples and ideas it contains, described in this paper, will excite the imaginations of researchers and publishers, stimulating them to explore the possibilities of semantic publishing for their own

  5. Pascal Semantics by a Combination of Denotational Semantics and High-level Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt; Schmidt, Erik Meineche

    1986-01-01

This paper describes the formal semantics of a subset of PASCAL, by means of a semantic model based on a combination of denotational semantics and high-level Petri nets. It is our intention that the paper can be used as part of the written material for an introductory course in computer science.

  6. DOORS to the semantic web and grid with a PORTAL for biomedical computing.

    Science.gov (United States)

    Taswell, Carl

    2008-03-01

The semantic web remains in the early stages of development. It has not yet achieved the goals envisioned by its founders as a pervasive web of distributed knowledge and intelligence. Success will be attained when a dynamic synergism can be created between people and a sufficient number of infrastructure systems and tools for the semantic web in analogy with those for the original web. The domain name system (DNS), web browsers, and the benefits of publishing web pages motivated many people to register domain names and publish web sites on the original web. An analogous resource label system, semantic search applications, and the benefits of collaborative semantic networks will motivate people to register resource labels and publish resource descriptions on the semantic web. The Domain Ontology Oriented Resource System (DOORS) and Problem Oriented Registry of Tags and Labels (PORTAL) are proposed as infrastructure systems for resource metadata within a paradigm that can serve as a bridge between the original web and the semantic web. The Internet Registry Information Service (IRIS) registers domain names while DNS publishes domain addresses with mapping of names to addresses for the original web. Analogously, PORTAL registers resource labels and tags while DOORS publishes resource locations and descriptions with mapping of labels to locations for the semantic web. BioPORT is proposed as a prototype PORTAL registry specific for the problem domain of biomedical computing.

  7. OMOGENIA: A Semantically Driven Collaborative Environment

    Science.gov (United States)

    Liapis, Aggelos

    Ontology creation can be thought of as a social procedure. Indeed the concepts involved in general need to be elicited from communities of domain experts and end-users by teams of knowledge engineers. Many problems in ontology creation appear to resemble certain problems in software design, particularly with respect to the setup of collaborative systems. For instance, the resolution of conceptual conflicts between formalized ontologies is a major engineering problem as ontologies move into widespread use on the semantic web. Such conflict resolution often requires human collaboration and cannot be achieved by automated methods with the exception of simple cases. In this chapter we discuss research in the field of computer-supported cooperative work (CSCW) that focuses on classification and which throws light on ontology building. Furthermore, we present a semantically driven collaborative environment called OMOGENIA as a natural way to display and examine the structure of an evolving ontology in a collaborative setting.

  8. On the Feasibility of Interoperable Schemes in Hand Biometrics

    Directory of Open Access Journals (Sweden)

    Miguel A. Ferrer

    2012-02-01

    Full Text Available Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.

  9. Auditing the Assignments of Top-Level Semantic Types in the UMLS Semantic Network to UMLS Concepts.

    Science.gov (United States)

    He, Zhe; Perl, Yehoshua; Elhanan, Gai; Chen, Yan; Geller, James; Bian, Jiang

    2017-11-01

The Unified Medical Language System (UMLS) is an important terminological system. By the policy of its curators, each concept of the UMLS should be assigned the most specific Semantic Types (STs) in the UMLS Semantic Network (SN). Hence, the Semantic Types of most UMLS concepts are assigned at or near the bottom (leaves) of the UMLS Semantic Network. While most ST assignments are correct, some errors do occur. Therefore, Quality Assurance efforts of UMLS curators for ST assignments should concentrate on automatically detected sets of UMLS concepts with higher error rates than random sets. In this paper, we investigate the assignments of top-level semantic types in the UMLS semantic network to concepts, identify potentially erroneous assignments, define four categories of errors, and thus provide assistance to curators of the UMLS in avoiding these assignment errors. Human experts analyzed samples of concepts assigned 10 of the top-level semantic types and categorized the erroneous ST assignments into these four logical categories. Two thirds of the concepts assigned these 10 top-level semantic types are erroneous. Our results demonstrate that reviewing top-level semantic type assignments to concepts provides an effective way for UMLS quality assurance, compared with reviewing a random selection of semantic type assignments.
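The auditing idea above can be sketched with a toy fragment of a semantic-type hierarchy: walk each assigned type up its parent chain, and flag concepts whose type sits at or just below the root. The hierarchy and concept identifiers below are invented, not actual UMLS content.

```python
# Toy semantic-type hierarchy: each type maps to its parent (None = root).
SN_PARENT = {
    "Entity": None,
    "Physical Object": "Entity",
    "Anatomical Structure": "Physical Object",
    "Event": None,
}

def is_top_level(semantic_type, max_depth=1):
    """Flag types at or near the network root; editorial policy says concepts
    should usually carry a more specific (deeper) type instead."""
    depth, node = 0, semantic_type
    while SN_PARENT[node] is not None:
        node = SN_PARENT[node]
        depth += 1
    return depth <= max_depth

# Invented concept-to-type assignments; only the root-level one is suspicious.
concept_st = {"C0000001": "Entity", "C0000002": "Anatomical Structure"}
suspicious = [c for c, st in concept_st.items() if is_top_level(st)]
print(suspicious)
```
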

  10. An Interoperable Security Framework for Connected Healthcare

    NARCIS (Netherlands)

    Asim, M.; Petkovic, M.; Qu, M.; Wang, C.

    2011-01-01

A connected and interoperable healthcare system promises to reduce the cost of healthcare delivery, increase its efficiency, and enable consumers to better engage with clinicians and manage their care. However, at the same time it introduces new risks to the security and privacy of personal health

An annotation schema for legal doctrine: the case study of the DoGi‑Dottrina Giuridica database

    Directory of Open Access Journals (Sweden)

    Tommaso Agnoloni

    2013-01-01

Full Text Available Interoperability today is the key term for the enhancement of databases published on the web: the data, when isolated, have little value; on the contrary, their value increases significantly when different datasets, produced and published independently by different providers, can be reused and freely mashed up by third parties. The use of data for new purposes not foreseen by the organizations and individuals who publish "raw data" is one of the advantages of the linked open data model. To achieve these benefits, content and relationships between the entities described in the dataset should be explicitly represented in standard web formats (XML, RDF, URI). The DoGi-Legal Literature database, one of the most valuable sources for online access to legal doctrine, created and managed by the Institute of Legal Information Theory and Techniques of the CNR, is following this direction. This paper defines the schema for representing the database in RDF format. This will make the DoGi database interoperable between different data and service providers (libraries, publishers, information services for accessing national and European legal information), allowing the creation of new advanced services to be made available on the web of data. In particular, the contribution focuses on the goal of promoting semantic interoperability between the DoGi classification schema and other semantic indexing tools in the legal domain.
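As a toy illustration of the representation step this abstract describes, the snippet below serializes a bibliographic record as RDF triples in N-Triples syntax. All URIs, property names and the record identifier are invented for illustration and are not the actual DoGi schema.

```python
# Minimal sketch: serializing a bibliographic record as RDF triples in
# N-Triples syntax. Every URI and property name below is a hypothetical
# placeholder, not the real DoGi vocabulary.

def to_ntriples(record_uri, properties):
    """Render a dict of predicate -> object pairs as N-Triples lines.
    Objects starting with "http" become resources; others become literals."""
    lines = []
    for pred, obj in properties.items():
        if obj.startswith("http"):
            obj_term = f"<{obj}>"
        else:
            escaped = obj.replace('"', '\\"')
            obj_term = f'"{escaped}"'
        lines.append(f"<{record_uri}> <{pred}> {obj_term} .")
    return "\n".join(lines)

doc = to_ntriples(
    "http://example.org/dogi/record/42",
    {
        "http://purl.org/dc/terms/title": "Schema di annotazione per la dottrina giuridica",
        "http://purl.org/dc/terms/subject": "http://example.org/dogi/class/civil-law",
    },
)
print(doc)
```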

  12. Harmonising phenomics information for a better interoperability in the rare disease field.

    Science.gov (United States)

    Maiella, Sylvie; Olry, Annie; Hanauer, Marc; Lanneau, Valérie; Lourghi, Halima; Donadille, Bruno; Rodwell, Charlotte; Köhler, Sebastian; Seelow, Dominik; Jupp, Simon; Parkinson, Helen; Groza, Tudor; Brudno, Michael; Robinson, Peter N; Rath, Ana

    2018-02-07

    HIPBI-RD (Harmonising phenomics information for a better interoperability in the rare disease field) is a three-year project which started in 2016 funded via the E-Rare 3 ERA-NET program. This project builds on three resources largely adopted by the rare disease (RD) community: Orphanet, its ontology ORDO (the Orphanet Rare Disease Ontology), HPO (the Human Phenotype Ontology) as well as PhenoTips software for the capture and sharing of structured phenotypic data for RD patients. Our project is further supported by resources developed by the European Bioinformatics Institute and the Garvan Institute. HIPBI-RD aims to provide the community with an integrated, RD-specific bioinformatics ecosystem that will harmonise the way phenomics information is stored in databases and patient files worldwide, and thereby contribute to interoperability. This ecosystem will consist of a suite of tools and ontologies, optimized to work together, and made available through commonly used software repositories. The project workplan follows three main objectives: The HIPBI-RD ecosystem will contribute to the interpretation of variants identified through exome and full genome sequencing by harmonising the way phenotypic information is collected, thus improving diagnostics and delineation of RD. The ultimate goal of HIPBI-RD is to provide a resource that will contribute to bridging genome-scale biology and a disease-centered view on human pathobiology. Achievements in Year 1. Copyright © 2018. Published by Elsevier Masson SAS.

  13. Reactive Kripke semantics

    CERN Document Server

    Gabbay, Dov M

    2013-01-01

    This text offers an extension to the traditional Kripke semantics for non-classical logics by adding the notion of reactivity. Reactive Kripke models change their accessibility relation as we progress in the evaluation process of formulas in the model. This feature makes the reactive Kripke semantics strictly stronger and more applicable than the traditional one. Here we investigate the properties and axiomatisations of this new and most effective semantics, and we offer a wide landscape of applications of the idea of reactivity. Applied topics include reactive automata, reactive grammars, rea
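The core idea of reactivity — the accessibility relation changing while a formula is evaluated — can be sketched in a few lines. The "reaction" rule below (traversing one edge switches off others) is only one simple instance of Gabbay's much more general framework; the worlds, edges and formula encoding are invented for the example.

```python
# Toy reactive Kripke model: evaluating a diamond (possibility) modality
# traverses an edge, and traversal can switch off other edges. This is a
# simplified sketch of reactivity, not Gabbay's full construction.

class ReactiveModel:
    def __init__(self, edges, reactions, valuation):
        self.edges = set(edges)        # active accessibility edges (u, v)
        self.reactions = reactions     # traversed edge -> set of edges switched off
        self.val = valuation           # world -> set of true atoms

    def holds(self, world, formula):
        op = formula[0]
        if op == "atom":
            return formula[1] in self.val.get(world, set())
        if op == "not":
            return not self.holds(world, formula[1])
        if op == "diamond":            # "possibly": some active successor satisfies it
            for (u, v) in sorted(self.edges):
                if u != world:
                    continue
                saved = set(self.edges)                      # evaluate branch in its own copy
                self.edges -= self.reactions.get((u, v), set())
                result = self.holds(v, formula[1])
                self.edges = saved                           # undo the reactive step
                if result:
                    return True
            return False
        raise ValueError(op)

# Using w0 -> w1 disables w1 -> w2, so diamond-diamond-p fails reactively
# even though it would hold in the corresponding static model.
m = ReactiveModel(
    edges={("w0", "w1"), ("w1", "w2")},
    reactions={("w0", "w1"): {("w1", "w2")}},
    valuation={"w2": {"p"}},
)
print(m.holds("w0", ("diamond", ("diamond", ("atom", "p")))))
```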

  14. Semantics, contrastive linguistics and parallel corpora

    Directory of Open Access Journals (Sweden)

    Violetta Koseska

    2014-09-01

Full Text Available Semantics, contrastive linguistics and parallel corpora In view of the ambiguity of the term “semantics”, the author shows the differences between traditional lexical semantics and contemporary semantics in the light of various semantic schools. She examines semantics differently in connection with contrastive studies, where the description must necessarily go from the meaning towards the linguistic form, whereas in traditional contrastive studies the description proceeded from the form towards the meaning. This requirement regarding theoretical contrastive studies necessitates construction of a semantic interlanguage, rather than only singling out universal semantic categories expressed with various language means. Such studies can be strongly supported by parallel corpora. However, in order to make them useful for linguists in manual and computer translations, as well as in the development of dictionaries, including online ones, we need not only formal, often automatic, annotation of texts, but also semantic annotation - which is unfortunately manual. In the article we focus on semantic annotation concerning time, aspect and quantification of names and predicates in the whole semantic structure of the sentence, using the example of the “Polish-Bulgarian-Russian parallel corpus”.

  15. An interoperable security framework for connected healthcare

    NARCIS (Netherlands)

    Asim, M.; Petkovic, M.; Qu, M.; Wang, Changjie

    2011-01-01

    Connected and interoperable healthcare system promises to reduce the cost of healthcare delivery, increase its efficiency and enable consumers to better engage with clinicians and manage their care. However at the same time it introduces new risks towards security and privacy of personal health

  16. How to ensure sustainable interoperability in heterogeneous distributed systems through architectural approach.

    Science.gov (United States)

    Pape-Haugaard, Louise; Frank, Lars

    2011-01-01

A major obstacle to ensuring ubiquitous information is the use of heterogeneous systems in eHealth. The objective of this paper is to illustrate how an architecture for distributed eHealth databases can be designed without losing the characteristic features of traditional sustainable databases. The approach is firstly to explain traditional architecture in central and homogeneous distributed database computing, followed by a possible approach to using an architectural framework to obtain sustainability across disparate systems, i.e. heterogeneous databases, and concluded with a discussion. It is seen that, through a method of using relaxed ACID properties on a service-oriented architecture, it is possible to achieve the data consistency which is essential when ensuring sustainable interoperability.
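The relaxed-ACID approach this abstract refers to can be illustrated with a compensation (saga-style) sketch: a global update across heterogeneous services is split into local steps, each paired with an undo action that restores consistency if a later step fails. The service names and state fields below are hypothetical.

```python
# Sketch of relaxed ACID via compensation: each local step has an undo
# action, and a failure triggers compensation in reverse order instead of
# a global rollback. The eHealth step names here are invented.

def run_saga(steps, state):
    """steps: list of (do, undo) callables acting on a shared state dict.
    Returns True on success; on failure, compensates completed steps."""
    done = []
    for do, undo in steps:
        try:
            do(state)
            done.append(undo)
        except Exception:
            for undo_prev in reversed(done):   # compensate in reverse order
                undo_prev(state)
            return False
    return True

def notify_lab(state):
    raise RuntimeError("lab system down")      # simulated outage in one service

state = {"ehr_updated": False, "lab_notified": False}
steps = [
    (lambda s: s.update(ehr_updated=True),
     lambda s: s.update(ehr_updated=False)),
    (notify_lab,
     lambda s: s.update(lab_notified=False)),
]

ok = run_saga(steps, state)
print(ok, state)   # the failed saga leaves the state back at its initial values
```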

  17. Semantic web for dummies

    CERN Document Server

    Pollock, Jeffrey T

    2009-01-01

    Semantic Web technology is already changing how we interact with data on the Web. By connecting random information on the Internet in new ways, Web 3.0, as it is sometimes called, represents an exciting online evolution. Whether you're a consumer doing research online, a business owner who wants to offer your customers the most useful Web site, or an IT manager eager to understand Semantic Web solutions, Semantic Web For Dummies is the place to start! It will help you:Know how the typical Internet user will recognize the effects of the Semantic WebExplore all the benefits the data Web offers t

  18. The contribution of executive control to semantic cognition: Convergent evidence from semantic aphasia and executive dysfunction.

    Science.gov (United States)

    Thompson, Hannah E; Almaghyuli, Azizah; Noonan, Krist A; Barak, Ohr; Lambon Ralph, Matthew A; Jefferies, Elizabeth

    2018-01-03

Semantic cognition, as described by the controlled semantic cognition (CSC) framework (Rogers et al., Neuropsychologia, 76, 220), involves two key components: activation of coherent, generalizable concepts within a heteromodal 'hub' in combination with modality-specific features (spokes), and a constraining mechanism that manipulates and gates this knowledge to generate time- and task-appropriate behaviour. Executive-semantic goal representations, largely supported by executive regions such as frontal and parietal cortex, are thought to allow the generation of non-dominant aspects of knowledge when these are appropriate for the task or context. Semantic aphasia (SA) patients have executive-semantic deficits, and these are correlated with general executive impairment. If the CSC proposal is correct, patients with executive impairment should not only exhibit impaired semantic cognition, but should also show characteristics that align with those observed in SA. This possibility remains largely untested, as patients selected on the basis that they show executive impairment (i.e., with 'dysexecutive syndrome') have not been extensively tested on tasks tapping semantic control and have not been previously compared with SA cases. We explored conceptual processing in 12 patients showing symptoms consistent with dysexecutive syndrome (DYS) and 24 SA patients, using a range of multimodal semantic assessments which manipulated control demands. Patients with executive impairments, despite not being selected to show semantic impairments, nevertheless showed parallel patterns to SA cases. They showed strong effects of distractor strength, cues and miscues, and probe-target distance, plus minimal effects of word frequency on comprehension (unlike semantic dementia patients with degradation of conceptual knowledge).
This supports a component process account of semantic cognition in which retrieval is shaped by control processes, and confirms that deficits in SA patients reflect

  19. A Conceptual Framework to Enhance the Interoperability of Observatories among Countries, Continents and the World

    Science.gov (United States)

    Loescher, H.; Fundamental Instrument Unit

    2013-05-01

, GEO-BON, NutNet, etc.) and domestically (e.g., NSF-CZO, USDA-LTAR, DOE-NGEE, Soil Carbon Network, etc.), there is a strong and mutual desire to assure interoperability of data. Interoperability is the degree to which each of the following is mapped between observatories (entities), defined by linking i) science requirements with science questions, ii) traceability of measurements to nationally and internationally accepted standards, iii) how data products are derived, i.e., algorithms, procedures, and methods, and iv) the bioinformatics, which broadly include data formats, metadata, controlled vocabularies, and semantics. Here, we explore the rationale and focus areas for interoperability, the governance and work structures, example projects (NSF-NEON, EU-ICOS, and AU-TERN), and the emergent roles of scientists in these endeavors.

  20. Inter-operator Variability in Defining Uterine Position Using Three-dimensional Ultrasound Imaging

    DEFF Research Database (Denmark)

    Baker, Mariwan; Jensen, Jørgen Arendt; Behrens, Claus F.

    2013-01-01

    significantly larger inter-fractional uterine positional displacement, in some cases up to 20 mm, which outweighs the magnitude of current inter-operator variations. Thus, the current US-phantom-study suggests that the inter-operator variability in addressing uterine position is clinically irrelevant.......In radiotherapy the treatment outcome of gynecological (GYN) cancer patients is crucially related to reproducibility of the actual uterine position. The purpose of this study is to evaluate the inter-operator variability in addressing uterine position using a novel 3-D ultrasound (US) system....... The study is initiated by US-scanning of a uterine phantom (CIRS 404, Universal Medical, Norwood, USA) by seven experienced US operators. The phantom represents a female pelvic region, containing a uterus, bladder and rectal landmarks readily definable in the acquired US-scans. The organs are subjected...

  1. A unified architecture for biomedical search engines based on semantic web technologies.

    Science.gov (United States)

    Jalali, Vahid; Matash Borujerdi, Mohammad Reza

    2011-04-01

There has been huge growth in the volume of published biomedical research in recent years. Many medical search engines have been designed and developed to address the ever-growing information needs of biomedical experts and curators. Significant progress has been made in utilizing the knowledge embedded in medical ontologies and controlled vocabularies to assist these engines. However, the lack of a common architecture for the utilized ontologies and the overall retrieval process hampers evaluating different search engines, and interoperability between them, under unified conditions. In this paper, a unified architecture for medical search engines is introduced. The proposed model contains standard schemas, declared in semantic web languages, for the ontologies and documents used by search engines. Unified models for the annotation and retrieval processes are other parts of the introduced architecture. A sample search engine is also designed and implemented based on the proposed architecture. The search engine is evaluated using two test collections, and results are reported in terms of precision vs. recall and mean average precision for the different approaches used by this search engine.

  2. Structured Semantic Knowledge Can Emerge Automatically from Predicting Word Sequences in Child-Directed Speech

    Science.gov (United States)

    Huebner, Philip A.; Willits, Jon A.

    2018-01-01

    Previous research has suggested that distributional learning mechanisms may contribute to the acquisition of semantic knowledge. However, distributional learning mechanisms, statistical learning, and contemporary “deep learning” approaches have been criticized for being incapable of learning the kind of abstract and structured knowledge that many think is required for acquisition of semantic knowledge. In this paper, we show that recurrent neural networks, trained on noisy naturalistic speech to children, do in fact learn what appears to be abstract and structured knowledge. We trained two types of recurrent neural networks (Simple Recurrent Network, and Long Short-Term Memory) to predict word sequences in a 5-million-word corpus of speech directed to children ages 0–3 years old, and assessed what semantic knowledge they acquired. We found that learned internal representations are encoding various abstract grammatical and semantic features that are useful for predicting word sequences. Assessing the organization of semantic knowledge in terms of the similarity structure, we found evidence of emergent categorical and hierarchical structure in both models. We found that the Long Short-term Memory (LSTM) and SRN are both learning very similar kinds of representations, but the LSTM achieved higher levels of performance on a quantitative evaluation. We also trained a non-recurrent neural network, Skip-gram, on the same input to compare our results to the state-of-the-art in machine learning. We found that Skip-gram achieves relatively similar performance to the LSTM, but is representing words more in terms of thematic compared to taxonomic relations, and we provide reasons why this might be the case. Our findings show that a learning system that derives abstract, distributed representations for the purpose of predicting sequential dependencies in naturalistic language may provide insight into emergence of many properties of the developing semantic system. PMID
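The papers' central claim — that predicting neighboring words yields semantically organized representations — can be illustrated, far more crudely, with plain co-occurrence counts over a toy corpus. This stdlib sketch does not reproduce the SRN/LSTM training the study actually performs; it only shows the distributional principle that words used in similar contexts acquire similar vectors.

```python
from collections import Counter
from math import sqrt

# Crude stdlib illustration of the distributional principle: build a
# context-count vector per word and compare vectors by cosine similarity.
# The corpus is a toy; the cited study trains recurrent networks instead.

corpus = ("the cat drinks milk . the dog drinks water . "
          "the cat chases the dog . the truck carries cargo .").split()

def context_vectors(tokens, window=2):
    """Map each word to a Counter of the words around it."""
    vecs = {}
    for i, w in enumerate(tokens):
        ctx = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
        vecs.setdefault(w, Counter()).update(ctx)
    return vecs

def cosine(a, b):
    dot = sum(a[k] * b[k] for k in a if k in b)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

v = context_vectors(corpus)
# "cat" and "dog" share contexts (drinking, chasing), "truck" does not.
print(cosine(v["cat"], v["dog"]) > cosine(v["cat"], v["truck"]))
```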

  3. Structured Semantic Knowledge Can Emerge Automatically from Predicting Word Sequences in Child-Directed Speech

    Directory of Open Access Journals (Sweden)

    Philip A. Huebner

    2018-02-01

Full Text Available Previous research has suggested that distributional learning mechanisms may contribute to the acquisition of semantic knowledge. However, distributional learning mechanisms, statistical learning, and contemporary “deep learning” approaches have been criticized for being incapable of learning the kind of abstract and structured knowledge that many think is required for acquisition of semantic knowledge. In this paper, we show that recurrent neural networks, trained on noisy naturalistic speech to children, do in fact learn what appears to be abstract and structured knowledge. We trained two types of recurrent neural networks (Simple Recurrent Network, and Long Short-Term Memory) to predict word sequences in a 5-million-word corpus of speech directed to children ages 0–3 years old, and assessed what semantic knowledge they acquired. We found that learned internal representations are encoding various abstract grammatical and semantic features that are useful for predicting word sequences. Assessing the organization of semantic knowledge in terms of the similarity structure, we found evidence of emergent categorical and hierarchical structure in both models. We found that the Long Short-term Memory (LSTM) and SRN are both learning very similar kinds of representations, but the LSTM achieved higher levels of performance on a quantitative evaluation. We also trained a non-recurrent neural network, Skip-gram, on the same input to compare our results to the state-of-the-art in machine learning. We found that Skip-gram achieves relatively similar performance to the LSTM, but is representing words more in terms of thematic compared to taxonomic relations, and we provide reasons why this might be the case. Our findings show that a learning system that derives abstract, distributed representations for the purpose of predicting sequential dependencies in naturalistic language may provide insight into emergence of many properties of the developing

  4. Towards Compatible and Interderivable Semantic Specifications for the Scheme Programming Language, Part I: Denotational Semantics, Natural Semantics, and Abstract Machines

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2008-01-01

    We derive two big-step abstract machines, a natural semantics, and the valuation function of a denotational semantics based on the small-step abstract machine for Core Scheme presented by Clinger at PLDI'98. Starting from a functional implementation of this small-step abstract machine, (1) we fuse...... its transition function with its driver loop, obtaining the functional implementation of a big-step abstract machine; (2) we adjust this big-step abstract machine so that it is in defunctionalized form, obtaining the functional implementation of a second big-step abstract machine; (3) we...... refunctionalize this adjusted abstract machine, obtaining the functional implementation of a natural semantics in continuation style; and (4) we closure-unconvert this natural semantics, obtaining a compositional continuation-passing evaluation function which we identify as the functional implementation...

  5. Temporal Representation in Semantic Graphs

    Energy Technology Data Exchange (ETDEWEB)

    Levandoski, J J; Abdulla, G M

    2007-08-07

    A wide range of knowledge discovery and analysis applications, ranging from business to biological, make use of semantic graphs when modeling relationships and concepts. Most of the semantic graphs used in these applications are assumed to be static pieces of information, meaning temporal evolution of concepts and relationships are not taken into account. Guided by the need for more advanced semantic graph queries involving temporal concepts, this paper surveys the existing work involving temporal representations in semantic graphs.
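A minimal way to make temporal evolution explicit in a semantic graph, as this survey motivates, is to stamp each edge with a validity interval and filter queries by time. The entities and relation below are illustrative only.

```python
# Sketch: semantic-graph edges carry [start, end) validity intervals so a
# query can ask which relationships held at a given time. Entity and
# predicate names are invented for the example.

class TemporalGraph:
    def __init__(self):
        self.edges = []   # (subject, predicate, object, start, end)

    def add(self, s, p, o, start, end):
        """Record that (s, p, o) holds on the half-open interval [start, end)."""
        self.edges.append((s, p, o, start, end))

    def query(self, p, at):
        """Return (subject, object) pairs for predicate p valid at time `at`."""
        return [(s, o) for (s, pred, o, t0, t1) in self.edges
                if pred == p and t0 <= at < t1]

g = TemporalGraph()
g.add("alice", "works_for", "acme", 2001, 2005)
g.add("alice", "works_for", "globex", 2005, 2010)
print(g.query("works_for", 2003))   # [('alice', 'acme')]
print(g.query("works_for", 2007))   # [('alice', 'globex')]
```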

  6. Flow Logics and Operational Semantics

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nielson, Hanne Riis

    1998-01-01

Flow logic is a “fast prototyping” approach to program analysis that shows great promise of being able to deal with a wide variety of languages and calculi for computation. However, seemingly innocent choices in the flow logic as well as in the operational semantics may inhibit proving the analysis...... correct. Our main conclusion is that environment based semantics is more flexible than either substitution based semantics or semantics making use of structural congruences (like alpha-renaming)....

  7. Semantic Versus Syntactic Cutting Planes

    OpenAIRE

    Filmus, Yuval; Hrubeš, Pavel; Lauria, Massimo

    2016-01-01

    In this paper, we compare the strength of the semantic and syntactic version of the cutting planes proof system. First, we show that the lower bound technique of Pudlák applies also to semantic cutting planes: the proof system has feasible interpolation via monotone real circuits, which gives an exponential lower bound on lengths of semantic cutting planes refutations. Second, we show that semantic refutations are stronger than syntactic ones. In particular, we give a formula for whic...

  8. Semantic word category processing in semantic dementia and posterior cortical atrophy.

    Science.gov (United States)

    Shebani, Zubaida; Patterson, Karalyn; Nestor, Peter J; Diaz-de-Grenu, Lara Z; Dawson, Kate; Pulvermüller, Friedemann

    2017-08-01

There is general agreement that perisylvian language cortex plays a major role in lexical and semantic processing; but the contribution of additional, more widespread, brain areas in the processing of different semantic word categories remains controversial. We investigated word processing in two groups of patients whose neurodegenerative diseases preferentially affect specific parts of the brain, to determine whether their performance would vary as a function of semantic categories proposed to recruit those brain regions. Cohorts with (i) Semantic Dementia (SD), who have anterior temporal-lobe atrophy, and (ii) Posterior Cortical Atrophy (PCA), who have predominantly parieto-occipital atrophy, performed a lexical decision test on words from five different lexico-semantic categories: colour (e.g., yellow), form (oval), number (seven), spatial prepositions (under) and function words (also). Sets of pseudo-word foils matched the target words in length and bi-/tri-gram frequency. Word frequency was matched between the two visual word categories (colour and form) and across the three other categories (number, prepositions, and function words). Age-matched healthy individuals served as controls. Although broad word processing deficits were apparent in both patient groups, the deficit was strongest for colour words in SD and for spatial prepositions in PCA. The patterns of performance on the lexical decision task demonstrate (a) general lexico-semantic processing deficits in both groups, though more prominent in SD than in PCA, and (b) differential involvement of anterior-temporal and posterior-parietal cortex in the processing of specific semantic categories of words. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. Simplifying the Reuse and Interoperability of Geoscience Data Sets and Models with Semantic Metadata that is Human-Readable and Machine-actionable

    Science.gov (United States)

    Peckham, S. D.

    2017-12-01

Standardized, deep descriptions of digital resources (e.g. data sets, computational models, software tools and publications) make it possible to develop user-friendly software systems that assist scientists with the discovery and appropriate use of these resources. Semantic metadata makes it possible for machines to take actions on behalf of humans, such as automatically identifying the resources needed to solve a given problem, retrieving them and then automatically connecting them (despite their heterogeneity) into a functioning workflow. Standardized model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), and model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. A carefully constructed, unambiguous and rules-based schema to address this problem, called the Geoscience Standard Names ontology, will be presented that utilizes Semantic Web best practices and technologies. It has also been designed to work across science domains and to be readable by both humans and machines.

  10. UltiMatch-NL: a Web service matchmaker based on multiple semantic filters.

    Science.gov (United States)

    Mohebbi, Keyvan; Ibrahim, Suhaimi; Zamani, Mazdak; Khezrian, Mojtaba

    2014-01-01

In this paper, a Semantic Web service matchmaker called UltiMatch-NL is presented. UltiMatch-NL applies two filters, namely Signature-based and Description-based, on different abstraction levels of a service profile to achieve more accurate results. More specifically, the proposed filters rely on semantic knowledge to extract the similarity between a given pair of service descriptions. Thus it is a further step towards fully automated Web service discovery via making this process more semantic-aware. In addition, a new technique is proposed to automatically weight and combine the results of the different filters of UltiMatch-NL. Moreover, an innovative approach is introduced to predict the relevance of requests and Web services and eliminate the need for setting a threshold value of similarity. In order to evaluate UltiMatch-NL, the repository of OWLS-TC is used. The performance evaluation, based on standard measures from the information retrieval field, shows that semantic matching of OWL-S services can be significantly improved by incorporating the designed matching filters.
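The filter-combination step described here can be sketched as a weighted sum of per-filter similarity scores. The two scoring functions and the weights below are toy stand-ins, not UltiMatch-NL's actual Signature-based and Description-based filters.

```python
# Sketch of the general pattern: several semantic filters each score a
# (request, service) pair, and the scores are merged with weights. Both
# scoring functions and the weights are invented placeholders.

def signature_score(request, service):
    """Toy stand-in for a signature filter: Jaccard overlap of declared types."""
    shared = set(request["types"]) & set(service["types"])
    union = set(request["types"]) | set(service["types"])
    return len(shared) / len(union) if union else 0.0

def description_score(request, service):
    """Toy stand-in for a description filter: word overlap of the texts."""
    a, b = set(request["text"].split()), set(service["text"].split())
    return len(a & b) / len(a | b) if a | b else 0.0

def combined_score(request, service, weights=(0.6, 0.4)):
    """Weighted combination of the two filter scores."""
    w_sig, w_desc = weights
    return (w_sig * signature_score(request, service)
            + w_desc * description_score(request, service))

req = {"types": ["Book", "Price"], "text": "find price of a book"}
svc = {"types": ["Book", "Price", "Author"], "text": "return the price of a book"}
print(round(combined_score(req, svc), 3))
```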

  11. UltiMatch-NL: A Web Service Matchmaker Based on Multiple Semantic Filters

    Science.gov (United States)

    Mohebbi, Keyvan; Ibrahim, Suhaimi; Zamani, Mazdak; Khezrian, Mojtaba

    2014-01-01

In this paper, a Semantic Web service matchmaker called UltiMatch-NL is presented. UltiMatch-NL applies two filters, namely Signature-based and Description-based, on different abstraction levels of a service profile to achieve more accurate results. More specifically, the proposed filters rely on semantic knowledge to extract the similarity between a given pair of service descriptions. Thus it is a further step towards fully automated Web service discovery via making this process more semantic-aware. In addition, a new technique is proposed to automatically weight and combine the results of the different filters of UltiMatch-NL. Moreover, an innovative approach is introduced to predict the relevance of requests and Web services and eliminate the need for setting a threshold value of similarity. In order to evaluate UltiMatch-NL, the repository of OWLS-TC is used. The performance evaluation, based on standard measures from the information retrieval field, shows that semantic matching of OWL-S services can be significantly improved by incorporating the designed matching filters. PMID:25157872

  12. Semantic Drift in Espresso-style Bootstrapping: Graph-theoretic Analysis and Evaluation in Word Sense Disambiguation

    Science.gov (United States)

    Komachi, Mamoru; Kudo, Taku; Shimbo, Masashi; Matsumoto, Yuji

Bootstrapping has a tendency, called semantic drift, to select instances unrelated to the seed instances as the iteration proceeds. We demonstrate that the semantic drift of Espresso-style bootstrapping has the same root as the topic drift of Kleinberg's HITS, using a simplified graph-based reformulation of bootstrapping. We confirm that two graph-based algorithms, the von Neumann kernels and the regularized Laplacian, can reduce the effect of semantic drift in the task of word sense disambiguation (WSD) on the Senseval-3 English Lexical Sample Task. The proposed algorithms achieve superior performance to Espresso and previous graph-based WSD methods, even though they have fewer parameters and are easy to calibrate.
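The regularized Laplacian kernel mentioned here, R_β = (I + βL)^(-1), can be approximated by the truncated Neumann series Σ βⁿ(−L)ⁿ, which converges when β times the largest Laplacian eigenvalue is below 1. The three-node path graph below is a toy and the sketch is not the authors' implementation.

```python
# Sketch: regularized Laplacian kernel R_beta = (I + beta*L)^(-1),
# approximated by truncating the series sum_n beta^n (-L)^n.
# Pure-stdlib matrices; the example graph is a 3-node path 0 - 1 - 2.

def matmul(A, B):
    n, m, k = len(A), len(B[0]), len(B)
    return [[sum(A[i][t] * B[t][j] for t in range(k)) for j in range(m)]
            for i in range(n)]

def regularized_laplacian(adj, beta, terms=50):
    n = len(adj)
    deg = [sum(row) for row in adj]
    # combinatorial Laplacian L = D - A
    L = [[(deg[i] if i == j else 0) - adj[i][j] for j in range(n)]
         for i in range(n)]
    negL = [[-x for x in row] for row in L]
    R = [[float(i == j) for j in range(n)] for i in range(n)]  # identity = term n=0
    power = [row[:] for row in R]
    for _ in range(terms):
        power = matmul(power, negL)                      # next power of (-L)
        power = [[beta * x for x in row] for row in power]
        R = [[R[i][j] + power[i][j] for j in range(n)] for i in range(n)]
    return R

adj = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
R = regularized_laplacian(adj, beta=0.1)
# nodes closer in the graph get larger kernel values
print(R[0][1] > R[0][2] > 0)
```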

  13. Towards multi-layer interoperability of heterogeneous IoT platforms : the INTER-IoT approach

    NARCIS (Netherlands)

    Fortino, Giancarlo; Savaglio, Claudio; Palau, Carlos E.; de Puga, Jara Suarez; Ghanza, Maria; Paprzycki, Marcin; Montesinos, Miguel; Liotta, Antonio; Llop, Miguel; Gravina, R.; Palau, C.E.; Manso, M.; Liotta, A.; Fortino, G.

    2018-01-01

    Open interoperability delivers on the promise of enabling vendors and developers to interact and interoperate, without interfering with anyone’s ability to compete by delivering a superior product and experience. In the absence of global IoT standards, the INTER-IoT voluntary approach will support

  14. CCSDS SM and C Mission Operations Interoperability Prototype

    Science.gov (United States)

    Lucord, Steven A.

    2010-01-01

This slide presentation reviews the prototype of Spacecraft Monitor and Control (SM&C) Operations for interoperability among space agencies. This particular prototype involves the German Space Agency (DLR) in testing the ideas for interagency coordination.

  15. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Science.gov (United States)

    2010-10-01

... 47 Telecommunication 1 2010-10-01 Emergency Response Interoperability Center. § 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION..., industry representatives, and service providers. [75 FR 28207, May 20, 2010] ...

  16. CINERGI: Community Inventory of EarthCube Resources for Geoscience Interoperability

    Science.gov (United States)

    Zaslavsky, Ilya; Bermudez, Luis; Grethe, Jeffrey; Gupta, Amarnath; Hsu, Leslie; Lehnert, Kerstin; Malik, Tanu; Richard, Stephen; Valentine, David; Whitenack, Thomas

    2014-05-01

    Organizing geoscience data resources to support cross-disciplinary data discovery, interpretation, analysis and integration is challenging because of different information models, semantic frameworks, metadata profiles, catalogs, and services used in different geoscience domains, not to mention different research paradigms and methodologies. The central goal of CINERGI, a new project supported by the US National Science Foundation through its EarthCube Building Blocks program, is to create a methodology and assemble a large inventory of high-quality information resources capable of supporting data discovery needs of researchers in a wide range of geoscience domains. The key characteristics of the inventory are: 1) collaboration with and integration of metadata resources from a number of large data facilities; 2) reliance on international metadata and catalog service standards; 3) assessment of resource "interoperability-readiness"; 4) ability to cross-link and navigate data resources, projects, models, researcher directories, publications, usage information, etc.; 5) efficient inclusion of "long-tail" data, which do not appear in existing domain repositories; 6) data registration at feature level where appropriate, in addition to common dataset-level registration; and 7) integration with parallel EarthCube efforts, in particular those focused on EarthCube governance, information brokering, service-oriented architecture design and management of semantic information. We discuss challenges associated with accomplishing CINERGI goals, including defining the inventory scope; managing different granularity levels of resource registration; interaction with search systems of domain repositories; explicating domain semantics; metadata brokering, harvesting and pruning; managing provenance of the harvested metadata; and cross-linking resources based on linked open data (LOD) approaches. At the higher level of the inventory, we register domain-wide resources such as domain

  17. SemantEco: a semantically powered modular architecture for integrating distributed environmental and ecological data

    Science.gov (United States)

    Patton, Evan W.; Seyed, Patrice; Wang, Ping; Fu, Linyun; Dein, F. Joshua; Bristol, R. Sky; McGuinness, Deborah L.

    2014-01-01

    We aim to inform the development of decision support tools for resource managers who need to examine large complex ecosystems and make recommendations in the face of many tradeoffs and conflicting drivers. We take a semantic technology approach, leveraging background ontologies and the growing body of linked open data. In previous work, we designed and implemented a semantically enabled environmental monitoring framework called SemantEco and used it to build a water quality portal named SemantAqua. Our previous system included foundational ontologies to support environmental regulation violations and relevant human health effects. In this work, we discuss SemantEco’s new architecture that supports modular extensions and makes it easier to support additional domains. Our enhanced framework includes foundational ontologies to support modeling of wildlife observation and wildlife health impacts, thereby enabling deeper and broader support for more holistically examining the effects of environmental pollution on ecosystems. We conclude with a discussion of how, through the application of semantic technologies, modular designs will make it easier for resource managers to bring in new sources of data to support more complex use cases.
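
    The modular, linked-data design described above can be illustrated with a toy sketch (all names and triples below are invented, not the SemantEco API): domain modules contribute RDF-style triples, and the framework merges them so a single query can span domains.

```python
# Hypothetical sketch of a modular semantic framework: each domain module
# contributes RDF-style triples; the core merges them so queries span domains.

def water_quality_module():
    # Triples a water-quality module might contribute (illustrative data).
    return [
        ("site:42", "rdf:type", "eco:MonitoringSite"),
        ("site:42", "eco:measured", "pollutant:Hg"),
    ]

def wildlife_module():
    # A wildlife-health module linking the same site to an observation.
    return [
        ("obs:7", "rdf:type", "eco:WildlifeObservation"),
        ("obs:7", "eco:observedAt", "site:42"),
    ]

def merge_modules(*modules):
    """Union the triples contributed by each registered module."""
    graph = set()
    for module in modules:
        graph.update(module())
    return graph

def query(graph, s=None, p=None, o=None):
    """Match triples against a pattern; None acts as a wildcard."""
    return [t for t in graph
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

graph = merge_modules(water_quality_module, wildlife_module)
# Cross-domain question: what is known about site:42?
hits = query(graph, s="site:42") + query(graph, o="site:42")
```

    Because modules only agree on shared identifiers (here, `site:42`), adding a new domain means registering one more triple-producing module rather than changing the core.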

  18. The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications

    Directory of Open Access Journals (Sweden)

    Katayama Toshiaki

    2011-08-01

    Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in (i) a workflow to annotate 100,000 sequences from an invertebrate species; (ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; (iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; (iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: (i) the absence of several useful data or analysis functions in the Web service "space"; (ii) the lack of documentation of methods; (iii) lack of

  19. The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications

    Science.gov (United States)

    2011-01-01

    Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in i) a workflow to annotate 100,000 sequences from an invertebrate species; ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: i) the absence of several useful data or analysis functions in the Web service "space"; ii) the lack of documentation of methods; iii) lack of compliance with the SOAP

  20. Semantic Web Primer

    NARCIS (Netherlands)

    Antoniou, Grigoris; Harmelen, Frank van

    2004-01-01

    The development of the Semantic Web, with machine-readable content, has the potential to revolutionize the World Wide Web and its use. A Semantic Web Primer provides an introduction and guide to this still emerging field, describing its key ideas, languages, and technologies. Suitable for use as a

  1. Wernicke's Aphasia Reflects a Combination of Acoustic-Phonological and Semantic Control Deficits: A Case-Series Comparison of Wernicke's Aphasia, Semantic Dementia and Semantic Aphasia

    Science.gov (United States)

    Robson, Holly; Sage, Karen; Lambon Ralph, Matthew A.

    2012-01-01

    Wernicke's aphasia (WA) is the classical neurological model of comprehension impairment and, as a result, the posterior temporal lobe is assumed to be critical to semantic cognition. This conclusion is potentially confused by (a) the existence of patient groups with semantic impairment following damage to other brain regions (semantic dementia and…

  2. Modeling Interoperable Information Systems with 3LGM² and IHE.

    Science.gov (United States)

    Stäubert, S; Schaaf, M; Jahn, F; Brandner, R; Winter, A

    2015-01-01

    Strategic planning of information systems (IS) in healthcare requires descriptions of the current and the future IS state. Enterprise architecture planning (EAP) tools like the 3LGM² tool help to build up and to analyze IS models. A model of the planned architecture can be derived from an analysis of current state IS models. Building an interoperable IS, i.e. an IS consisting of interoperable components, can be considered a relevant strategic information management goal for many IS in healthcare. Integrating the Healthcare Enterprise (IHE) is an initiative which targets interoperability by using established standards. Our objectives are to link IHE concepts to 3LGM² concepts within the 3LGM² tool; to describe how an information manager can be supported in handling the complex IHE world and planning interoperable IS using 3LGM² models; and to describe how developers or maintainers of IHE profiles can be supported by the representation of IHE concepts in 3LGM². Conceptualization and concept mapping methods are used to assign IHE concepts such as domains, integration profiles, actors and transactions to the concepts of the three-layer graph-based meta-model (3LGM²). IHE concepts were successfully linked to 3LGM² concepts. An IHE-master-model, i.e. an abstract model for IHE concepts, was modeled with the help of the 3LGM² tool. Two IHE domains were modeled in detail (ITI, QRPH). We describe two use cases for the representation of IHE concepts and IHE domains as 3LGM² models. Information managers can use the IHE-master-model as a reference model for modeling interoperable IS based on IHE profiles during EAP activities. IHE developers are supported in analyzing the consistency of IHE concepts with the help of the IHE-master-model and functions of the 3LGM² tool. The complex relations between IHE concepts can be modeled by using the EAP method 3LGM². The 3LGM² tool offers visualization and analysis features which are now available for the IHE-master-model. Thus information managers and IHE

  3. COTARD SYNDROME IN SEMANTIC DEMENTIA

    Science.gov (United States)

    Mendez, Mario F.; Ramírez-Bermúdez, Jesús

    2011-01-01

    Background Semantic dementia is a neurodegenerative disorder characterized by the loss of meaning of words or concepts. Semantic dementia can offer potential insights into the mechanisms of content-specific delusions. Objective The authors present a rare case of semantic dementia with Cotard syndrome, a delusion characterized by nihilism or self-negation. Method The semantic deficits and other features of semantic dementia were evaluated in relation to the patient's Cotard syndrome. Results Mrs. A developed the delusional belief that she was wasting and dying. This occurred after she lost knowledge of her somatic discomforts and sensations and of the organs that were the source of these sensations. Her nihilistic beliefs appeared to emerge from her misunderstanding of her somatic sensations. Conclusion This unique patient suggests that a mechanism for Cotard syndrome is difficulty interpreting the nature and source of internal pains and sensations. We propose that loss of semantic knowledge about one's own body may lead to the delusion of nihilism or death. PMID:22054629

  4. Hierarchical Semantic Model of Geovideo

    Directory of Open Access Journals (Sweden)

    XIE Xiao

    2015-05-01

    Public security incidents are increasingly challenging with regard to their new features, including multi-scale mobility, multistage dynamic evolution, and spatiotemporal concurrency and uncertainty in the complex urban environment. However, existing video models, which were designed for independent archiving or local analysis of surveillance video, seriously inhibit emergency response to these urgent requirements. Aiming at an explicit representation of the change mechanism in video, this paper proposes a novel hierarchical geovideo semantic model using UML. The model is characterized by a hierarchical representation of both data structure and semantics based on three change-oriented domains (feature domain, process domain and event domain) instead of an overall semantic description of the video stream; it combines both geographical semantics and video content semantics in support of global semantic association between multiple geovideo data. Public security incidents captured by video surveillance are examined as an example to illustrate the validity of this model.

  5. Mapping the Structure of Semantic Memory

    Science.gov (United States)

    Morais, Ana Sofia; Olsson, Henrik; Schooler, Lael J.

    2013-01-01

    Aggregating snippets from the semantic memories of many individuals may not yield a good map of an individual's semantic memory. The authors analyze the structure of semantic networks that they sampled from individuals through a new snowball sampling paradigm during approximately 6 weeks of 1-hr daily sessions. The semantic networks of individuals…

  6. Causality in the semantics of Esterel : revisited

    NARCIS (Netherlands)

    Mousavi, M.R.; Klin, B.; Sobocinski, P.

    2010-01-01

    We re-examine the challenges concerning causality in the semantics of Esterel and show that they pertain to the known issues in the semantics of Structured Operational Semantics with negative premises. We show that the solutions offered for the semantics of SOS also provide answers to the semantic

  7. Semantic Representatives of the Concept

    Directory of Open Access Journals (Sweden)

    Elena N. Tsay

    2013-01-01

    In this article, concept, one of the principal notions of cognitive linguistics, is investigated. Considering the concept as a cultural phenomenon with language realization and ethnocultural peculiarities, a description of the concept "happiness" is presented. The lexical and semantic paradigm of the concept of happiness correlates with a great number of lexical and semantic variants. The work reveals semantic representatives of the concept of happiness covering supreme spiritual values, and gives a semantic interpretation of their functioning in Biblical discourse.

  8. System semantics of explanatory dictionaries

    Directory of Open Access Journals (Sweden)

    Volodymyr Shyrokov

    2015-11-01

    Some semantic properties of language that follow from the structure of the lexicographical systems of big explanatory dictionaries are considered. Hyperchains and hypercycles are defined as a certain kind of automorphism of the lexicographical system of an explanatory dictionary. Some semantic consequences following from the principles of lexicographic closure and lexicographic completeness are investigated using the hyperchain and hypercycle formalism. The connection between the hypercycle properties of lexicographical system semantics and Gödel's incompleteness theorem is discussed.
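
    One concrete way to picture a hypercycle is as a cycle in the graph that links each headword to a word used in its definition. A toy sketch under that reading (invented data; not the paper's formalism):

```python
# Toy model: each headword maps to one word from its definition.
# A "hypercycle" then shows up as a cycle in this definition graph.

def find_cycle(defs, start):
    """Follow definition links from `start`; return the cycle if one closes,
    or None if the chain leaves the dictionary (lexicographic incompleteness)."""
    path, seen = [start], {start}
    word = start
    while True:
        nxt = defs.get(word)
        if nxt is None:
            return None          # chain exits the dictionary
        if nxt in seen:
            return path[path.index(nxt):] + [nxt]  # closed hypercycle
        path.append(nxt)
        seen.add(nxt)
        word = nxt

# "big" is defined via "large", which is defined via "big": a 2-cycle.
toy_dictionary = {"big": "large", "large": "big", "cat": "animal"}
cycle = find_cycle(toy_dictionary, "big")
```

    Lexicographic closure would require every definition word to itself be a headword; in a finite closed dictionary some chains must therefore cycle, which is the intuition the formalism develops.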

  9. Legacy2Drupal - Conversion of an existing oceanographic relational database to a semantically enabled Drupal content management system

    Science.gov (United States)

    Maffei, A. R.; Chandler, C. L.; Work, T.; Allen, J.; Groman, R. C.; Fox, P. A.

    2009-12-01

    Content Management Systems (CMSs) provide powerful features that can be of use to oceanographic (and other geo-science) data managers. However, in many instances, geo-science data management offices have previously designed customized schemas for their metadata. The WHOI Ocean Informatics initiative and the NSF-funded Biological and Chemical Oceanography Data Management Office (BCO-DMO) have jointly sponsored a project to port an existing relational database containing oceanographic metadata, along with an existing interface coded in Cold Fusion middleware, to a Drupal6 Content Management System. The goal was to translate all the existing database tables, input forms, website reports, and other features present in the existing system to employ Drupal CMS features. The replacement features include Drupal content types, CCK node-reference fields, themes, RDB, SPARQL, workflow, and a number of other supporting modules. Strategic use of some Drupal6 CMS features enables three separate but complementary interfaces that provide access to oceanographic research metadata via the MySQL database: 1) a Drupal6-powered front-end; 2) a standard SQL port (used to provide a Mapserver interface to the metadata and data); and 3) a SPARQL port (feeding a new faceted search capability being developed). Future plans include the creation of science ontologies, by scientist/technologist teams, that will drive semantically-enabled faceted search capabilities planned for the site. Incorporation of semantic technologies included in the future Drupal 7 core release is also anticipated. Using a public domain CMS as opposed to proprietary middleware, and taking advantage of the many features of Drupal 6 that are designed to support semantically-enabled interfaces, will help prepare the BCO-DMO database for interoperability with other ecosystem databases.
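
    The faceted-search capability mentioned above reduces, at its core, to counting records per value of a metadata field. A minimal sketch, with invented metadata rows standing in for the relational tables:

```python
# Minimal faceted-search sketch: count dataset records per facet value.
# The rows and field names below are invented, illustrative metadata.
from collections import Counter

records = [
    {"dataset": "chl_a",        "instrument": "fluorometer",  "cruise": "KN197"},
    {"dataset": "ctd_profiles", "instrument": "CTD",          "cruise": "KN197"},
    {"dataset": "nutrients",    "instrument": "autoanalyzer", "cruise": "OC473"},
]

def facet_counts(rows, field):
    """Return {value: count} for one facet, the numbers a faceted UI displays."""
    return Counter(row[field] for row in rows)

counts = facet_counts(records, "cruise")
```

    A SPARQL endpoint serves the same purpose over triples (e.g. `GROUP BY` with `COUNT`), which is why exposing a SPARQL port naturally feeds a faceted-search front end.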

  10. Enabling Technologies for Smart Grid Integration and Interoperability of Electric Vehicles

    DEFF Research Database (Denmark)

    Martinenas, Sergejus

    Conventional, centralized power plants are being replaced by intermittent, distributed renewable energy sources, thus raising the concern about the stability of the power grid in its current state. All the while, electrification of all forms of transportation is increasing the load...... for successful EV integration into the smart grid, as a smart, mobile distributed energy resource. The work is split into three key topics: enabling technologies, grid service applications and interoperability issues. The current state of e-mobility technologies is surveyed. Technologies and protocols...... EVs to not only mitigate their own effects on the grid, but also provide value to grid operators, locally as well as system wide. Finally, it is shown that active integration of EVs into the smart grid, is not only achievable, but is well on its way to becoming a reality....

  11. The Semantic Web Revisited

    OpenAIRE

    Shadbolt, Nigel; Berners-Lee, Tim; Hall, Wendy

    2006-01-01

    The original Scientific American article on the Semantic Web appeared in 2001. It described the evolution of the Web from one consisting largely of documents for humans to read to one that includes data and information for computers to manipulate. The Semantic Web is a Web of actionable information--information derived from data through a semantic theory for interpreting the symbols. This simple idea, however, remains largely unrealized. Shopbots and auction bots abound on the Web, but these are esse...

  12. Semantic Search of Web Services

    Science.gov (United States)

    Hao, Ke

    2013-01-01

    This dissertation addresses semantic search of Web services using natural language processing. We first survey various existing approaches, focusing on the fact that the expensive costs of current semantic annotation frameworks result in limited use of semantic search for large scale applications. We then propose a vector space model based service…

  13. The development of a prototype level-three interoperable catalog system

    Science.gov (United States)

    Hood, Carroll A.; Howie, Randy; Verhanovitz, Rich

    1993-08-01

    The development of a level-three interoperable catalog system is defined by a new paradigm for metadata access. The old paradigm is characterized by a hierarchy of metadata layers, the transfer of control to target systems, and the requirement for the user to be familiar with the syntax and data dictionaries of several catalog system elements. Attributes of the new paradigm are exactly orthogonal: the directory and inventories are peer entities, there is a single user interface, and the system manages the complexity of interacting transparently with remote elements. We have designed and implemented a prototype level-three interoperable catalog system based on the new paradigm. Through a single intelligent interface, users can interoperably access a master directory, inventories for selected satellite datasets, and an in situ meteorological dataset inventory. This paper describes the development of the prototype system and three of the formidable challenges that were addressed in the process. The first involved the interoperable integration of satellite and in situ inventories, which to our knowledge, has never been operationally demonstrated. The second was the development of a search strategy for orbital and suborbital granules which preserves the capability to identify temporally or spatially coincident subsets between them. The third involved establishing a method of incorporating inventory-specific search criteria into user queries. We are working closely with selected science data users to obtain feedback on the system's design and performance. The lessons learned from this prototype will help direct future development efforts. Distributed data systems of the 1990s such as EOSDIS and the Global Change Data and Information System (GCDIS) will be able to build on this prototype.
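
    The new paradigm, a single interface that transparently fans a query out to peer inventories and merges the results, can be sketched as follows (inventory contents and function names are invented, not the prototype's actual elements):

```python
# Sketch of level-three interoperability: one user query, dispatched to
# heterogeneous peer inventories, with the system hiding remote details.

def satellite_inventory(start, end):
    """Stand-in for a remote satellite granule inventory."""
    granules = [{"id": "sat-001", "time": 1990}, {"id": "sat-002", "time": 1995}]
    return [g for g in granules if start <= g["time"] <= end]

def insitu_inventory(start, end):
    """Stand-in for an in situ meteorological dataset inventory."""
    stations = [{"id": "met-17", "time": 1990}]
    return [s for s in stations if start <= s["time"] <= end]

def federated_search(inventories, start, end):
    """Single interface: fan the query out and merge hits from all peers."""
    hits = []
    for search in inventories:
        hits.extend(search(start, end))
    return hits

results = federated_search([satellite_inventory, insitu_inventory], 1989, 1992)
```

    Because both inventories answer the same temporal query, coincident satellite and in situ holdings (here, both from 1990) surface in one result set, which is the coincidence-search capability the prototype set out to preserve.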

  14. Formal semantic specifications as implementation blueprints for real-time programming languages

    Science.gov (United States)

    Feyock, S.

    1981-01-01

    Formal definitions of language and system semantics provide highly desirable checks on the correctness of implementations of programming languages and their runtime support systems. If these definitions can give concrete guidance to the implementor, major increases in implementation accuracy and decreases in implementation effort can be achieved. It is shown that, of the wide variety of available methods, the Hgraph (hypergraph) definitional technique (Pratt, 1975) is best suited to serve as such an implementation blueprint. A discussion and example of the Hgraph technique is presented, as well as an overview of the growing body of implementation experience with real-time languages based on Hgraph semantic definitions.

  15. Rollout Strategy to Implement Interoperable Traceability in the Seafood Industry.

    Science.gov (United States)

    Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert; Cusack, Christopher

    2017-08-01

    Verifying the accuracy and rigor of data exchanged within and between businesses for the purposes of traceability rests on the existence of effective and efficient interoperable information systems that meet users' needs. Interoperability, particularly given the complexities intrinsic to the seafood industry, requires that the systems used by businesses operating along the supply chain share a common technology architecture that is robust, resilient, and evolves as industry needs change. Technology architectures are developed through engaging industry stakeholders in understanding why an architecture is required, the benefits provided to the industry and individual businesses and supply chains, and how the architecture will translate into practical results. This article begins by reiterating the benefits that the global seafood industry can capture by implementing interoperable chain-length traceability and the reason for basing the architecture on a peer-to-peer networked database concept versus more traditional centralized or linear approaches. A summary of capabilities that already exist within the seafood industry that the proposed architecture uses is discussed; and a strategy for implementing the architecture is presented. The 6-step strategy is presented in the form of a critical path. © 2017 Institute of Food Technologists®.

  16. Hierarchical layered and semantic-based image segmentation using ergodicity map

    Science.gov (United States)

    Yadegar, Jacob; Liu, Xiaoqing

    2010-04-01

    Image segmentation plays a foundational role in image understanding and computer vision. Although great strides have been made and progress achieved on automatic/semi-automatic image segmentation algorithms, designing a generic, robust, and efficient image segmentation algorithm is still challenging. Human vision is still far superior compared to computer vision, especially in interpreting semantic meanings/objects in images. We present a hierarchical/layered semantic image segmentation algorithm that can automatically and efficiently segment images into hierarchical layered/multi-scaled semantic regions/objects with contextual topological relationships. The proposed algorithm bridges the gap between high-level semantics and low-level visual features/cues (such as color, intensity, edge, etc.) through utilizing a layered/hierarchical ergodicity map, where ergodicity is computed based on a space filling fractal concept and used as a region dissimilarity measurement. The algorithm applies a highly scalable, efficient, and adaptive Peano-Cesaro triangulation/tiling technique to decompose the given image into a set of similar/homogenous regions based on low-level visual cues in a top-down manner. The layered/hierarchical ergodicity map is built through a bottom-up region dissimilarity analysis. The recursive fractal sweep associated with the Peano-Cesaro triangulation provides efficient local multi-resolution refinement to any level of detail. The generated binary decomposition tree also provides efficient neighbor retrieval mechanisms for contextual topological object/region relationship generation. Experiments have been conducted within the maritime image environment where the segmented layered semantic objects include the basic level objects (i.e. sky/land/water) and deeper level objects in the sky/land/water surfaces. Experimental results demonstrate the proposed algorithm has the capability to robustly and efficiently segment images into layered semantic objects
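
    The top-down decomposition described above can be sketched in miniature: recursively split a region whenever its internal dissimilarity exceeds a threshold. This toy version works on a 1-D intensity signal with range as the dissimilarity measure; it illustrates the split strategy only, not the paper's Peano-Cesaro/ergodicity algorithm.

```python
# Toy top-down split: divide a 1-D intensity signal wherever a region's
# spread (max - min, a stand-in dissimilarity measure) exceeds a threshold.

def spread(region):
    """Simple dissimilarity: intensity range within the region."""
    return max(region) - min(region)

def split_regions(signal, threshold, lo=0, hi=None):
    """Return half-open (start, end) segments whose spread <= threshold."""
    if hi is None:
        hi = len(signal)
    region = signal[lo:hi]
    if hi - lo <= 1 or spread(region) <= threshold:
        return [(lo, hi)]                      # homogeneous enough: keep
    mid = (lo + hi) // 2                       # otherwise split and recurse
    return (split_regions(signal, threshold, lo, mid)
            + split_regions(signal, threshold, mid, hi))

# Three intensity plateaus; the recursion isolates each one.
signal = [10, 11, 10, 50, 52, 51, 90, 91]
segments = split_regions(signal, threshold=5)
```

    The recursion tree produced here is the 1-D analogue of the binary decomposition tree in the paper, which additionally supports neighbor retrieval for topological relationships.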

  17. Radio Interoperability: There Is More to It Than Hardware

    National Research Council Canada - National Science Library

    Hutchins, Susan G; Timmons, Ronald P

    2007-01-01

    Radio Interoperability: The Problem. Superfluous radio transmissions contribute to auditory overload of first responders; they obscure development of an accurate operational picture for all involved; and radio spectrum is a limited commodity once...

  18. LAIR: A Language for Automated Semantics-Aware Text Sanitization based on Frame Semantics

    DEFF Research Database (Denmark)

    Hedegaard, Steffen; Houen, Søren; Simonsen, Jakob Grue

    2009-01-01

    We present \\lair{}: A domain-specific language that enables users to specify actions to be taken upon meeting specific semantic frames in a text, in particular to rephrase and redact the textual content. While \\lair{} presupposes superficial knowledge of frames and frame semantics, it requires on...... with automated redaction of web pages for subjectively undesirable content; initial experiments suggest that using a small language based on semantic recognition of undesirable terms can be highly useful as a supplement to traditional methods of text sanitization.......We present \\lair{}: A domain-specific language that enables users to specify actions to be taken upon meeting specific semantic frames in a text, in particular to rephrase and redact the textual content. While \\lair{} presupposes superficial knowledge of frames and frame semantics, it requires only...... limited prior programming experience. It neither contain scripting or I/O primitives, nor does it contain general loop constructions and is not Turing-complete. We have implemented a \\lair{} compiler and integrated it in a pipeline for automated redaction of web pages. We detail our experience...

  19. A Proposed Information Architecture for Telehealth System Interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Craft, R.L.; Funkhouser, D.R.; Gallagher, L.K.; Garica, R.J.; Parks, R.C.; Warren, S.

    1999-04-20

    We propose an object-oriented information architecture for telemedicine systems that promotes secure "plug-and-play" interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a "lego-like" fashion to achieve the desired device or system functionality. Telemedicine systems today rely increasingly on distributed, collaborative information technology during the care delivery process. While these leading-edge systems are bellwethers for highly advanced telemedicine, most are custom-designed and do not interoperate with other commercial offerings. Users are limited to a set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. This paper proposes a reference architecture for plug-and-play telemedicine systems that addresses these issues.
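
    The black-box, standardized-interface idea can be sketched as a minimal component protocol (class, method, and field names below are invented for illustration, not the paper's interface definitions):

```python
# Sketch: every component implements one message-handling interface, so any
# conforming component can be plugged into the pipeline "lego-like".

class Component:
    """Standardized interface every pluggable component must implement."""
    def handle(self, message: dict) -> dict:
        raise NotImplementedError

class VitalsSensor(Component):
    def handle(self, message):
        # Pretend to acquire a reading and attach it to the message.
        return {**message, "pulse_bpm": 72}

class Recorder(Component):
    def __init__(self):
        self.log = []
    def handle(self, message):
        self.log.append(message)   # archive the standardized message
        return message

def run_pipeline(components, message):
    """Pass a standardized message through components in order."""
    for component in components:
        message = component.handle(message)
    return message

recorder = Recorder()
result = run_pipeline([VitalsSensor(), recorder], {"patient_id": "p1"})
```

    Because components agree only on the message format and the `handle` interface, swapping one vendor's sensor for another's requires no change to the rest of the pipeline, which is the purchasing flexibility the architecture targets.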

  20. An application of ETICS Co-Scheduling Mechanism to Interoperability and Compliance Validation of Grid Services

    CERN Document Server

    Ronchieri, Elisabetta; Diez-andino Sancho, Guillermo; DI Meglio, Alberto; Marzolla, Moreno

    2008-01-01

    Grid software projects require infrastructures in order to evaluate interoperability with other projects and compliance with predefined standards. Interoperability and compliance are quality attributes that are expected from all distributed projects. ETICS is designed to automate the investigation of this kind of problem. It integrates well-established procedures, tools and resources in a coherent framework and adapts them to the special needs of these projects. Interoperability and compliance with standards are important quality attributes of software developed for Grid environments, where many different parts of an interconnected system have to interact. Compliance with standards is one of the major factors in making sure that interoperating parts of a distributed system can actually interconnect and exchange information. Taking the case of the Grid environment (Foster and Kesselman, 2003), most of the projects that are developing software have not reached the maturity level of other communities yet and have di...