WorldWideScience

Sample records for achieving semantic interoperability

  1. Semantically Interoperable XML Data

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-01-01

XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In t...
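The annotation idea in this abstract can be sketched in a few lines: tags from two syntactically different XML schemas are linked, via a common-data-element registry, to the same ontology concept. The tag names, URIs, and registry below are invented for illustration, not taken from the paper.

```python
import xml.etree.ElementTree as ET

# Hypothetical common-data-element registry: local tag names from two
# source schemas are annotated with the same ontology concept URI.
CDE_REGISTRY = {
    "bp_systolic": "http://purl.example.org/obo/SYSTOLIC_BP",
    "systolicBP":  "http://purl.example.org/obo/SYSTOLIC_BP",
}

def concepts(xml_text):
    """Map each annotated element in an XML document to its concept URI."""
    root = ET.fromstring(xml_text)
    return {CDE_REGISTRY[el.tag]: el.text
            for el in root.iter() if el.tag in CDE_REGISTRY}

doc_a = "<record><bp_systolic>120</bp_systolic></record>"
doc_b = "<obs><systolicBP>120</systolicBP></obs>"

# Both documents resolve to the same concept, so they can be integrated
# even though their tag names (the syntax) differ.
assert concepts(doc_a) == concepts(doc_b)
```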

  2. Real Time Semantic Interoperability in AD HOC Networks of Geospatial Data Sources: Challenges, Achievements and Perspectives

    Mostafavi, M. A.; Bakillah, M.

    2012-07-01

Recent advances in geospatial technologies have made available large amounts of geospatial data. Meanwhile, new developments in Internet and communication technologies have created a shift from isolated geospatial databases to ad hoc networks of geospatial data sources, where data sources can join or leave the network, and form groups to share data and services. However, effective integration and sharing of geospatial data among these data sources and their users are hampered by semantic heterogeneities. These heterogeneities affect the spatial, temporal and thematic aspects of geospatial concepts. There have been many efforts to address semantic interoperability issues in the geospatial domain. These efforts were mainly focused on resolving heterogeneities caused by different and implicit representations of the concepts. However, many approaches have focused on the thematic aspects, leaving aside the explicit representation of spatial and temporal aspects. Also, most semantic interoperability approaches for networks have focused on automating the semantic mapping process. However, the ad hoc network structure is continuously modified by source addition or removal, formation of groups, etc. This dynamic aspect is often neglected in those approaches. This paper proposes a conceptual framework for real time semantic interoperability in ad hoc networks of geospatial data sources. The conceptual framework presents the fundamental elements of real time semantic interoperability through a hierarchy of interrelated semantic states and processes. Then, we use the conceptual framework to frame the discussion of the achievements that have already been made, the challenges that remain to be addressed, and perspectives with respect to these challenges.

  3. Semantic Interoperability in Electronic Business

    Juha Puustjarvi

    2010-09-01

Full Text Available E-business refers to the utilization of information and communication technologies (ICT) in support of all the activities of business. The standards developed for e-business help to facilitate its deployment. In particular, several organizations in the e-business sector have produced standards and representation forms using XML, which serves as an interchange format for exchanging data between communicating applications. However, XML says nothing about the semantics of the tags used: XML is merely a standard notation for markup languages, which provides a means for structuring documents. Therefore XML-based e-business software is developed by hard-coding. Hard-coding has proven to be a valuable and powerful way of exchanging structured and persistent business documents. However, if we use hard-coding in the case of non-persistent documents and non-static environments, we encounter problems in deploying new document types, as this requires a long-lasting standardization process. Replacing existing hard-coded e-business systems with open systems that support semantic interoperability, and which are easily extensible, is the topic of this article. We first consider XML-based technologies and standards developed for B2B interoperation. Then, we consider electronic auctions, which represent a form of e-business. In particular, we show how semantic interoperability can be achieved in electronic auctions.

  4. Semantic and Process Interoperability

    Félix Oscar Fernández Peña

    2010-05-01

Full Text Available Knowledge management systems support education at its different levels. This is very important for the transformation process in which Cuban higher education is involved, where structural transformations of teaching are focused on supporting the foundation of the information society in the country. This paper describes technical aspects of the design of a model for the integration of multiple knowledge management tools supporting teaching. The proposal is based on the definition of an ontology for the explicit formal description of the semantics of the motivations of students and teachers in the learning process. Its goal is to facilitate the spread of knowledge.

  5. Semantic Interoperability in Multimedia Distributed Health Record

    Hanzlíček, Petr; Nagy, Miroslav; Přečková, Petra; Říha, Antonín; Dioszegi, Matěj; Zvárová, Jana

    Göteborg, 2008. [MIE 2008. International Congress of the European Federation for Medical Informatics /21./. 25.05.2008-28.05.2008, Göteborg] R&D Projects: GA AV ČR 1ET200300413 Institutional research plan: CEZ:AV0Z10300504 Keywords : electronic health record * semantic interoperability * nomenclatures Subject RIV: IN - Informatics, Computer Science http://www.sfmi.se/home/page.asp?sid=63&mid=2&PageId=1826

  6. Review of Semantically Interoperable Electronic Health Records for Ubiquitous Healthcare

    Hwang, Kyung Hoon; Chung, Kyo-IL; Chung, Myung-Ae; Choi, Duckjoo

    2010-01-01

In order to provide more effective and personalized healthcare services to patients and healthcare professionals, intelligent active knowledge management and reasoning systems with semantic interoperability are needed. Technological developments have changed ubiquitous healthcare, making it more semantically interoperable and individual patient-based; however, there are also limitations to these methodologies. Based upon an extensive review of international literature, this paper describes two...

  7. Formal Semantic Annotations for Models Interoperability in a PLM environment

    Liao, Yongxin; Lezoche, Mario; Panetto, Hervé; Boudjlida, Nacer; Rocha Loures, Eduardo

    2014-01-01

Nowadays, the need for system interoperability in or across enterprises has become more and more ubiquitous. Much research has been carried out on information exchange, transformation, discovery and reuse. One of the main challenges in this research is to overcome the semantic heterogeneity between enterprise applications along the lifecycle of a product. As a possible solution to assist semantic interoperability, semantic annotation has gained more and more attention an...

  8. Providing semantic interoperability between clinical care and clinical research domains.

    Laleci, Gokce Banu; Yuksel, Mustafa; Dogac, Asuman

    2013-03-01

Improving the efficiency with which clinical research studies are conducted can lead to faster medication innovation and decreased time to market for new drugs. To increase this efficiency, the parties involved in a regulated clinical research study, namely, the sponsor, the clinical investigator and the regulatory body, each with their own software applications, need to exchange data seamlessly. However, currently, the clinical research and the clinical care domains are quite disconnected because each uses different standards and terminology systems. In this article, we describe an initial implementation of the Semantic Framework developed within the scope of the SALUS project to achieve interoperability between the clinical research and the clinical care domains. In our Semantic Framework, the core ontology developed for semantic mediation is based on the shared conceptual model of both of these domains provided by the BRIDG initiative. The core ontology is then aligned with the extracted semantic models of the existing clinical care and research standards as well as with the ontological representations of the terminology systems to create a model of meaning for enabling semantic mediation. Although SALUS is a research and development effort rather than a product, the current SALUS knowledge base contains around 4.7 million triples representing the BRIDG DAM, the HL7 CDA model, CDISC standards and several terminology ontologies. In order to keep the reasoning process within acceptable limits without sacrificing the quality of mediation, we took an engineering approach by developing a number of heuristic mechanisms. The results indicate that it is possible to build a robust and scalable semantic framework with a solid theoretical foundation for achieving interoperability between the clinical research and clinical care domains. PMID:23008263

  9. Local ontologies for semantic interoperability in supply chain networks

    Zdravković, Milan; Trajanović, Miroslav; Panetto, Hervé

    2011-01-01

ISBN: 978-989-8425-53-9 Most of the issues of current supply chain management practices are related to the challenges of interoperability of relevant enterprise information systems (EIS). In this paper, we present the ontological framework for semantic interoperability of EISs in supply chain networks, based on the Supply Chain Operations Reference (SCOR) model, its semantic enrichment and mappings with relevant enterprise conceptualizations. In order to introduce the...

  10. State of the art on semantic IS standardization, interoperability & quality

    Folmer, Erwin; Verhoosel, Jack

    2011-01-01

    This book contains a broad overview of relevant studies in the area of semantic IS standards. It includes an introduction in the general topic of standardization and introduces the concept of interoperability. The primary focus is however on semantic IS standards, their characteristics, and the qual

  11. State of the Art on Semantic IS Standardization, Interoperability & Quality

    Folmer, E.J.A.; Verhoosel, J.P.C.

    2011-01-01

    This book contains a broad overview of relevant studies in the area of semantic IS standards. It includes an introduction in the general topic of standardization and introduces the concept of interoperability. The primary focus is however on semantic IS standards, their characteristics, and the qual

  12. Semantics-Based Interoperability Framework for the Geosciences

    Sinha, A.; Malik, Z.; Raskin, R.; Barnes, C.; Fox, P.; McGuinness, D.; Lin, K.

    2008-12-01

Interoperability between heterogeneous data, tools and services is required to transform data to knowledge. To meet geoscience-oriented societal challenges such as forcing of climate change induced by volcanic eruptions, we suggest the need to develop semantic interoperability for data, services, and processes. Because such scientific endeavors require integration of multiple databases associated with global enterprises, implicit semantic-based integration is impossible. Instead, explicit semantics are needed to facilitate interoperability and integration. Although different types of integration models are available (syntactic or semantic), we suggest that semantic interoperability is likely to be the most successful pathway. Clearly, the geoscience community would benefit from utilization of existing XML-based data models, such as GeoSciML, WaterML, etc., to rapidly advance semantic interoperability and integration. We recognize that such integration will require a "meanings-based search, reasoning and information brokering", which will be facilitated through inter-ontology relationships (ontologies defined for each discipline). We suggest that markup languages (MLs) and ontologies can be seen as "data integration facilitators", working at different abstraction levels. Therefore, we propose to use an ontology-based data registration and discovery approach to complement markup languages through semantic data enrichment. Ontologies allow the use of formal and descriptive logic statements which permit expressive query capabilities for data integration through reasoning. We have developed domain ontologies (EPONT) to capture the concept behind data. EPONT ontologies are associated with existing ontologies such as SUMO, DOLCE and SWEET. Although significant efforts have gone into developing data (object) ontologies, we advance the idea of developing semantic frameworks for additional ontologies that deal with processes and services. This evolutionary step will
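The "meanings-based search" the abstract calls for can be illustrated with plain triples and RDFS-style subClassOf reasoning: a query for a general concept finds instances typed with more specific concepts. The class and instance names below are invented for illustration and are not taken from EPONT.

```python
# Explicit semantics as (subject, predicate, object) triples.
triples = {
    ("BasalticEruption", "subClassOf", "VolcanicEruption"),
    ("VolcanicEruption", "subClassOf", "GeologicProcess"),
    ("Eyjafjallajokull2010", "type", "BasalticEruption"),
}

def superclasses(cls):
    """Transitive closure of subClassOf for one class."""
    out, frontier = set(), {cls}
    while frontier:
        nxt = {o for (s, p, o) in triples
               if p == "subClassOf" and s in frontier}
        frontier = nxt - out
        out |= nxt
    return out

def instances_of(cls):
    """Find instances of a class or any of its subclasses."""
    return {s for (s, p, o) in triples
            if p == "type" and (o == cls or cls in superclasses(o))}

# The instance is found via reasoning, not via a syntactic label match.
assert "Eyjafjallajokull2010" in instances_of("GeologicProcess")
```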

  13. Semantic interoperability in sensor applications : Making sense of sensor data

Brandt, Paul; Basten, Twan; Stuijk, Sander; Bui, The Vinh; de Clercq, Willem; Ferreira Pires, L.; van Sinderen, Marten

    2013-01-01

    Much effort has been spent on the optimization of sensor networks, mainly concerning their performance and power efficiency. Furthermore, open communication protocols for the exchange of sensor data have been developed and widely adopted, making sensor data widely available for software applications. However, less attention has been given to the interoperability of sensor networks and sensor network applications at a semantic level. This hinders the reuse of sensor networks in different appli...

  14. Open PHACTS: semantic interoperability for drug discovery.

    Williams, Antony J; Harland, Lee; Groth, Paul; Pettifer, Stephen; Chichester, Christine; Willighagen, Egon L; Evelo, Chris T; Blomberg, Niklas; Ecker, Gerhard; Goble, Carole; Mons, Barend

    2012-11-01

    Open PHACTS is a public-private partnership between academia, publishers, small and medium sized enterprises and pharmaceutical companies. The goal of the project is to deliver and sustain an 'open pharmacological space' using and enhancing state-of-the-art semantic web standards and technologies. It is focused on practical and robust applications to solve specific questions in drug discovery research. OPS is intended to facilitate improvements in drug discovery in academia and industry and to support open innovation and in-house non-public drug discovery research. This paper lays out the challenges and how the Open PHACTS project is hoping to address these challenges technically and socially. PMID:22683805

  15. An adaptive semantic based mediation system for data interoperability among Health Information Systems.

    Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung

    2014-08-01

Heterogeneity in the management of complex medical data obstructs the attainment of data level interoperability among Health Information Systems (HIS). This diversity is dependent on the compliance of HISs with different healthcare standards. Its solution demands a mediation system for the accurate interpretation of data in different heterogeneous formats for achieving data interoperability. We propose an adaptive AdapteR Interoperability ENgine mediation system called ARIEN, which arbitrates between HISs compliant to different healthcare standards for accurate and seamless information exchange to achieve data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and Personalized-Detailed Clinical Model (P-DCM) approach to guarantee accuracy of mappings. The effectiveness of the mappings stored in the MBO is realized by evaluating the accuracy of the transformation process among different standard formats. We evaluated our proposed system with the transformation process of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved an accuracy level of over 90% in the conversion between the CDA and vMR standards using a pattern-oriented approach from the MBO. The proposed mediation system improves the overall communication process between HISs. It provides an accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients. PMID:24964780
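The mediation step can be sketched as a mapping-driven record rewrite: stored mappings between a source and a target standard drive the transformation, in the spirit of a mediation bridge ontology. The field names and code values below are invented and are not the actual CDA or vMR element names.

```python
MAPPING_BRIDGE = {
    # hypothetical source (CDA-like) field -> target (vMR-like) field
    "patient_name": "subject.name",
    "obs_code":     "observation.code",
    "obs_value":    "observation.value",
}

def transform(source_record, bridge):
    """Rewrite a record from the source format into the target format."""
    target = {}
    for field, value in source_record.items():
        if field in bridge:          # mapped field: rename it
            target[bridge[field]] = value
        # unmapped fields are dropped; a real mediator would flag them
    return target

cda_like = {"patient_name": "Doe, J.", "obs_code": "8480-6",
            "obs_value": "120 mmHg", "local_only": "x"}
vmr_like = transform(cda_like, MAPPING_BRIDGE)
assert vmr_like["observation.code"] == "8480-6"
```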

  16. Database Semantic Interoperability based on Information Flow Theory and Formal Concept Analysis

    Guanghui Yang

    2012-07-01

Full Text Available As databases become widely used, there is a growing need to translate information between multiple databases. Semantic interoperability and integration have been a long-standing challenge for the database community and have now become a prominent area of database research. In this paper, we aim to answer the question of how semantic interoperability between two databases can be achieved by using Formal Concept Analysis (FCA for short) and Information Flow (IF for short) theories. For our purposes, we first discover knowledge from different databases by using FCA, and then align what is discovered by using IF and FCA. The development of FCA has led to some software systems such as TOSCANA and TUPLEWARE, which can be used as a tool for discovering knowledge in databases. A prototype based on IF and FCA has been developed. Our method is tested and verified by using this prototype and TUPLEWARE.
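The FCA step can be made concrete with a toy formal context. A formal concept is a pair (extent, intent) closed under the two derivation maps; the naive enumeration below is a sketch for a tiny context, not an efficient FCA algorithm such as NextClosure, and the objects and attributes are invented.

```python
from itertools import combinations

# Toy formal context: objects x attributes (a tiny "database").
context = {
    "order_1": {"paid", "shipped"},
    "order_2": {"paid"},
    "order_3": {"paid", "shipped", "returned"},
}

def intent(objs):
    """Attributes shared by all objects in objs."""
    sets = [context[o] for o in objs]
    return set.intersection(*sets) if sets else {
        a for attrs in context.values() for a in attrs}

def extent(attrs):
    """Objects having all attributes in attrs."""
    return {o for o, a in context.items() if attrs <= a}

concepts = set()
objs = list(context)
for r in range(len(objs) + 1):
    for combo in combinations(objs, r):
        a = intent(combo)        # derive the shared attributes
        e = extent(a)            # close back to the full extent
        concepts.add((frozenset(e), frozenset(a)))

# ({order_1, order_3}, {paid, shipped}) is one of the derived concepts.
assert (frozenset({"order_1", "order_3"}),
        frozenset({"paid", "shipped"})) in concepts
```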

  17. An approach to define semantics for BPM systems interoperability

    Rico, Mariela; Caliusco, María Laura; Chiotti, Omar; Rosa Galli, María

    2015-04-01

This article proposes defining semantics for Business Process Management systems interoperability through the ontology of Electronic Business Documents (EBD) used to interchange the information required to perform cross-organizational processes. The semantic model generated allows aligning an enterprise's business processes to support cross-organizational processes by matching the business ontology of each business partner with the EBD ontology. The result is a flexible software architecture that allows dynamically defining cross-organizational business processes by reusing the EBD ontology. For developing the semantic model, a method is presented, which is based on a strategy for discovering entity features whose interpretation depends on the context, and representing them for enriching the ontology. The proposed method complements ontology learning techniques that cannot infer semantic features not represented in data sources. In order to improve the representation of these entity features, the method proposes using widely accepted ontologies, for representing time entities and relations, physical quantities, measurement units, official country names, and currencies and funds, among others. When ontology reuse is not possible, the method proposes identifying whether that feature is simple or complex, and defines a strategy to be followed. An empirical validation of the approach has been performed through a case study.

  18. A federated semantic metadata registry framework for enabling interoperability across clinical research and care domains.

    Sinaci, A Anil; Laleci Erturkmen, Gokce B

    2013-10-01

In order to enable secondary use of Electronic Health Records (EHRs) by bridging the interoperability gap between clinical care and research domains, in this paper, a unified methodology and the supporting framework are introduced which bring together the power of metadata registries (MDR) and semantic web technologies. We introduce a federated semantic metadata registry framework by extending the ISO/IEC 11179 standard, and enable integration of data element registries through Linked Open Data (LOD) principles where each Common Data Element (CDE) can be uniquely referenced, queried and processed to enable syntactic and semantic interoperability. Each CDE and their components are maintained as LOD resources enabling semantic links with other CDEs, terminology systems and with implementation dependent content models; hence facilitating semantic search, more effective reuse and semantic interoperability across different application domains. There are several important efforts addressing semantic interoperability in the healthcare domain, such as the IHE DEX profile proposal, CDISC SHARE and CDISC2RDF. Our architecture complements these by providing a framework to interlink existing data element registries and repositories for multiplying their potential for semantic interoperability to a greater extent. The open source implementation of the federated semantic MDR framework presented in this paper is the core of the semantic interoperability layer of the SALUS project, which enables the execution of post marketing safety analysis studies on top of existing EHR systems. PMID:23751263
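The linked-data idea behind the federated registry can be sketched with plain triples: each CDE is a dereferenceable URI, and cross-registry links let a query starting in one registry reach resources described only in another. The URIs and predicate names below are invented for illustration.

```python
triples = {
    ("http://mdr-a.example/cde/blood_pressure", "label", "Blood pressure"),
    ("http://mdr-a.example/cde/blood_pressure", "exactMatch",
     "http://mdr-b.example/cde/bp"),
    ("http://mdr-b.example/cde/bp", "usedBy", "study-42"),
}

def follow(subject, predicate):
    """All objects reachable from subject via predicate."""
    return {o for (s, p, o) in triples if s == subject and p == predicate}

# Starting from registry A's element, a federated query follows the
# exactMatch link and reaches a resource known only to registry B.
start = "http://mdr-a.example/cde/blood_pressure"
linked = set().union(*(follow(m, "usedBy")
                       for m in follow(start, "exactMatch")))
assert linked == {"study-42"}
```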

  19. CityGML - Interoperable semantic 3D city models

    Gröger, Gerhard; Plümer, Lutz

    2012-07-01

CityGML is the international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. It defines the three-dimensional geometry, topology, semantics and appearance of the most relevant topographic objects in urban or regional contexts. These definitions are provided in different, well-defined Levels-of-Detail (multiresolution model). The focus of CityGML is on the semantic aspects of 3D city models, their structures, taxonomies and aggregations, allowing users to employ virtual 3D city models for advanced analysis and visualization tasks in a variety of application domains such as urban planning, indoor/outdoor pedestrian navigation, environmental simulations, cultural heritage, or facility management. This is in contrast to purely geometrical/graphical models such as KML, VRML, or X3D, which do not provide sufficient semantics. CityGML is based on the Geography Markup Language (GML), which provides a standardized geometry model. Due to this model and its well-defined semantics and structures, CityGML facilitates interoperable data exchange in the context of geo web services and spatial data infrastructures. Since its standardization in 2008, CityGML has come into use on a worldwide scale: tools from notable companies in the geospatial field provide CityGML interfaces. Many applications and projects use this standard. CityGML also has a strong impact on science: numerous approaches use CityGML, particularly its semantics, for disaster management, emergency responses, or energy-related applications as well as for visualizations, or they contribute to CityGML, improving its consistency and validity, or use CityGML, particularly its different Levels-of-Detail, as a source or target for generalizations. This paper gives an overview of CityGML, its underlying concepts, its Levels-of-Detail, how to extend it, its applications, its likely future development, and the role it plays in scientific research. Furthermore, its
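The point that CityGML objects carry semantics, not just geometry, can be shown on a much-simplified fragment: a Building element with a function and a storey count that an application can query by meaning. The fragment is invented; only the namespace URIs follow CityGML 2.0 conventions.

```python
import xml.etree.ElementTree as ET

SNIPPET = """
<CityModel xmlns="http://www.opengis.net/citygml/2.0"
           xmlns:bldg="http://www.opengis.net/citygml/building/2.0">
  <cityObjectMember>
    <bldg:Building>
      <bldg:function>residential</bldg:function>
      <bldg:storeysAboveGround>3</bldg:storeysAboveGround>
    </bldg:Building>
  </cityObjectMember>
</CityModel>
"""

NS = {"bldg": "http://www.opengis.net/citygml/building/2.0"}
root = ET.fromstring(SNIPPET)
# Iterate over Building objects and read their semantic attributes.
for b in root.iter("{http://www.opengis.net/citygml/building/2.0}Building"):
    function = b.find("bldg:function", NS).text
    storeys = int(b.find("bldg:storeysAboveGround", NS).text)
    assert (function, storeys) == ("residential", 3)
```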

  20. Interoperability of learning objects copyright in the LUISA semantic learning management system

    García González, Roberto; Pariente, Tomas

    2009-01-01

    Semantic Web technology is able to provide the required computational semantics for interoperability of learning resources across different Learning Management Systems (LMS) and Learning Object Repositories (LOR). The EU research project LUISA (Learning Content Management System Using Innovative Semantic Web Services Architecture) addresses the development of a reference semantic architecture for the major challenges in the search, interchange and delivery of learning objects in a service-...

  1. A semantic annotation framework to assist the knowledge interoperability along a product life cycle

    Liao, Yongxin; Lezoche, Mario; Rocha Loures, Eduardo; Panetto, Hervé; Boudjlida, Nacer

    2014-01-01

The interoperability among a variety of systems, in or across manufacturing enterprises, has been widely accepted as one of the important factors that affect the efficiency of production. The aim of this study is to deal with the semantic interoperability issues in a product lifecycle management environment. Through the investigation of related works, the need for the formalization of semantic annotation was discovered. This paper addresses this drawback and introduces a framework that uses ...

  2. Shape-function-relationship (SFR) framework for semantic interoperability of product model

    Gupta, Ravi Kumar; Gurumoorthy, B

    2009-01-01

    The problem of semantic interoperability arises while integrating applications in different task domains across the product life cycle. A new shape-function-relationship (SFR) framework is proposed as a taxonomy based on which an ontology is developed. Ontology based on the SFR framework, that captures explicit definition of terminology and knowledge relationships in terms of shape, function and relationship descriptors, offers an attractive approach for solving semantic interoperability issu...

  3. Reporting Device Observations for semantic interoperability of surgical devices and clinical information systems.

    Andersen, Björn; Ulrich, Hannes; Rehmann, Daniel; Kock, Ann-Kristin; Wrage, Jan-Hinrich; Ingenerf, Josef

    2015-08-01

Service-oriented medical device architectures are progressing from interdisciplinary research projects to international standardisation: a new set of IEEE 11073 proposals shall pave the way to industry acceptance. This expected availability of device observations in a standardised representation enables secondary usage if interoperability with clinical information systems can be achieved. The Device Observation Reporter (DOR) described in this work is a gateway that connects these realms. After a user chooses a selection of signals from different devices in the digital operating room, the DOR records these semantically described values for a specified duration. Upon completion, the signal descriptions and values are transformed into Health Level Seven version 2 messages and sent to a hospital information system/electronic health record system within the clinical IT network. The successful integration of device data for documentation and usage in clinical information systems can further leverage the novel device communication standard proposals. Complementing these, an Integrating the Healthcare Enterprise profile will aid commercial implementers in achieving interoperability. Their solutions could incorporate clinical knowledge to autonomously select signal combinations and generate reports of diagnostic and interventional procedures, thus saving time and effort in surgical documentation. PMID:26736610
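The gateway step of turning device observations into HL7 version 2 messages can be roughly sketched as below. The segment layout is heavily simplified and the field positions, identifiers, and codes are illustrative, not a conformant implementation of the DOR or of HL7 v2.

```python
def oru_message(observations):
    """Assemble a simplified ORU^R01-style message from observations."""
    segments = [
        # Message header and patient identification (values invented).
        "MSH|^~\\&|DOR|OR1|HIS|HOSP|20150801120000||ORU^R01|MSG0001|P|2.6",
        "PID|1||PAT123",
    ]
    # One OBX segment per numeric device observation.
    for i, (code, name, value, unit) in enumerate(observations, start=1):
        segments.append(f"OBX|{i}|NM|{code}^{name}||{value}|{unit}")
    return "\r".join(segments)   # HL7 v2 separates segments with CR

msg = oru_message([
    ("150456", "MDC_PULS_OXIM_SAT_O2", "98", "%"),
    ("147842", "MDC_ECG_HEART_RATE", "72", "bpm"),
])
assert msg.count("OBX|") == 2
```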

  4. Facilitating Semantic Interoperability Among Ocean Data Systems: ODIP-R2R Student Outcomes

    Stocks, K. I.; Chen, Y.; Shepherd, A.; Chandler, C. L.; Dockery, N.; Elya, J. L.; Smith, S. R.; Ferreira, R.; Fu, L.; Arko, R. A.

    2014-12-01

With informatics providing an increasingly important set of tools for geoscientists, it is critical to train the next generation of scientists in information and data techniques. The NSF-supported Rolling Deck to Repository (R2R) Program works with the academic fleet community to routinely document, assess, and preserve the underway sensor data from U.S. research vessels. The Ocean Data Interoperability Platform (ODIP) is an EU-US-Australian collaboration fostering interoperability among regional e-infrastructures through workshops and joint prototype development. The need to align terminology between systems is a common challenge across all of the ODIP prototypes. Five R2R students were supported to address aspects of semantic interoperability within ODIP:
    - Developing a vocabulary matching service that links terms from different vocabularies with similar concepts. The service implements the Google Refine reconciliation service interface, so that users can leverage the Google Refine application as a friendly user interface while linking different vocabulary terms.
    - Developing Resource Description Framework (RDF) resources that map Shipboard Automated Meteorological Oceanographic System (SAMOS) vocabularies to internationally served vocabularies. Each SAMOS vocabulary term (data parameter and quality control flag) will be described as an RDF resource page. These RDF resources allow for enhanced discoverability and retrieval of SAMOS data by enabling data searches based on parameter.
    - Improving data retrieval and interoperability by exposing data and mapped vocabularies using Semantic Web technologies. We have collaborated with ODIP participating organizations to build a generalized data model that will be used to populate a SPARQL endpoint, providing expressive querying over our data files.
    - Mapping local and regional vocabularies used by R2R to those used by ODIP partners. This work is described more fully in a companion poster.
    - Making published Linked Data
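A toy version of the vocabulary matching idea: score candidate terms from a target vocabulary against a query term and return ranked, reconciliation-style candidates, similar in spirit to a Google Refine reconciliation response. The vocabulary entries and scoring choice (difflib similarity) are illustrative assumptions.

```python
from difflib import SequenceMatcher

TARGET_VOCAB = ["air temperature", "sea surface temperature",
                "wind speed", "relative humidity"]

def reconcile(query, vocab, limit=3):
    """Rank vocabulary terms by string similarity to the query."""
    def score(term):
        return SequenceMatcher(None, query.lower(), term.lower()).ratio()
    ranked = sorted(vocab, key=score, reverse=True)
    return [{"name": t, "score": round(score(t), 2)} for t in ranked[:limit]]

candidates = reconcile("SST (sea surface temp)", TARGET_VOCAB)
# The closest target term should rank first.
assert candidates[0]["name"] == "sea surface temperature"
```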

  5. Achieving interoperability in critical IT and communication systems

    Desourdis, Robert I

    2009-01-01

Supported by over 90 illustrations, this unique book provides a detailed examination of the subject, focusing on the use of voice, data, and video systems for public safety and emergency response. This practical resource makes in-depth recommendations spanning technical, planning, and procedural approaches to provide efficient public safety response performance. It covers the many approaches used to achieve interoperability, including a synopsis of the enabling technologies and systems intended to provide radio interoperability. Featuring specific examples nationwide, the book takes you

  6. RuleML-Based Learning Object Interoperability on the Semantic Web

    Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.

    2008-01-01

    Purpose: The present paper aims to describe an approach for building the Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one of the rule uses: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…

  7. An Approach to Semantic Interoperability for Improved Capability Exchanges in Federations of Systems

    Moschoglou, Georgios

    2013-01-01

    This study seeks an affirmative answer to the question whether a knowledge-based approach to system of systems interoperation using semantic web standards and technologies can provide the centralized control of the capability for exchanging data and services lacking in a federation of systems. Given the need to collect and share real-time…

  8. Semantic Interoperability in Body Area Sensor Networks and Applications

    Bui, V.T.; Brandt, P.; Liu, H.; Basten, T.; Lukkien, J.

    2014-01-01

    Crucial to the success of Body Area Sensor Networks is the flexibility with which stakeholders can share, extend and adapt the system with respect to sensors, data and functionality. The first step is to develop an interoperable platform with explicit interfaces, which takes care of common managemen

  9. Sharing meanings: developing interoperable semantic technologies to enhance reproducibility in earth and environmental science research

    Schildhauer, M.

    2015-12-01

Earth and environmental scientists are familiar with the entities, processes, and theories germane to their field of study, and comfortable collecting and analyzing data in their area of interest. Yet, while there appears to be consistency and agreement as to the scientific "terms" used to describe features in their data and analyses, aside from a few fundamental physical characteristics, such as mass or velocity, there can be broad tolerances, if not considerable ambiguity, in how many earth science "terms" map to the underlying "concepts" that they actually represent. This ambiguity in meanings, or "semantics", creates major problems for scientific reproducibility. It greatly impedes the ability to replicate results by making it difficult to determine the specifics of the intended meanings of terms such as deforestation or carbon flux as to scope, composition, magnitude, etc. In addition, semantic ambiguity complicates assemblage of comparable data for reproducing results, due to ambiguous or idiosyncratic labels for measurements, such as percent cover of forest, where the term "forest" is undefined; or where a reported output of "total carbon-emissions" might just include CO2 emissions, but not methane emissions. In this talk, we describe how the NSF-funded DataONE repository for earth and environmental science data (http://dataone.org) is using W3C-standard languages (RDF/OWL) to build an ontology for clarifying concepts embodied in heterogeneous data and model outputs. With an initial focus on carbon cycling concepts using terrestrial biospheric model outputs and LTER productivity data, we describe how we are achieving interoperability with "semantic vocabularies" (or ontologies) from aligned earth and life science domains, including OBO-foundry ontologies such as ENVO and BCO; the ISO/OGC O&M; and the NSF Earthcube GeoLink project. Our talk will also discuss best practices that may be helpful for other groups interested in constructing their own

  10. Interoperability and different ways of knowing: How semantics can aid in cross-cultural understanding

    Pulsifer, P. L.; Parsons, M. A.; Duerr, R. E.; Fox, P. A.; Khalsa, S. S.; McCusker, J. P.; McGuinness, D. L.

    2012-12-01

    differences in its application. Furthermore, it is an analog encoding scheme whose meaning has evolved over time. By semantically modeling the egg code, its subtle variations, and how it connects to other data, we illustrate a mechanism for translating across data formats and representations. But there are limits to what semantically modeling the egg code can achieve. The egg code and common operational sea ice formats do not address community needs, notably the timing and processes of sea ice freeze-up and break-up, which have a profound impact on local hunting, shipping, oil exploration, and safety. We work with local experts from four very different Indigenous communities and with scientific creators of sea ice forecasts to establish an understanding of concepts and terminology related to fall freeze-up and spring break-up in each of the represented regions. This helps expand our conceptions of sea ice while also aiding understanding across cultures and communities, and passing knowledge to younger generations. This is an early step in expanding concepts of interoperability to very different ways of knowing, to make data truly relevant and locally useful.

  11. Interoperable cross-domain semantic and geospatial framework for automatic change detection

    Kuo, Chiao-Ling; Hong, Jung-Hong

    2016-01-01

    With the increasingly diverse types of geospatial data established over the last few decades, semantic interoperability in integrated applications has attracted much interest in the field of Geographic Information Systems (GIS). This paper proposes a new strategy and framework to process cross-domain geodata at the semantic level. This framework leverages the semantic equivalence of concepts between domains through a bridge ontology and facilitates the integrated use of data from different domains, which has long been considered an essential strength of GIS, but is impeded by the lack of understanding of the semantics implicitly hidden in the data. We choose the task of change detection to demonstrate how the introduction of ontology concepts can effectively make the integration possible. We analyze the common properties of geodata and change detection factors, then construct rules and summarize possible change scenarios for making final decisions. The use of topographic map data to detect changes in land use shows promising success, as far as the improvement of efficiency and level of automation is concerned. We believe the ontology-oriented approach will enable a new way of integrating data across different domains from the perspective of semantic interoperability, and even open a new dimension for future GIS.
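
    The bridge-ontology idea can be reduced to a few lines: change is declared only when the shared concepts behind two domain-specific labels differ. The domains, terms, and mappings below are illustrative, not the paper's actual ontology:

```python
# Hypothetical bridge ontology linking concepts from two domains, used to
# decide whether two observations of the same parcel represent a real change.

BRIDGE = {  # (domain, local term) -> shared concept
    ("topo_map", "woodland"): "forest",
    ("landuse_db", "forest"): "forest",
    ("topo_map", "built_area"): "urban",
    ("landuse_db", "residential"): "urban",
}

def shared_concept(domain, term):
    return BRIDGE.get((domain, term), "unknown")

def changed(obs_old, obs_new):
    """Flag a change only when the *shared* concepts differ, not the labels."""
    return shared_concept(*obs_old) != shared_concept(*obs_new)

# Different labels, same concept: no change. Forest -> urban: change.
print(changed(("topo_map", "woodland"), ("landuse_db", "forest")))       # False
print(changed(("topo_map", "woodland"), ("landuse_db", "residential")))  # True
```

    The design point is that the rule base compares mediated concepts, so each data source keeps its native vocabulary.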

  12. Enabling Semantics-Aware Collaborative Tagging and Social Search in an Open Interoperable Tagosphere

    Soriano Camino, Francisco Javier; López Pardo, Javier; Jiménez Gañán, Miguel; Alonso Amo, Fernando

    2008-01-01

    To make the most of a global network effect and to search and filter the Long Tail, a collaborative tagging approach to social search should be based on the global activity of tagging, rating and filtering. We take a further step towards this objective by proposing a shared conceptualization of both the activity of tagging and the organization of the tagosphere in which tagging takes place. We also put forward the necessary data standards to interoperate at both data format and semantic level...

  13. Semantic Interoperability in Czech Healthcare Environment Supported by HL7 Version 3

    Nagy, Miroslav; Hanzlíček, Petr; Přečková, Petra; Říha, Antonín; Dioszegi, Matěj; Seidl, Libor; Zvárová, Jana

    2010-01-01

    Roč. 49, č. 2 (2010), s. 186-195. ISSN 0026-1270 R&D Projects: GA MŠk(CZ) 1M06014; GA AV ČR 1ET200300413 Institutional research plan: CEZ:AV0Z10300504 Keywords : information storage and retrieval * electronic health record * HL7 * semantic interoperability * communication standards Subject RIV: IN - Informatics, Computer Science Impact factor: 1.472, year: 2010

  14. Semantic modeling and interoperability in product and process engineering a technology for engineering informatics

    2013-01-01

    In the past decade, feature-based design and manufacturing has gained some momentum in various engineering domains as a way to represent and reuse semantic patterns with effective applicability. However, the actual scope of feature application is still very limited. Semantic Modeling and Interoperability in Product and Process Engineering provides a systematic solution for the challenging engineering informatics field, aiming at the enhancement of sustainable knowledge representation, implementation and reuse at an open and yet practically manageable scale. This semantic modeling technology supports uniform, multi-faceted and multi-level collaborative system engineering with heterogeneous computer-aided tools, such as CAD/CAM, CAE, and ERP. The presented unified feature model can be applied to product and process representation, development, implementation and management. Practical case studies and test samples are provided to illustrate applications which can be implemented by the readers in real-world scenarios. ...

  15. Interoperability Between Coastal Web Atlases Using Semantic Mediation: A Case Study of the International Coastal Atlas Network (ICAN)

    Wright, D. J.; Lassoued, Y.; Dwyer, N.; Haddad, T.; Bermudez, L. E.; Dunne, D.

    2009-12-01

    Coastal mapping plays an important role in informing marine spatial planning, resource management, maritime safety, hazard assessment and even national sovereignty. As such, there is now a plethora of data/metadata catalogs, pre-made maps, tabular and text information on resource availability and exploitation, and decision-making tools. A recent trend has been to encapsulate these in a special class of web-enabled geographic information systems called a coastal web atlas (CWA). While multiple benefits are derived from tailor-made atlases, there is great value added from the integration of disparate CWAs. CWAs linked to one another can query more successfully to optimize planning and decision-making. If a dataset is missing in one atlas, it may be immediately located in another. Similar datasets in two atlases may be combined to enhance study in either region. *But how best to achieve semantic interoperability to mitigate vague data queries, concepts or natural language semantics when retrieving and integrating data and information?* We report on the development of a new prototype seeking to interoperate between two initial CWAs: the Marine Irish Digital Atlas (MIDA) and the Oregon Coastal Atlas (OCA). These two mature atlases are used as a testbed for more regional connections, with the intent for the OCA to use lessons learned to develop a regional network of CWAs along the west coast, and for MIDA to do the same in building and strengthening atlas networks with the UK, Belgium, and other parts of Europe. Our prototype uses semantic interoperability via services harmonization and ontology mediation, allowing local atlases to use their own data structures and vocabularies (ontologies). We use standard technologies such as OGC Web Map Services (WMS) for delivering maps, and OGC Catalogue Service for the Web (CSW) for delivering and querying ISO-19139 metadata. The metadata records of a given CWA use a given ontology of terms called a local ontology. 
Human or machine
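
    The ontology-mediation pattern described for MIDA and OCA can be sketched in miniature: each atlas keeps its local vocabulary, and a query term is rewritten into the target atlas's terms through a shared global concept. The terms below are invented for illustration, not the actual atlas ontologies:

```python
# Toy ontology mediation between two coastal web atlases. Each atlas maps
# its local terms to shared global concepts; queries are rewritten via the
# global concept rather than by matching strings across atlases.

LOCAL_TO_GLOBAL = {
    "MIDA": {"strand": "beach", "bathing water": "beach", "quay": "harbour"},
    "OCA": {"shoreline sand": "beach", "marina": "harbour"},
}

def mediate(query_term, source_atlas, target_atlas):
    """Translate a local term into the target atlas's equivalent terms."""
    concept = LOCAL_TO_GLOBAL[source_atlas].get(query_term)
    return [t for t, c in LOCAL_TO_GLOBAL[target_atlas].items() if c == concept]

# An Irish user's "strand" query retrieves Oregon's "shoreline sand" layers.
print(mediate("strand", "MIDA", "OCA"))  # ['shoreline sand']
```

    In the real prototype the rewriting happens against CSW metadata queries; this sketch only shows the vocabulary-mapping step.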

  16. A Reusable and Interoperable Semantic Classification Tool which Integrates Owl Ontology

    Saadia Lgarch

    2012-11-01

    Full Text Available In e-Learning systems, the tutor plays a very important role in supporting learners and guaranteeing quality learning. A successful collaboration between learners and their tutor requires the use of communication tools. Thanks to their flexibility in terms of time, asynchronous tools such as discussion forums are the most used. However, this type of tool generates a great mass of messages, making tutoring complex to manage, hence the need for a message classification tool. As a first step, we proposed a semantic classification tool based on LSA and a thesaurus. The possibility that ontology offers to overcome the limitations of a thesaurus encouraged us to use it to control our vocabulary. By way of our proposed selection algorithm, the OWL ontology is queried to generate new terms which are used to build the LSA matrix. The integration of a formal OWL ontology provides a highly relevant semantic classification of messages, and the reuse of the ontological knowledge base by other applications is also guaranteed. Interoperability and knowledge exchange between systems are likewise ensured by the integrated ontology. In order to ensure its reuse and interoperability with systems requesting its classification service, the implementation of our semantic classifier tool is based on SOA; this is explained and tested in this work.
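
    The LSA step this abstract relies on can be sketched with a toy term-document matrix: messages are projected into a low-rank latent space where topically similar forum posts land close together. The vocabulary and counts are invented for illustration; the paper's actual matrix is built from ontology-derived terms:

```python
import numpy as np

# Toy LSA over three forum messages. Rows are terms, columns are messages;
# SVD truncation projects messages into a 2-D latent topic space.

terms = ["exam", "grade", "ontology", "owl"]  # illustrative vocabulary
docs = np.array([
    [2, 0, 0],
    [1, 0, 0],
    [0, 2, 1],
    [0, 1, 2],
], dtype=float)

U, s, Vt = np.linalg.svd(docs, full_matrices=False)
k = 2                                     # keep the two strongest components
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T    # message coordinates in latent space

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Messages 1 and 2 both discuss ontologies, so they land close together,
# while message 0 (about exams) ends up far from both.
print(cos(doc_vecs[1], doc_vecs[2]) > cos(doc_vecs[0], doc_vecs[1]))  # True
```

    Classification then amounts to comparing a new message's latent vector against labeled category centroids.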

  17. Building Semantically Interoperable EHR Systems Using International Nomenclatures and Enterprise Programming Techniques

    Nagy, Miroslav; Hanzlíček, Petr; Přečková, Petra; Kolesa, Petr; Mišúr, J.; Dioszegi, Matěj; Zvárová, Jana

    Amsterdam: IOS Press, 2008 - (Blobel, B.; Pharow, P.; Zvárová, J.; Lopez, D.), s. 105-110 ISBN 978-1-58603-834-2. [CeHR: International Conference 2007. eHealth: Combining Health Telematics, Telemedicine, Biomedical Engineering and Bioinformatics to the Edge. Regensburg (DE), 02.12.2007-05.12.2007] R&D Projects: GA AV ČR 1ET200300413 Institutional research plan: CEZ:AV0Z10300504 Keywords : electronic health record * semantic interoperability * information storage and retrieval Subject RIV: IN - Informatics, Computer Science

  18. Implementation of a metadata architecture and knowledge collection to support semantic interoperability in an enterprise data warehouse.

    Dhaval, Rakesh; Borlawsky, Tara; Ostrander, Michael; Santangelo, Jennifer; Kamal, Jyoti; Payne, Philip R O

    2008-01-01

    In order to enhance interoperability between enterprise systems, and improve data validity and reliability throughout The Ohio State University Medical Center (OSUMC), we have initiated the development of an ontology-anchored metadata architecture and knowledge collection for our enterprise data warehouse. The metadata and corresponding semantic relationships stored in the OSUMC knowledge collection are intended to promote consistency and interoperability across the heterogeneous clinical, research, business and education information managed within the data warehouse. PMID:18999040

  19. Case Study for Integration of an Oncology Clinical Site in a Semantic Interoperability Solution based on HL7 v3 and SNOMED-CT: Data Transformation Needs.

    Ibrahim, Ahmed; Bucur, Anca; Perez-Rey, David; Alonso, Enrique; de Hoog, Matthy; Dekker, Andre; Marshall, M Scott

    2015-01-01

    This paper describes the data transformation pipeline defined to support the integration of a new clinical site in a standards-based semantic interoperability environment. The available datasets combined structured and free-text patient data in Dutch, collected in the context of radiation therapy in several cancer types. Our approach aims at both efficiency and data quality. We combine custom-developed scripts, standard tools and manual validation by clinical and knowledge experts. We identified key challenges emerging from the several sources of heterogeneity in our case study (systems, language, data structure, clinical domain) and implemented solutions that we will further generalize for the integration of new sites. We conclude that the required effort for data transformation is manageable which supports the feasibility of our semantic interoperability solution. The achieved semantic interoperability will be leveraged for the deployment and evaluation at the clinical site of applications enabling secondary use of care data for research. This work has been funded by the European Commission through the INTEGRATE (FP7-ICT-2009-6-270253) and EURECA (FP7-ICT-2011-288048) projects. PMID:26306242

  20. Importance of achieving semantic interoperability for national health information systems

    Evelyn Johanna Sophia Hovenga

    2008-01-01

    Full Text Available This article provides a general overview of the relationships between government health policy makers, healthcare providers, and the adoption of healthcare information, communication and knowledge technologies. These technologies include the adoption of national health language structures and health informatics standards. These reflections are based on the authors' observations and on many years of international participation in standards development and in the development and implementation of government information and communication technologies. A considerable number of critical concepts appear to be poorly understood by key decision makers; alternatively, political agendas and the need to serve a variety of vested interests continue to dominate. It is concluded that we must establish and actively promote a solid professional example for the adoption of a national health informatics strategy based on the best available scientific evidence to support a sustainable health system.

  1. A Joint Initiative to Support the Semantic Interoperability within the GIIDA Project

    Plini, Paolo; De Santis, Valentina; Uricchio, Vito F; De Carlo, Dario; D'Arpa, Stefania; De Martino, Monica; Albertoni, Riccardo

    2010-01-01

    The GIIDA project aims to develop a digital infrastructure for spatial information within CNR. It is foreseen to use semantic-oriented technologies to ease information modeling and connection, according to international standards like ISO/IEC 11179. Complex information management systems like GIIDA will benefit from the use of terminological tools like thesauri, which make available a reference lexicon for the indexing and retrieval of information. Within GIIDA the goal is to make available the EARTh thesaurus (Environmental Applications Reference Thesaurus), developed by the CNR-IIA-EKOLab. A web-based software tool, developed by the CNR-Water Research Institute (IRSA), was implemented to allow consultation and use of the thesaurus through the web. This service is a useful tool to ensure interoperability between the thesaurus and other indexing systems, with the idea of cooperating to develop a comprehensive system of knowledge organization that could be defined as integrated, open, multi-functi...

  2. A Proof-of-Concept for Semantically Interoperable Federation of IoT Experimentation Facilities.

    Lanza, Jorge; Sanchez, Luis; Gomez, David; Elsaleh, Tarek; Steinke, Ronald; Cirillo, Flavio

    2016-01-01

    The Internet-of-Things (IoT) is unanimously identified as one of the main pillars of future smart scenarios. The potential of IoT technologies and deployments has been already demonstrated in a number of different application areas, including transport, energy, safety and healthcare. However, despite the growing number of IoT deployments, the majority of IoT applications tend to be self-contained, thereby forming application silos. A lightweight data-centric integration and combination of these silos presents several challenges that still need to be addressed. Indeed, the ability to combine and synthesize data streams and services from diverse IoT platforms and testbeds holds the promise to increase the potential of smart applications in terms of size, scope and targeted business context. In this article, a proof-of-concept implementation that federates two different IoT experimentation facilities by means of semantic-based technologies will be described. The specification and design of the implemented system and information models will be described, together with the practical details of the developments carried out and their integration with the existing IoT platforms supporting the aforementioned testbeds. Overall, the system described in this paper demonstrates that it is possible to open new horizons in the development of IoT applications and experiments at a global scale, that transcend the (silo) boundaries of individual deployments, based on the semantic interconnection and interoperability of diverse IoT platforms and testbeds. PMID:27367695

  3. Towards a conceptual framework for user-driven semantic metadata interoperability in digital libraries: A social constructivist approach

    Alemu, Getaneh; Stevens, Brett; Ross, Penny

    2012-01-01

    Purpose – With the aim of developing a conceptual framework to facilitate semantic metadata interoperability, this paper explores overarching conceptual issues on how traditional library information organization schemes such as Online Public Access Catalogues (OPACs), taxonomies, thesauri, and ontologies on the one hand, and Web 2.0 technologies such as social tagging (folksonomies) on the other, can be harnessed to provide users with satisfying experiences. Design/methodology/approach – This p...

  4. Proposed Information Sharing Security Approach for Security Personnels, Vertical Integration, Semantic Interoperability Architecture and Framework for Digital Government

    Headayetullah, Md; Biswas, Sanjay; Puthal, B

    2011-01-01

    This paper mainly depicts a conceptual overview of vertical integration, semantic interoperability architectures such as the Educational Sector Architectural Framework (ESAF) for the New Zealand government, and different interoperability framework solutions for digital government. In this paper, we try to develop a secure information sharing approach for digital government to improve homeland security. This approach is a role- and cooperation-based approach for security personnel of different government departments. In order to run a successful digital government in any country in the world, it is necessary to interact with citizens and to share secure information via different networks among citizens or other governments. Consequently, in order to enable users to cooperate and share information seamlessly and transparently across different networks and databases universally, a safe and trusted information-sharing environment has been recognized as a very important requirement and t...

  5. Semantic Interoperability for Computational Mineralogy: Experiences of the eMinerals Consortium

    Walker, A. M.; White, T. O.; Dove, M. T.; Bruin, R. P.; Couch, P. A.; Tyer, R. P.

    2006-12-01

    The use of atomic scale computer simulation of minerals to obtain information for geophysics and environmental science has grown enormously over the past couple of decades. It is now routine to probe mineral behavior in the Earth's deep interior and in the surface environment by borrowing methods and simulation codes from computational chemistry and physics. It is becoming increasingly important to use methods embodied in more than one of these codes to solve any single scientific problem. However, scientific codes are rarely designed for easy interoperability and data exchange; data formats are often code-specific, poorly documented and fragile, liable to frequent change between software versions, and even compiler versions. This means that the scientist's simple desire to use the methodological approaches offered by multiple codes is frustrated, and even the sharing of data between collaborators becomes fraught with difficulties. The eMinerals consortium was formed in the early stages of the UK eScience program with the aim of developing the tools needed to apply atomic scale simulation to environmental problems in a grid-enabled world, and to harness the computational power offered by grid technologies to address some outstanding mineralogical problems. One example of the kind of problem we can tackle is the origin of the compressibility anomaly in silica glass. By passing data directly between simulation and analysis tools we were able to probe this effect in more detail than has previously been possible and have shown how the anomaly is related to the details of the amorphous structure. In order to approach this kind of problem we have constructed a mini-grid, a small scale and extensible combined compute- and data-grid that allows the execution of many calculations in parallel, and the transparent storage of semantically-rich marked-up result data. Importantly, we automatically capture multiple kinds of metadata and key results from each calculation. We
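
    The "transparent storage of semantically-rich marked-up result data" can be sketched as a tagged record that captures which code produced a run and its key results with units. JSON stands in here for the consortium's actual XML markup, and all property names are illustrative:

```python
import json

# Sketch of capturing key results plus provenance metadata from one
# simulation run as a semantically tagged record (names are illustrative;
# the eMinerals tools used marked-up XML rather than JSON).

def capture_run(code, version, results):
    """Package results as (property, value, unit) triples with provenance."""
    return json.dumps({
        "agent": {"code": code, "version": version},
        "results": [
            {"property": name, "value": value, "unit": unit}
            for name, value, unit in results
        ],
    })

record = capture_run("SiO2-MD", "2.1", [("bulk_modulus", 36.9, "GPa")])
print(json.loads(record)["results"][0]["property"])  # bulk_modulus
```

    Because every value carries its property name and unit, analysis tools can consume results from different simulation codes without per-code parsers.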

  6. An Integrated Framework to Achieve Interoperability in Person-Centric Health Management

    Fabio Vergari

    2011-01-01

    Full Text Available The need for high-quality out-of-hospital healthcare is a known socioeconomic problem. Exploiting ICT's evolution, ad-hoc telemedicine solutions have been proposed in the past. Integrating such ad-hoc solutions in order to cost-effectively support the entire healthcare cycle is still a research challenge. In order to handle the heterogeneity of relevant information and to overcome the fragmentation of out-of-hospital instrumentation in person-centric healthcare systems, a shared and open source interoperability component can be adopted, which is ontology driven and based on the semantic web data model. The feasibility and the advantages of the proposed approach are demonstrated by presenting the use case of real-time monitoring of patients' health and their environmental context.

  7. Towards an interoperability certification method for semantic federated experimental IoT testbeds

    Zhao, Mengxuan; Kefalakis, Nikos; Grace, Paul; Soldatos, John; Le-Gall, Franck; Cousin, Phillippe

    2016-01-01

    IoT deployments and their related experiments tend to be highly heterogeneous, leading to fragmented and non-interoperable silo solutions. Yet there is a growing need to interconnect such experiments to create rich infrastructures that will underpin the next generation of cross-sector IoT applications, in particular those using massive amounts of data. While research has been carried out on IoT testbeds and on interoperability for some infrastructures, less has been done on the data. In this paper, w...

  8. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran.

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E

    2014-01-01

    Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) is an essential factor in utilizing immunization CDS systems. Service-oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical across systems. CDS-generated reports accorded with immunization guidelines, and the calculations for next visit times were accurate. Interoperability is rare or nonexistent between IISs. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information. PMID:25954452
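
    The record exchange can be sketched as a round trip through a simplified HL7 v3-style XML payload. The element names below are illustrative and far simpler than the real HL7 v3 immunization schema; the point is that both IISs parse the same structured message:

```python
import xml.etree.ElementTree as ET

# Sketch of serializing and parsing an immunization record as a simplified
# HL7 v3-style XML payload for a SOA service call (illustrative elements,
# not the actual HL7 v3 schema).

def build_message(patient_id, vaccine_code, admin_date):
    root = ET.Element("substanceAdministration")
    ET.SubElement(root, "patient", id=patient_id)
    ET.SubElement(root, "consumable", code=vaccine_code)
    ET.SubElement(root, "effectiveTime", value=admin_date)
    return ET.tostring(root, encoding="unicode")

def parse_message(xml_text):
    root = ET.fromstring(xml_text)
    return {
        "patient": root.find("patient").get("id"),
        "vaccine": root.find("consumable").get("code"),
        "date": root.find("effectiveTime").get("value"),
    }

msg = build_message("INF-0042", "BCG", "20140301")
print(parse_message(msg))  # round-trips to the original record
```

    A consistent round trip like this is what makes the "always identical" retrieval result across the two IISs achievable.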

  9. Quality measurement of semantic standards

    Folmer, E.J.A.; Oude Luttighuis, P.H.W.M.; Hillegersberg, van, R.

    2010-01-01

    The quality of semantic standards is unaddressed in current research, while there is an explicit need from standards developers. The business importance is evident, since the quality of standards will have an impact on their diffusion and on the interoperability achieved in practice. An instrument to measure the quality of semantic standards is designed to contribute to the knowledge domain and to standards developers, and might ultimately lead to improved interoperability. This instrument is iteratively designed with multiple...

  10. Semantic interoperability of ambient intelligent medical devices and e-health systems

    Ali, Safdar

    2010-01-01

    State-of-the-art mobile medical devices provide important therapeutic functions with valuable information of treatment patterns at the point-of-care. However, such devices mostly remain independent islands of information being unable to share the medical data they gather with other medical devices, hospital information system or laboratory information system on a real-time basis. Standards organizations such as IEEE have made various attempts to resolve the medical devices' interoperability p...

  11. Community-Driven Initiatives to Achieve Interoperability for Ecological and Environmental Data

    Madin, J.; Bowers, S.; Jones, M.; Schildhauer, M.

    2007-12-01

    interoperability by describing the semantics of data at the level of observation and measurement (rather than the traditional focus at the level of the data set) and will define the necessary specifications and technologies to facilitate semantic interpretation and integration of observational data for the environmental sciences. As such, this initiative will focus on unifying the various existing approaches for representing and describing observation data (e.g., SEEK's Observation Ontology, CUAHSI's Observation Data Model, NatureServe's Observation Data Standard, to name a few). Products of this initiative will be compatible with existing standards and build upon recent advances in knowledge representation (e.g., W3C's recommended Web Ontology Language, OWL) that have demonstrated practical utility in enhancing scientific communication and data interoperability in other communities (e.g., the genomics community). A community-sanctioned, extensible, and unified model for observational data will support metadata standards such as EML while reducing the "babel" of scientific dialects that currently impede effective data integration, which will in turn provide a strong foundation for enabling cross-disciplinary synthetic research in the ecological and environmental sciences.
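
    Observation-level (rather than dataset-level) semantics can be sketched with a record type that carries its entity, characteristic, and unit explicitly, so records from different datasets can be pooled safely. The field names are illustrative, not the actual OBOE or O&M schemas:

```python
from dataclasses import dataclass

# Sketch of an observation-level data model: every measurement states what
# was observed, what was measured, and in which unit (illustrative fields).

@dataclass(frozen=True)
class Observation:
    entity: str          # what was observed, e.g. a species name
    characteristic: str  # what was measured, e.g. "height"
    value: float
    unit: str            # e.g. "m"

def mergeable(a, b):
    """Two observations can be pooled only if their semantics line up."""
    return (a.entity, a.characteristic, a.unit) == (b.entity, b.characteristic, b.unit)

o1 = Observation("Quercus rubra", "height", 12.0, "m")
o2 = Observation("Quercus rubra", "height", 9.5, "m")
o3 = Observation("Quercus rubra", "height", 950.0, "cm")
print(mergeable(o1, o2))  # True
print(mergeable(o1, o3))  # False: the unit mismatch must be resolved first
```

    Making semantics explicit at this granularity is what lets integration tools detect the unit mismatch automatically instead of silently averaging meters with centimeters.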

  12. The Semantic Management of Environmental Resources within the Interoperable Context of the EuroGEOSS: Alignment of GEMET and the GEOSS SBAs

    Cialone, Claudia; Stock, Kristin

    2010-05-01

    EuroGEOSS is a European Commission funded project. It aims at improving scientific understanding of the complex mechanisms driving the changes affecting our planet, and at identifying and establishing interoperable arrangements between environmental information systems. These systems would be sustained and operated by organizations with a clear mandate and resources, and made available following the specifications of existing frameworks such as GEOSS (the Global Earth Observation System of Systems) and INSPIRE (the Infrastructure for Spatial Information in the European Community). The EuroGEOSS project's infrastructure focuses on three thematic areas: forestry, drought and biodiversity. One of the important activities in the project is the retrieval, parsing and harmonization of the large amount of heterogeneous environmental data available at local, regional and global levels across these strategic areas. The challenge is to render it semantically and technically interoperable in a simple way. An initial step in achieving this semantic and technical interoperability involves the selection of appropriate classification schemes (for example, thesauri, ontologies and controlled vocabularies) to describe the resources in the EuroGEOSS framework. These classifications become a crucial part of the interoperable framework scaffolding because they allow data providers to describe their resources and thus support resource discovery, execution and orchestration of varying levels of complexity. However, at present, given the diverse range of environmental thesauri, controlled vocabularies and ontologies and the large number of resources provided by project participants, the selection of appropriate classification schemes involves a number of considerations. First of all, there is the semantic difficulty of selecting classification schemes that contain concepts relevant to each thematic area. Secondly, EuroGEOSS is intended to accommodate a number of

  13. Using architectures for semantic interoperability to create journal clubs for emergency response

    Powell, James E [Los Alamos National Laboratory; Collins, Linn M [Los Alamos National Laboratory; Martinez, Mark L B [Los Alamos National Laboratory

    2009-01-01

    In certain types of 'slow burn' emergencies, careful accumulation and evaluation of information can offer a crucial advantage. The SARS outbreak in the first decade of the 21st century was such an event, and ad hoc journal clubs played a critical role in assisting scientific and technical responders in identifying and developing various strategies for halting what could have become a dangerous pandemic. This research-in-progress paper describes a process for leveraging emerging semantic web and digital library architectures and standards to (1) create a focused collection of bibliographic metadata, (2) extract semantic information, (3) convert it to the Resource Description Framework /Extensible Markup Language (RDF/XML), and (4) integrate it so that scientific and technical responders can share and explore critical information in the collections.
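
    Step (3) of this pipeline, converting extracted bibliographic metadata to RDF/XML, can be sketched with a hand-rolled Dublin Core serialization. A real deployment would use an RDF library, and the record values below are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Sketch of emitting RDF/XML for one bibliographic record using Dublin Core
# properties (minimal hand-rolled serialization, not a full RDF toolkit).

RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
DC = "http://purl.org/dc/elements/1.1/"

def record_to_rdfxml(uri, title, creator):
    ET.register_namespace("rdf", RDF)
    ET.register_namespace("dc", DC)
    root = ET.Element(f"{{{RDF}}}RDF")
    desc = ET.SubElement(root, f"{{{RDF}}}Description", {f"{{{RDF}}}about": uri})
    ET.SubElement(desc, f"{{{DC}}}title").text = title
    ET.SubElement(desc, f"{{{DC}}}creator").text = creator
    return ET.tostring(root, encoding="unicode")

xml = record_to_rdfxml("urn:pmid:12345", "SARS coronavirus genomics", "Doe, J.")
print(xml)
```

    Once the records are in RDF, merging contributions from multiple responders into one queryable collection is a graph union rather than a format-negotiation problem.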

  14. 基于本体的空间信息语义互操作研究%Geospatial Semantic Interoperability Based on Ontology

    王艳东; 龚健雅; 吴小凰

    2007-01-01

    In the GIS field, a great variety of information from different domains is involved in solving actual problems. But spatial information is usually stored in diverse spatial databases and manipulated by different GIS platforms. Semantic heterogeneity arises from differences in how concepts are interpreted across various GIS implementations. This results in gaps in information acquisition and understanding for spatial data sharing and use. An ontology-based model for spatial information semantic interoperability is put forward after a comprehensive review of progress in ontology theory, methodology and application research in the GIS domain.

  15. PROPOSED INFORMATION SHARING SECURITY APPROACH FOR SECURITY PERSONNELS, VERTICAL INTEGRATION, SEMANTIC INTEROPERABILITY ARCHITECTURE AND FRAMEWORK FOR DIGITAL GOVERNMENT

    Md.Headayetullah

    2011-06-01

    Full Text Available This paper mainly depicts a conceptual overview of vertical integration, semantic interoperability architectures such as the Educational Sector Architectural Framework (ESAF) for the New Zealand government, and different interoperability framework solutions for digital government. In this paper, we try to develop a secure information sharing approach for digital government to improve homeland security. This approach is a role- and cooperation-based approach for security personnel of different government departments. In order to run a successful digital government in any country in the world, it is necessary to interact with citizens and to share secure information via different networks among citizens or other governments. Consequently, in order to enable users to cooperate and share information seamlessly and transparently across different networks and databases universally, a safe and trusted information-sharing environment has been recognized as a very important requirement for advancing homeland security endeavors. The key incentive behind this research is to build a secure and trusted information-sharing approach for government departments. This paper presents an efficient role- and cooperation-based information sharing approach for the safe exchange of confidential and privileged information among security personnel and government departments within national boundaries by means of public key cryptography. The expanded approach makes use of a cryptographic hash function, a public key cryptosystem, and a unique and complex mapping function for securely exchanging secret information. Moreover, the proposed approach facilitates privacy-preserving information sharing with possible restrictions based on the rank of the security personnel. The proposed role- and collaboration-based information sharing approach ensures protected and up-to-date information sharing between security personnel and government
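
    The hash-plus-rank mechanism described here can be sketched with just the integrity and clearance checks; the public-key encryption step is deliberately abstracted away, and the ranks, messages, and function names are all illustrative:

```python
import hashlib

# Minimal sketch of rank-restricted sharing with an integrity digest.
# The paper's public-key encryption and mapping function are omitted; only
# the SHA-256 integrity check and rank-based access rule are shown.

RANKS = {"officer": 1, "inspector": 2, "director": 3}

def package(message, min_rank):
    """Attach a digest and a minimum-clearance requirement to a message."""
    digest = hashlib.sha256(message.encode()).hexdigest()
    return {"body": message, "digest": digest, "min_rank": min_rank}

def open_package(pkg, recipient_rank):
    if RANKS[recipient_rank] < RANKS[pkg["min_rank"]]:
        return None  # insufficient clearance for this recipient
    # Verify the message was not tampered with in transit.
    if hashlib.sha256(pkg["body"].encode()).hexdigest() != pkg["digest"]:
        raise ValueError("integrity check failed")
    return pkg["body"]

pkg = package("border alert #7", "inspector")
print(open_package(pkg, "director"))  # 'border alert #7'
print(open_package(pkg, "officer"))   # None
```

    A real system would sign the digest with the sender's private key and encrypt the body for the recipient; the plain digest here only illustrates the integrity step.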

  16. Quality model for semantic IS standards

    Folmer, E.J.A.

    2011-01-01

    Semantic IS (Information Systems) standards are essential for achieving interoperability between organizations. However, a recent survey suggests that the full benefits of standards are not being achieved, due to quality issues. This paper presents a quality model for semantic IS standards that should

  17. An HL7-CDA wrapper for facilitating semantic interoperability to rule-based Clinical Decision Support Systems.

    Sáez, Carlos; Bresó, Adrián; Vicente, Javier; Robles, Montserrat; García-Gómez, Juan Miguel

    2013-03-01

    The success of Clinical Decision Support Systems (CDSS) greatly depends on their capability of being integrated in Health Information Systems (HIS). Several proposals have been published to date to permit a CDSS to gather patient data from an HIS. Some base the CDSS data input on the HL7 reference model; however, they are tailored to specific CDSS or clinical guideline technologies, or do not focus on standardizing the CDSS resultant knowledge. We propose a solution for facilitating semantic interoperability for rule-based CDSS, focusing on standardized input and output documents conforming to an HL7-CDA wrapper. We define the HL7-CDA restrictions in an HL7-CDA implementation guide. Patient data and rule inference results are mapped respectively to and from the CDSS by means of a binding method based on an XML binding file. As an independent clinical document, the results of a CDSS can have clinical and legal validity. The proposed solution is being applied in a CDSS providing patient-specific recommendations for the care management of outpatients with diabetes mellitus. PMID:23199936
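The binding-file idea above can be sketched as follows: a table of document paths bound to rule-engine input variables drives the mapping from a clinical document into CDSS inputs. The document fragment and variable names here are illustrative only (not a conformant HL7-CDA document or the paper's actual binding format).

```python
import xml.etree.ElementTree as ET

# Toy CDA-like fragment (illustrative; a real HL7-CDA document is far richer).
cda = """<ClinicalDocument>
  <patient><birthYear>1961</birthYear></patient>
  <observation code="glucose"><value>7.8</value></observation>
</ClinicalDocument>"""

# Minimal stand-in for the XML binding file: each CDSS input variable
# is bound to a path in the clinical document.
binding = {
    "birth_year": "./patient/birthYear",
    "glucose": "./observation[@code='glucose']/value",
}

def bind_inputs(doc_text: str, binding: dict) -> dict:
    """Map patient data from the document into CDSS input variables."""
    root = ET.fromstring(doc_text)
    return {var: root.find(path).text for var, path in binding.items()}

inputs = bind_inputs(cda, binding)
print(inputs)  # the rule engine consumes these named inputs
```

The same table, read in reverse, could serialize rule inference results back into a result document, which is what gives the CDSS output independent documentary status.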

  18. Geospatial semantic web

    Zhang, Chuanrong; Li, Weidong

    2015-01-01

    This book covers key issues related to the Geospatial Semantic Web, including geospatial web services for spatial data interoperability; geospatial ontology for semantic interoperability; ontology creation, sharing, and integration; querying knowledge and information from heterogeneous data sources; interfaces for the Geospatial Semantic Web; VGI (Volunteered Geographic Information) and the Geospatial Semantic Web; challenges of the Geospatial Semantic Web; and development of Geospatial Semantic Web applications. This book also describes state-of-the-art technologies that attempt to solve these problems, such

  19. Achieving control and interoperability through unified model-based systems and software engineering

    Rasmussen, Robert; Ingham, Michel; Dvorak, Daniel

    2005-01-01

    Control and interoperation of complex systems is one of the most difficult challenges facing NASA's Exploration Systems Mission Directorate. An integrated but diverse array of vehicles, habitats, and supporting facilities, evolving over the long course of the enterprise, must perform ever more complex tasks while moving steadily away from the sphere of ground support and intervention.

  20. Semantic Conflicts Reconciliation as a Viable Solution for Semantic Heterogeneity Problems

    Walaa S. Ismail

    2013-05-01

    Full Text Available Achieving semantic interoperability is a current challenge in the field of data integration: semantic conflicts occur when the participating sources and receivers use different or implicit data assumptions. Providing a framework that automatically detects and resolves semantic conflicts is a daunting task for many reasons: it should preserve the local autonomy of the integrated sources, and it should provide a standard query language for accessing the integrated data on a global basis. Many existing traditional and ontology-based approaches have tried to achieve semantic interoperability, but they have drawbacks that make them inappropriate for integrating data from a large number of participating sources. We propose the semantic conflicts reconciliation (SCR) framework, an ontology-based system in which all data semantics are explicitly described in the knowledge representation phase and automatically taken into account through the interpretation mediation service phase, so conflicts are detected and resolved automatically at query time.

  1. A Formal Approach to Protocol Interoperability Testing

    郝瑞兵; 吴建平

    1998-01-01

    Protocol interoperability testing is an important means to ensure the interconnection and interoperation of protocol products. In this paper, we propose a formal approach to protocol interoperability testing based on the operational semantics of Concurrent TTCN. We define Concurrent TTCN's operational semantics using a Labeled Transition System, and describe the interoperability test execution and test verdict based on Concurrent TTCN. This approach is very helpful for the formation of a formal interoperability testing theory and the construction of general interoperability testing systems.
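A labeled transition system of the kind the abstract mentions can be sketched in a few lines. This is a generic illustration of the formalism, not the paper's Concurrent TTCN semantics; the states and actions are invented.

```python
# A minimal labeled transition system (LTS): states, actions, and a
# transition relation, here deterministic for simplicity.
class LTS:
    def __init__(self, transitions):
        # transitions: {(state, action): next_state}
        self.transitions = transitions

    def run(self, start, trace):
        """Execute a trace; return the final state, or None if a step is refused."""
        state = start
        for action in trace:
            state = self.transitions.get((state, action))
            if state is None:
                return None
        return state

# An interoperability test passes only if both implementations accept the trace.
impl_a = LTS({("idle", "connect"): "open", ("open", "data"): "open",
              ("open", "close"): "idle"})
impl_b = LTS({("idle", "connect"): "open", ("open", "close"): "idle"})

trace = ["connect", "data", "close"]
verdict = "pass" if impl_a.run("idle", trace) and impl_b.run("idle", trace) else "fail"
print(verdict)  # impl_b refuses the "data" step, so the verdict is "fail"
```

Test verdicts in the formal setting are derived from exactly this kind of acceptance check over the composed behavior of the implementations under test.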

  2. Quality measurement of semantic standards

    Folmer, E.J.A.; Oude Luttighuis, P.H.W.M.; Hillegersberg, J. van

    2010-01-01

    The quality of semantic standards is unaddressed in current research, even though there is an explicit need from standard developers. The business importance is evident, since the quality of standards will have an impact on their diffusion and on the interoperability achieved in practice. An instrument to measure the quality of

  3. Benchmarking Semantic Web technology

    García-Castro, Raúl

    2008-01-01

    Semantic Web technologies need to interchange ontologies for further use. Due to the heterogeneity in the knowledge representation formalisms of the different existing technologies, interoperability is a problem in the Semantic Web, and the limits of the interoperability of current technologies are as yet unknown. A massive improvement of the interoperability of current Semantic Web technologies, or of any other characteristic of these technologies, requires continuous evaluations that should be de...

  4. Tuning Ontology Interoperability

    Giunchiglia, Fausto; Pan, Jeff Z.; Serafini, Luciano

    2005-01-01

    The main contribution of this paper is the notion of ontology space, which allows us to move from an ontology-centric vision to a constellation-centric vision of the Web, where multiple ontologies and their interactions can be explicitly modeled and studied. This, in turn, allows us to study how OWL ontologies can interoperate, and, in particular, to provide two main results. The first is a formalization of the intended semantics of the OWL importing operator as opaque semantics. This result ...

  5. An HL7-CDA wrapper for facilitating semantic interoperability to rule-based Clinical Decision Support Systems

    Sáez Silvestre, Carlos; BRESÓ GUARDADO, ADRIÁN; Vicente Robledo, Javier; Robles Viejo, Montserrat; García Gómez, Juan Miguel

    2013-01-01

    The success of Clinical Decision Support Systems (CDSS) greatly depends on their capability of being integrated in Health Information Systems (HIS). Several proposals have been published to date to permit a CDSS to gather patient data from an HIS. Some base the CDSS data input on the HL7 reference model; however, they are tailored to specific CDSS or clinical guideline technologies, or do not focus on standardizing the CDSS resultant knowledge. We propose a solution for facilitating semantic int...

  6. The XML and Semantic Web Worlds: Technologies, Interoperability and Integration. A Survey of the State of the Art

    Bikakis, Nikos; Tsinaraki, Chrisa; Gioldasis, Nektarios; Stavrakantonakis, Ioannis; Christodoulakis, Stavros

    2016-01-01

    In the context of the emergent Web of Data, a large number of organizations, institutes and companies (e.g., DBpedia, Geonames, PubMed, ACM, IEEE, NASA, BBC) adopt the Linked Data practices and publish their data utilizing Semantic Web (SW) technologies. On the other hand, the dominant standard for information exchange on the Web today is XML. Many international standards (e.g., Dublin Core, MPEG-7, METS, TEI, IEEE LOM) have been expressed in XML Schema, resulting in a large number of XML datas...

  7. The HL7-OMG Healthcare Services Specification Project: Motivation, Methodology, and Deliverables for Enabling a Semantically Interoperable Service-oriented Architecture for Healthcare

    Kawamoto, Kensaku; Honey, Alan; Rubin, Ken

    2009-01-01

    Context: The healthcare industry could achieve significant benefits through the adoption of a service-oriented architecture (SOA). The specification and adoption of standard software service interfaces will be critical to achieving these benefits. Objective: To develop a replicable, collaborative framework for standardizing the interfaces of software services important to healthcare. Design: Iterative, peer-reviewed development of a framework for generating interoperable service specifications that build on existing and ongoing standardization efforts. The framework was created under the auspices of the Healthcare Services Specification Project (HSSP), which was initiated in 2005 as a joint initiative between Health Level Seven (HL7) and the Object Management Group (OMG). In this framework, known as the HSSP Service Specification Framework, HL7 identifies candidates for service standardization and defines normative Service Functional Models (SFMs) that specify the capabilities and conformance criteria for these services. OMG then uses these SFMs to generate technical service specifications as well as reference implementations. Measurements: The ability of the framework to support the creation of multiple, interoperable service specifications useful for healthcare. Results: Functional specifications have been defined through HL7 for four services: the Decision Support Service; the Entity Identification Service; the Clinical Research Filtered Query Service; and the Retrieve, Locate, and Update Service. Technical specifications and commercial implementations have been developed for two of these services within OMG. Furthermore, three additional functional specifications are being developed through HL7. Conclusions: The HSSP Service Specification Framework provides a replicable and collaborative approach to defining standardized service specifications for healthcare. PMID:19717796

  8. Towards technical interoperability in telemedicine.

    Craft, Richard Layne, II

    2004-05-01

    For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.

  9. Data interchange standards in healthcare IT--computable semantic interoperability: now possible but still difficult, do we really need a better mousetrap?

    Mead, Charles N

    2006-01-01

    The following article on HL7 Version 3 will give readers a glimpse into the significant differences between "what came before" -- that is, HL7 Version 2.x -- and "what today and the future will bring," which is the HL7 Version 3 family of data interchange specifications. The difference between V2.x and V3 is significant, and it exists because the various stakeholders in the HL7 development process believe that the increased depth, breadth, and, to some degree, complexity that characterize V3 are necessary to solve many of today's and tomorrow's increasingly wide, deep and complex healthcare information data interchange requirements. Like many healthcare or technology discussions, this discussion has its own vocabulary of somewhat obscure, but not difficult, terms. This article will define the minimum set necessary for readers to appreciate the relevance and capabilities of HL7 Version 3, including how it differs from HL7 Version 2. After that, there will be a brief overview of the primary motivations for HL7 Version 3 in the presence of the unequivocal success of Version 2. In this context, the article will give readers an overview of one of the prime constructs of Version 3, the Reference Information Model (RIM). There are "four pillars that are necessary but not sufficient to obtain computable semantic interoperability." These four pillars -- a cross-domain information model; a robust data type specification; a methodology for separating domain-specific terms from, as well as binding them to, the common model; and a top-down interchange specification methodology and tools for using the first three and defining Version 3 specifications -- collectively comprise the "HL7 Version 3 Toolkit." Further, this article will present a list of questions and answers to help readers assess the scope and complexity of the problems facing healthcare IT today, which will further enlighten readers on the "reality" of HL7 Version 3. The article will conclude with a "pseudo

  10. Semantic Registration and Discovery System of Subsystems and Services within an Interoperable Coordination Platform in Smart Cities.

    Rubio, Gregorio; Martínez, José Fernán; Gómez, David; Li, Xin

    2016-01-01

    Smart subsystems like traffic, Smart Homes, the Smart Grid, outdoor lighting, etc. are built in many urban areas, each with a set of services that are offered to citizens. These subsystems are managed by self-contained embedded systems. However, coordination and cooperation between them are scarce. An integration of these systems which truly represents a "system of systems" could introduce more benefits, such as allowing the development of new applications and collective optimization. The integration should allow maximum reusability of available services provided by entities (e.g., sensors or Wireless Sensor Networks). Thus, it is of major importance to facilitate the discovery and registration of available services and subsystems in an integrated way. Therefore, an ontology-based and automatic system for subsystem and service registration and discovery is presented. Using this proposed system, heterogeneous subsystems and services could be registered and discovered in a dynamic manner with additional semantic annotations. In this way, users are able to build customized applications across different subsystems by using available services. The proposed system has been fully implemented and a case study is presented to show the usefulness of the proposed method. PMID:27347965
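The ontology-backed registration and discovery described above can be sketched with a toy subsumption hierarchy: registering services under narrow concepts and discovering them via broader ones is what lets applications span subsystems. The concepts, service identifiers, and subsystem names below are invented for illustration, not the paper's actual ontology.

```python
# Toy concept hierarchy: child concept -> parent concept.
SUBCLASS_OF = {
    "TrafficSensor": "Sensor",
    "LightingController": "Actuator",
    "Sensor": "Device",
    "Actuator": "Device",
}

def is_a(concept, ancestor):
    """Walk the subsumption chain to decide whether concept is an ancestor's kind."""
    while concept is not None:
        if concept == ancestor:
            return True
        concept = SUBCLASS_OF.get(concept)
    return False

registry = []

def register(service_id, concept, subsystem):
    """Register a service with a semantic annotation (its concept)."""
    registry.append({"id": service_id, "concept": concept, "subsystem": subsystem})

def discover(concept):
    """Discover all registered services whose concept is subsumed by the query."""
    return [s["id"] for s in registry if is_a(s["concept"], concept)]

register("traffic/flow-01", "TrafficSensor", "traffic")
register("lighting/dim-07", "LightingController", "outdoor-lighting")
print(discover("Device"))  # finds services across both subsystems
```

Querying for the broad concept "Device" returns services from both subsystems, which is the cross-subsystem reuse the integration aims for.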

  11. Semantic Registration and Discovery System of Subsystems and Services within an Interoperable Coordination Platform in Smart Cities

    Gregorio Rubio

    2016-06-01

    Full Text Available Smart subsystems like traffic, Smart Homes, the Smart Grid, outdoor lighting, etc. are built in many urban areas, each with a set of services that are offered to citizens. These subsystems are managed by self-contained embedded systems. However, coordination and cooperation between them are scarce. An integration of these systems which truly represents a “system of systems” could introduce more benefits, such as allowing the development of new applications and collective optimization. The integration should allow maximum reusability of available services provided by entities (e.g., sensors or Wireless Sensor Networks). Thus, it is of major importance to facilitate the discovery and registration of available services and subsystems in an integrated way. Therefore, an ontology-based and automatic system for subsystem and service registration and discovery is presented. Using this proposed system, heterogeneous subsystems and services could be registered and discovered in a dynamic manner with additional semantic annotations. In this way, users are able to build customized applications across different subsystems by using available services. The proposed system has been fully implemented and a case study is presented to show the usefulness of the proposed method.

  12. GENESIS SciFlo: Choreographing Interoperable Web Services on the Grid using a Semantically-Enabled Dataflow Execution Environment

    Wilson, B. D.; Manipon, G.; Xing, Z.

    2007-12-01

    Access Protocol (OpenDAP) servers. SciFlo also publishes its own SOAP services for space/time query and subsetting of Earth Science datasets, and automated access to large datasets via lists of (FTP, HTTP, or DAP) URLs which point to on-line HDF or netCDF files. Typical distributed workflows obtain datasets by calling standard WMS/WCS servers or discovering and fetching data granules from ftp sites; invoke remote analysis operators available as SOAP services (interface described by a WSDL document); and merge results into binary containers (netCDF or HDF files) for further analysis using local executable operators. Naming conventions (HDFEOS and CF-1.0 for netCDF) are exploited to automatically understand and read on-line datasets. More interoperable conventions, and broader adoption of existing conventions, are vital if we are to "scale up" automated choreography of Web Services beyond toy applications. Recently, the ESIP Federation sponsored a collaborative activity in which several ESIP members developed some collaborative science scenarios for atmospheric and aerosol science, and then choreographed services from multiple groups into demonstration workflows using the SciFlo engine and a Business Process Execution Language (BPEL) workflow engine. We will discuss the lessons learned from this activity, the need for standardized interfaces (like WMS/WCS), the difficulty in agreeing on even simple XML formats and interfaces, the benefits of doing collaborative science analysis at the "touch of a button" once services are connected, and further collaborations that are being pursued.

  13. Multilateral Interoperability Programme

    Burita, L.

    2009-01-01

    The Multilateral Interoperability Programme (MIP) is a voluntary and independent activity of the participating nations and organizations within the NATO environment. The MIP concept is based on data exchange in the form of a common exchange data model, to achieve international interoperability among the command and control information systems (C2IS) of tactical units. The article describes the basics of the MIP organization, structure, and planning and testing processes. The core of the MIP solution is the Inf...

  14. Polynomial-Time, Semantically-Secure Encryption Achieving the Secrecy Capacity

    Bellare, Mihir

    2012-01-01

    In the wiretap channel setting, one aims to get information-theoretic privacy of communicated data based only on the assumption that the channel from sender to receiver is noisier than the one from sender to adversary. The secrecy capacity is the optimal (highest possible) rate of a secure scheme, and the existence of schemes achieving it has been shown. For thirty years the ultimate and unreached goal has been to achieve this optimal rate with a scheme that is polynomial-time. (This means both encryption and decryption are proven polynomial time algorithms.) This paper finally delivers such a scheme. In fact it does more. Our scheme not only meets the classical notion of security from the wiretap literature, called MIS-R (mutual information security for random messages) but achieves the strictly stronger notion of semantic security, thus delivering more in terms of security without loss of rate.
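For context on the rate claim above, the secrecy capacity of the (degraded) discrete memoryless wiretap channel has a classical closed form due to Wyner; this is standard background, not taken from the abstract itself:

```latex
% Secrecy capacity of the degraded wiretap channel (Wyner, 1975):
% Y is the legitimate receiver's channel output, Z is the adversary's.
C_s = \max_{p_X} \bigl[ I(X;Y) - I(X;Z) \bigr]
```

A polynomial-time scheme achieving this rate under semantic security is precisely what the paper claims to deliver.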

  15. The Fractal Nature of the Semantic Web

    Berners-Lee, Tim; Massachusetts Institute of Technology; Kagal, Lalana; Massachusetts Institute of Technology

    2008-01-01

    In the past, many knowledge representation systems failed because they were too monolithic and didn’t scale well, whereas other systems failed to have an impact because they were small and isolated. Along with this trade-off in size, there is also a constant tension between the cost involved in building a larger community that can interoperate through common terms and the cost of the lack of interoperability. The semantic web offers a good compromise between these approaches as it achieves wi...

  16. Basic semantic architecture of interoperability for the intelligent distribution in the CFE electrical system; Arquitectura base de interoperabilidad semantica para el sistema electrico de distribucion inteligente en la CFE

    Espinosa Reza, Alfredo; Garcia Mendoza, Raul; Borja Diaz, Jesus Fidel; Sierra Rodriguez, Benjamin [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2010-07-01

    The physical and logical architecture of the interoperability platform defined for the distribution management systems (DMS) of the Distribution Subdivision of the Comision Federal de Electricidad (CFE) in Mexico is presented. The adopted architecture includes the definition of a technological platform to manage the exchange of information between systems and applications, based on the Common Information Model (CIM) established in standards IEC 61968 and IEC 61970. The architecture, based on SSOA (Semantic Services Oriented Architecture), an EIB (Enterprise Integration Bus) and GID (Generic Interface Definition), is presented, as well as the sequence for achieving interoperability of the systems related to the management of electrical energy distribution in Mexico. Likewise, the process of establishing a Semantic Model of the Electrical Distribution System (SED) and creating CIM/XML instances is described, oriented to the interoperability of the information systems in the DMS scope, by means of the interchange of messages conformant to, and validated against, the structure established by the rules of the CIM model. In this way, the messages and the information interchanged among systems are assured compatibility and correct interpretation independently of the developer, brand, or manufacturer of the source and destination systems. The primary objective is to establish the base semantic interoperability infrastructure, founded on standards, that sustains the strategic definition of an Electrical System of Intelligent Distribution (SEDI) in Mexico.

  17. Multilateral Interoperability Programme

    L. Burita

    2009-12-01

    Full Text Available The Multilateral Interoperability Programme (MIP) is a voluntary and independent activity of the participating nations and organizations within the NATO environment. The MIP concept is based on data exchange in the form of a common exchange data model, to achieve international interoperability among the command and control information systems (C2IS) of tactical units. The article describes the basics of the MIP organization, structure, and planning and testing processes. The core of the MIP solution is the Information Exchange Data Model (IEDM). The Czech Armed Forces (CAF) MIP implementation process is mentioned. The MIP example is a part of the university education process.

  18. Benchmarking semantic web technology

    García-Castro, R

    2009-01-01

    This book addresses the problem of benchmarking Semantic Web technologies: first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other. The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:

  19. Semantic Web

    Anna Lamandini

    2011-01-01

    The Semantic Web is a technology at the service of knowledge, aimed at accessibility and the sharing of content and facilitating interoperability between different systems; as such, it is one of the nine key technological pillars of TIC (technologies for information and communication) within the third theme, programme-specific cooperation, of the seventh framework programme for research and development (7°PQRS, 2007-2013). As a system it seeks to overcome the overload or excess of irrelevant i...

  20. A Semantically Automated Protocol Adapter for Mapping SOAP Web Services to RESTful HTTP Format to Enable the Web Infrastructure, Enhance Web Service Interoperability and Ease Web Service Migration

    Frank Doheny

    2012-04-01

    Full Text Available Semantic Web Services (SWS) are Web Service (WS) descriptions augmented with semantic information. SWS enable intelligent reasoning and automation in areas such as service discovery, composition, mediation, ranking and invocation. This paper applies SWS to a previous protocol adapter which, operating within clearly defined constraints, maps SOAP Web Services to RESTful HTTP format. However, in the previous adapter, the configuration element is manual and the latency implications are locally based. This paper applies SWS technologies to automate the configuration element, and the latency tests are conducted in a more realistic Internet-based setting.
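The core of such an adapter is a configuration table mapping each SOAP operation to an equivalent HTTP method and resource path; automating the construction of this table is what the SWS annotations enable. The operations and paths below are hypothetical examples, not taken from the paper.

```python
# Toy version of the adapter's mapping layer: SOAP operation name ->
# (HTTP method, RESTful path template). In the paper this table is the
# "configuration element" that SWS technologies derive automatically.
SOAP_TO_REST = {
    "GetPatient":    ("GET",    "/patients/{id}"),
    "CreatePatient": ("POST",   "/patients"),
    "DeletePatient": ("DELETE", "/patients/{id}"),
}

def adapt(operation, **params):
    """Translate a SOAP operation call into an HTTP method and concrete path."""
    method, template = SOAP_TO_REST[operation]
    return method, template.format(**params)

print(adapt("GetPatient", id=42))  # ('GET', '/patients/42')
```

A request body transformation (SOAP envelope to JSON, say) would sit alongside this routing table in a full adapter.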

  1. Semantic based P2P System for local e-Government

    Ortiz-Rodriguez, F.; Palma, R.; Villazón-Terrazas, B.

    2006-01-01

    The Electronic Government is an emerging field of applications for the Semantic Web, where ontologies are becoming an important research technology. E-Government faces considerable challenges in achieving interoperability, given the semantic differences of interpretation and the complexity and width of its scope. This paper addresses the importance of providing an infrastructure capable of dealing with issues such as: communications between public administrations across government and retrieval of offici...

  2. Lemnos Interoperable Security Program

    Stewart, John [Tennessee Valley Authority, Knoxville, TN (United States); Halbgewachs, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chavez, Adrian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Rhett [Schweitzer Engineering Laboratories, Chattanooga, TN (United States); Teumim, David [Teumim Technical, Allentown, PA (United States)

    2012-01-31

    The manner in which control systems are designed and operated in the energy sector is undergoing some of the most significant changes in history, due to the evolution of technology and the increasing number of interconnections to other systems. With these changes, however, come two significant challenges that the energy sector must face: 1) cyber security is more important than ever before, and 2) cyber security is more complicated than ever before. A key requirement in helping utilities and vendors alike to meet these challenges is interoperability. While interoperability has been present in much of the discussion relating to technology utilized within the energy sector, and especially the Smart Grid, it has been absent in the context of cyber security. The Lemnos project addresses these challenges by focusing on the interoperability of devices utilized within utility control systems which support critical cyber security functions. In theory, interoperability is possible with many of the cyber security solutions available to utilities today. The reality is that the effort required to achieve cyber security interoperability is often a barrier for utilities. For example, consider IPSec, a widely-used Internet Protocol suite for defining Virtual Private Networks, or "tunnels", to communicate securely through untrusted public and private networks. The IPSec protocol suite has a significant number of configuration options and encryption parameters to choose from, which must be agreed upon and adopted by both parties establishing the tunnel. The exercise of getting software or devices from different vendors to interoperate is labor intensive and requires a significant amount of security expertise by the end user. Scale this effort to a significant number of devices operating over a large geographical area and the challenge becomes so overwhelming that it often leads utilities to pursue solutions from a single vendor. These single-vendor solutions may inadvertently lock
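The configuration-agreement problem described above can be made concrete with a small sketch: two tunnel endpoints only interoperate if every negotiated parameter matches. The parameter names and values below are representative examples, not an exhaustive or authoritative IPSec parameter list.

```python
# Hypothetical proposals from two vendors' devices trying to establish a tunnel.
endpoint_a = {"ike_version": 2, "encryption": "aes-256-cbc",
              "integrity": "sha256", "dh_group": 14, "lifetime_s": 3600}
endpoint_b = {"ike_version": 2, "encryption": "aes-128-cbc",
              "integrity": "sha256", "dh_group": 14, "lifetime_s": 28800}

def mismatches(a, b):
    """Report the parameters on which the two endpoints would fail to agree."""
    return {k: (a[k], b[k]) for k in a if a[k] != b[k]}

print(mismatches(endpoint_a, endpoint_b))
```

Even this tiny example leaves two parameters to reconcile by hand; multiplied across many devices and vendors, that reconciliation effort is the interoperability barrier Lemnos targets.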

  3. SEMANTIC INTEGRATION FOR AUTOMATIC ONTOLOGY MAPPING

    Siham AMROUCH

    2013-11-01

    Full Text Available In the last decade, ontologies have played a key technology role for information sharing and agent interoperability in different application domains. In the Semantic Web domain, ontologies are efficiently used to face the great challenge of representing the semantics of data, in order to bring the actual Web to its full power and, hence, achieve its objective. However, using ontologies as common and shared vocabularies requires a certain degree of interoperability between them. To confront this requirement, mapping ontologies is a solution that cannot be avoided. Indeed, ontology mapping builds a meta layer that allows different applications and information systems to access and share their information, of course, after resolving the different forms of syntactic, semantic and lexical mismatches. In the contribution presented in this paper, we have integrated the semantic aspect based on an external lexical resource, WordNet, to design a new algorithm for fully automatic ontology mapping. This fully automatic character is the main difference between our contribution and most of the existing semi-automatic ontology mapping algorithms, such as Chimaera, Prompt, Onion, Glue, etc. To further enhance the performance of our algorithm, the mapping discovery stage is based on the combination of two sub-modules: the former analyses the concepts' names and the latter analyses their properties. Each of these two sub-modules is itself based on the combination of lexical and semantic similarity measures.
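The combination of lexical and semantic similarity measures described above can be sketched as a weighted score. The synonym table here is a toy stand-in for WordNet, and the weights are arbitrary; the paper's actual measures and combination are not specified in the abstract.

```python
from difflib import SequenceMatcher

# Toy stand-in for a WordNet synonym lookup (illustrative pairs only).
SYNONYMS = {("car", "automobile"), ("person", "human")}

def lexical_sim(a: str, b: str) -> float:
    """String-level similarity of two concept names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def semantic_sim(a: str, b: str) -> float:
    """1.0 if the names are equal or listed as synonyms, else 0.0."""
    a, b = a.lower(), b.lower()
    return 1.0 if a == b or (a, b) in SYNONYMS or (b, a) in SYNONYMS else 0.0

def concept_similarity(a: str, b: str, w_lex: float = 0.4, w_sem: float = 0.6) -> float:
    """Weighted combination of the lexical and semantic measures."""
    return w_lex * lexical_sim(a, b) + w_sem * semantic_sim(a, b)

# 'car' and 'automobile' differ lexically but match via the lexical resource,
# so they outscore the lexically close but unrelated 'cart'.
print(concept_similarity("car", "automobile") > concept_similarity("car", "cart"))
```

A name-based sub-module and a property-based sub-module would each apply such a combined measure, with the mapping decision aggregating both.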

  4. Web Feature Service Semantic Mediation

    Hobona, G.; Bermudez, L. E.; Brackin, R.; Percivall, G. S.

    2012-12-01

    Scientists from different organizations and disciplines need to work together to find the solutions to complex problems. Multi-disciplinary science typically involves users with specialized tools and their own preferred view of the data including unique characteristics of the user's information model and symbology. Even though organizations use web services to expose data, there are still semantic inconsistencies that need to be solved. Recent activities within the OGC Interoperability Program (IP) have helped advance semantic mediation solutions when using OGC services to help solve complex problems. The OGC standards development process is influenced by the feedback of activities within the Interoperability Program, which conducts international interoperability initiatives such as Testbeds, Pilot Projects, Interoperability Experiments, and Interoperability Support Services. These activities are designed to encourage rapid development, testing, validation, demonstration and adoption of open, consensus based standards and best practices. Two recent Testbeds, the OGC Web Services Phase 8 and Phase 9, have advanced the use of semantic mediation approaches to increase semantic interoperability among geospatial communities. The Cross-Community Interoperability (CCI) thread within these two testbeds, advanced semantic mediation approaches for data discovery, access and use of heterogeneous data models and heterogeneous metadata models. This presentation will provide an overview of the interoperability program, the CCI Thread and will explain the methodology to mediate heterogeneous GML Application Profiles served via WFS, including discovery of services via a catalog standard interface and mediating symbology applicable to each application profile.

  5. Web-of-Objects Based User-Centric Semantic Service Composition Methodology in the Internet of Things

    Safina Showkat Ara; Zia Ush Shamszaman; Ilyoung Chong

    2014-01-01

    The general goal of the Web-of-Objects (WoO) is to simplify object and application deployment, maintenance, and operation of IoT infrastructures. WoO also aims to provide user-centric IoT services by enabling object virtualization and semantic-ontology-based service composition. In WoO, semantic modeling of objects plays a distinguished role in achieving interoperability of devices and services through a semantic ontology model. In this paper, we propose a semantic functional module for user centri...

  6. HeartDrive: A Broader Concept of Interoperability to Implement Care Processes for Heart Failure.

    Lettere, M; Guerri, D; La Manna, S; Groccia, M C; Lofaro, D; Conforti, D

    2016-01-01

    This paper originates from the HeartDrive project, a platform of services for a more effective, efficient and integrated management of heart failure and comorbidities. HeartDrive establishes a cooperative approach based on the concepts of continuity of care and extreme, patient-oriented customization of diagnostic, therapeutic and follow-up procedures. Definition and development of evidence-based processes, migration from parceled and episode-based healthcare provisioning to a workflow-oriented model, and increased awareness and responsibility of citizens towards their own health and wellness are key objectives of HeartDrive. In two scenarios, rehabilitation and home monitoring, we show how the results are achieved by providing a solution that highlights a broader concept of cooperation that goes beyond technical interoperability towards semantic interoperability, explicitly sharing process definitions, decision support strategies and information semantics. PMID:27225572

  7. Towards Interoperability for Public Health Surveillance: Experiences from Two States

    Dixon, Brian E.; Siegel, Jason A.; Oemig, Tanya V.; Grannis, Shaun J

    2013-01-01

    Objective To characterize the use of standardized vocabularies in real-world electronic laboratory reporting (ELR) messages sent to public health agencies for surveillance. Introduction The use of health information systems to electronically deliver clinical data necessary for notifiable disease surveillance is growing. For health information systems to be effective at improving population surveillance functions, semantic interoperability is necessary. Semantic interoperability is “the abilit...

  8. Interoperability driven integration of biomedical data sources

    Teodoro, Douglas Henrique; Choquet, Rémy; Schober, Daniel; Mels, Giovanni; Pasche, Emilie; Ruch, Patrick; Lovis, Christian

    2011-01-01

    In this paper, we introduce a data integration methodology that promotes technical, syntactic and semantic interoperability for operational healthcare data sources. ETL processes provide access to different operational databases at the technical level. Furthermore, data instances have their syntax aligned according to biomedical terminologies using natural language processing. Finally, semantic web technologies are used to ensure common meaning and to provide ubiquitous access to the data. The...
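    The three interoperability levels this abstract names (technical access, syntactic alignment, semantic layer) can be sketched as a toy pipeline. This is illustrative only: the terminology table and record fields are hypothetical, not actual biomedical codes.

```python
# Sketch of the three-level pipeline: (1) technical level — extract source
# records, (2) syntactic level — align local terms to a standard terminology,
# (3) semantic level — emit uniform subject-predicate-object triples.
TERMINOLOGY = {"glyc. hem.": "HbA1c", "hba1c": "HbA1c"}  # hypothetical mapping

def extract(rows):
    """Technical level: read records out of the operational source."""
    return [dict(r) for r in rows]

def align(record):
    """Syntactic level: normalize the local test name to the shared code."""
    local = record["test"].strip().lower()
    record["test"] = TERMINOLOGY.get(local, record["test"])
    return record

def to_triples(record):
    """Semantic level: expose the record as triples with common meaning."""
    s = f"patient:{record['patient_id']}"
    return [(s, "hasTest", record["test"]), (s, "hasValue", record["value"])]

rows = [{"patient_id": 1, "test": "Glyc. Hem.", "value": 6.5}]
triples = [t for r in extract(rows) for t in to_triples(align(r))]
```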

  9. An Interoperability Infrastructure for Developing Multidatabase Systems

    Doğaç, Asuman; Özhan, Gökhan; Kılıç, Ebru; Özcan, Fatma; Nural, Sena; Sema

    1998-01-01

    A multidatabase system (MDBS) allows users to simultaneously access autonomous, heterogeneous databases using a single data model and query language, thereby achieving interoperability among heterogeneous, federated DBMSs. In this paper, we describe the interoperability infrastructure of a multidatabase system, namely the METU Interoperable DBMS (MIND). The architecture of MIND is based on the OMG distributed object management model. It is implemented on top of a CORBA compl...

  10. SOF and conventional force interoperability through SOF reconfiguration

    McHale, Edward J.

    1996-01-01

    The goal of this thesis was to determine which environmental variables affected past SOF attempts at achieving interoperability with the conventional military, to examine the status of SOF and conventional forces interoperability as it exists today, and to explain why now is the time for SOF to engage in the reconfiguration of its forces to achieve an optimal level of interoperability. Five variables were used in the examination of SOF's organizational evolution toward interoperability with conven...

  11. A methodology for the development of software agent based interoperable telemedicine systems: a tele-electrocardiography perspective.

    Ganguly, P; Ray, P

    2000-01-01

    Telemedicine involves the integration of information, human-machine, and healthcare technologies. Because different modalities of patient care require applications running on heterogeneous computing environment, software interoperability is a major issue in telemedicine. Software agent technology provides a range of promising techniques to solve this problem. This article discusses the development of a methodology for the design of interoperable telemedicine systems (illustrated with a tele-electrocardiography application). Software interoperability between different applications can be modeled at different levels of abstraction such as physical interoperability, data-type interoperability, specification-level interoperability, and semantic interoperability. Software agents address the issue of software interoperability at semantic level. A popular object-oriented software development methodology - unified modeling language (UML) - has been used for this development. This research has demonstrated the feasibility of the development of agent-based interoperable telemedicine systems. More research is needed before widespread deployment of such systems can take place. PMID:10957742

  12. A Framework of Semantic Information Representation in Distributed Environments

    2006-01-01

    An information representation framework is designed in this paper to overcome the problem of semantic heterogeneity in distributed environments. Emphasis is placed on establishing an XML-oriented semantic data model and the mapping between XML data based on a global ontology semantic view. The framework is implemented as a Web Service, which enhances information-processing efficiency and accuracy as well as semantic interoperability.
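    A minimal sketch of mapping heterogeneous XML data onto a global ontology view might look like the following. The tag names and ontology terms are hypothetical, and the actual framework operates over full schemas rather than single elements.

```python
# Sketch: two local XML vocabularies are lifted into one global ontology
# view so that both sources answer the same query. The local-tag-to-
# ontology-term table below is a hypothetical global semantic view.
import xml.etree.ElementTree as ET

GLOBAL_VIEW = {"author": "creator", "writer": "creator"}  # local tag -> global term

def to_global(xml_text):
    """Parse one record and rename its elements per the global view."""
    root = ET.fromstring(xml_text)
    return {GLOBAL_VIEW.get(child.tag, child.tag): child.text for child in root}

# Two sources with different local vocabularies map to the same view.
a = to_global("<book><author>Ann</author></book>")
b = to_global("<book><writer>Bob</writer></book>")
```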

  13. SomeRDFS in the Semantic Web

    Adjiman, Philippe; Goasdoué, François; Rousset, Marie-Christine

    2006-01-01

    The Semantic Web envisions a world-wide distributed architecture where computational resources will easily inter-operate to coordinate complex tasks such as query answering. Semantic marking up of web resources using ontologies is expected to provide the necessary glue for making this vision work. Using ontology languages, (communities of) users will build their own ontologies in order to describe their own data. Adding semantic mappings between those ontologies, in order to semantically rela...

  14. Combining Archetypes with Fast Health Interoperability Resources in Future-proof Health Information Systems.

    Bosca, Diego; Moner, David; Maldonado, Jose Alberto; Robles, Montserrat

    2015-01-01

    Messaging standards, and specifically HL7 v2, are heavily used for the communication and interoperability of Health Information Systems. HL7 FHIR was created as an evolution of the messaging standards to achieve semantic interoperability. FHIR is somehow similar to other approaches like the dual model methodology as both are based on the precise modeling of clinical information. In this paper, we demonstrate how we can apply the dual model methodology to standards like FHIR. We show the usefulness of this approach for data transformation between FHIR and other specifications such as HL7 CDA, EN ISO 13606, and openEHR. We also discuss the advantages and disadvantages of defining archetypes over FHIR, and the consequences and outcomes of this approach. Finally, we exemplify this approach by creating a testing data server that supports both FHIR resources and archetypes. PMID:25991126
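    The dual-model idea of separating clinical content from its concrete serialization can be illustrated with a toy transformation. The structures below are simplified stand-ins, not real FHIR resources or openEHR archetypes.

```python
# Sketch: a (simplified, hypothetical) FHIR-style Observation is flattened
# into archetype-like paths, so the same clinical content could then be
# re-serialized for another specification (CDA, EN ISO 13606, openEHR).
def flatten(resource, prefix=""):
    """Recursively turn a nested resource into path -> value entries."""
    paths = {}
    for key, value in resource.items():
        path = f"{prefix}/{key}"
        if isinstance(value, dict):
            paths.update(flatten(value, path))
        else:
            paths[path] = value
    return paths

observation = {"code": {"text": "Heart rate"}, "value": 72}  # toy resource
paths = flatten(observation)
```

    The path-based form plays the role of the common clinical model; each target standard would supply its own serializer over these paths.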

  15. Controlled Vocabularies, Mini Ontologies and Interoperability (Invited)

    King, T. A.; Walker, R. J.; Roberts, D.; Thieman, J.; Ritschel, B.; Cecconi, B.; Genot, V. N.

    2013-12-01

    Interoperability has been an elusive goal, but in recent years advances have been made using controlled vocabularies, mini-ontologies and a lot of collaboration. This has led to increased interoperability between disciplines in the U.S. and between international projects. We discuss the successful pattern followed by SPASE, IVOA and IPDA to achieve this new level of international interoperability. A key aspect of the pattern is open standards and open participation with interoperability achieved with shared services, public APIs, standard formats and open access to data. Many of these standards are expressed as controlled vocabularies and mini ontologies. To illustrate the pattern we look at SPASE related efforts and participation of North America's Heliophysics Data Environment and CDPP; Europe's Cluster Active Archive, IMPEx, EuroPlanet, ESPAS and HELIO; and Japan's magnetospheric missions. Each participating project has its own life cycle and successful standards development must always take this into account. A major challenge for sustained collaboration and interoperability is the limited lifespan of many of the participating projects. Innovative approaches and new tools and frameworks are often developed as competitively selected, limited term projects, but for sustainable interoperability successful approaches need to become part of a long term infrastructure. This is being encouraged and achieved in many domains and we are entering a golden age of interoperability.

  16. A Semantics-Based Approach for Achieving Self Fault-Tolerance of Protocols

    李腊元; 李春林

    2000-01-01

    The cooperation of different processes may be lost by mistake when a protocol is executed, and the protocol cannot operate normally under this condition. In this paper, the self fault-tolerance of protocols is discussed, and a semantics-based approach for achieving self fault-tolerance of protocols is presented. Some main characteristics of self fault-tolerance of protocols concerning liveness, nontermination and infinity are also presented. Meanwhile, the sufficient and necessary conditions for achieving self fault-tolerance of protocols are given. Finally, a typical protocol that does not satisfy self fault-tolerance is investigated, and a redesigned version of this existing protocol using the proposed approach is given.

  17. Matchmaking Semantic Based for Information System Interoperability

    Wicaksana, I Wayan Simri

    2011-01-01

    Unlike the traditional model of information pull, matchmaking is based on a cooperative partnership between information providers and consumers, assisted by an intelligent facilitator (the matchmaker). According to some experiments, matchmaking proves most useful in two different ways: locating information sources or services that appear dynamically, and notification of information changes. Effective information and service sharing in distributed, e.g. P2P-based, environments raises many challenges, including discovery and localization of resources, exchange over heterogeneous sources, and query processing. One traditional approach for dealing with some of the above challenges is to create unified integrated schemas or services to combine the heterogeneous sources. This approach does not scale well when applied in dynamic distributed environments and has many drawbacks related to the large number of sources. The main issues in matchmaking are how to represent advertisements and requests, and how to calculate poss...
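    A minimal matchmaker along these lines holds provider advertisements and ranks them against a consumer request. This is a sketch under simplifying assumptions: the provider names are hypothetical, and real systems represent advertisements with far richer service descriptions than flat capability sets.

```python
# Sketch of a matchmaker: providers advertise capabilities, a consumer
# submits a request, and providers are ranked by capability overlap.
class Matchmaker:
    def __init__(self):
        self.ads = {}                        # provider -> set of capabilities

    def advertise(self, provider, capabilities):
        """Register (or update) a provider's advertisement."""
        self.ads[provider] = set(capabilities)

    def match(self, requested):
        """Return providers covering part of the request, best first."""
        requested = set(requested)
        scored = [(len(caps & requested) / len(requested), p)
                  for p, caps in self.ads.items()]
        return [p for score, p in sorted(scored, reverse=True) if score > 0]

mm = Matchmaker()
mm.advertise("geo-service", ["maps", "routing"])      # hypothetical providers
mm.advertise("weather-service", ["forecast"])
```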

  18. A Semantic Web Service and Simulation Framework to Intelligent Distributed Manufacturing

    Son, Young Jun [University of Arizona; Kulvatunyou, Boonserm [ORNL; Cho, Hyunbo [POSTECH University, South Korea; Feng, Shaw [National Institute of Standards and Technology (NIST)

    2005-11-01

    To cope with today's fluctuating markets, a virtual enterprise (VE) concept can be employed to achieve the cooperation among independently operating enterprises. The success of VE depends on reliable interoperation among trading partners. This paper proposes a framework based on semantic web of manufacturing and simulation services to enable business and engineering collaborations between VE partners, particularly a design house and manufacturing suppliers.

  19. Rationale and design considerations for a semantic mediator in health information systems.

    Degoulet, P; Sauquet, D; Jaulent, M C; Zapletal, E; Lavril, M

    1998-11-01

    Rapid development of community health information networks raises the issue of semantic interoperability between distributed and heterogeneous systems. Indeed, operational health information systems originate from heterogeneous teams of independent developers and have to cooperate in order to exchange data and services. A good cooperation is based on a good understanding of the messages exchanged between the systems. The main issue of semantic interoperability is to ensure that the exchange is not only possible but also meaningful. The main objective of this paper is to analyze semantic interoperability from a software engineering point of view. It describes the principles for the design of a semantic mediator (SM) in the framework of a distributed object manager (DOM). The mediator is itself a component that should allow the exchange of messages independently of languages and platforms. The functional architecture of such a SM is detailed. These principles have been partly applied in the context of the HELIOS object-oriented software engineering environment. The resulting service components are presented with their current state of achievement. PMID:9865050

  20. Intelligent interoperable application for employment exchange system using ontology

    Kavidha Ayechetty

    2013-12-01

    Full Text Available Semantic web technologies have the potential to simplify heterogeneous data integration using explicit semantics. This paper proposes a framework for building an intelligent interoperable application for an employment exchange system by collaboration among distributed heterogeneous data models using semantic web technologies. The objective of developing the application with semantic technologies is to provide better inference for queries against a dynamic collection of information in the collaborating data models. The employment exchange system provides an interface for users to register their details, thereby managing the knowledge base dynamically. The semantic server transforms queries from employers and jobseekers semantically for possible integration of the two heterogeneous data models to drive intelligent inference. The semantic agent reconciles the syntactic and semantic conflicts that exist among the contributing ontologies at different granularity levels, performs automatic integration of the two source ontologies, and gives a better response to the user. The benefits of building an interoperable application using the semantic web are data sharing, knowledge reuse, better query responses, independent maintenance of the models, and extensibility of the application with extra features.

  1. Governance of Interoperability in Intergovernmental Services - Towards an Empirical Taxonomy

    Herbert Kubicek

    2008-12-01

    Full Text Available High-quality and comfortable online delivery of governmental services often requires the seamless exchange of data between two or more government agencies. Smooth data exchange, in turn, requires interoperability of the databases and workflows in the agencies involved. Interoperability (IOP) is a complex issue covering purely technical aspects such as transmission protocols and data exchange formats, but also content-related semantic aspects such as identifiers and the meaning of codes, as well as organizational, contractual or legal issues. Starting from IOP frameworks which provide classifications of what has to be standardized, this paper, based on an ongoing research project, adopts a political and managerial view and tries to clarify the governance of achieving IOP, i.e. where and by whom IOP standards are developed and established and how they are put into operation. By analyzing 32 cases of successful implementation of IOP in e-government services within the European Union, empirical indicators for different aspects of governance are proposed and applied to develop an empirical taxonomy of different types of IOP governance, which can be used for future comparative research regarding success factors, barriers, etc.

  2. Towards a contract-based interoperation model

    Fernández Peña, Félix Oscar; Willmott, Steven Nicolás

    2007-01-01

    Web Services-based solutions for interoperating processes are considered one of the most promising technologies for achieving truly interoperable functioning in open environments. In the last three years, the specification of agreements between resource/service providers and consumers, as well as protocols for their negotiation, have been proposed as a possible solution for managing the resulting computing systems. In this report, the state of the art in the area of contr...

  3. Semantic Description of Web Services

    Thabet Slimani

    2013-01-01

    The tasks of semantic web services (discovery, selection, composition, and execution) are supposed to enable seamless interoperation between systems, with human intervention kept to a minimum. In the field of Web service description research, the exploitation of service descriptions through semantics provides better support for the life-cycle of Web services. The large number of developed ontologies, representation languages, and integrated frameworks supporting the discovery, compos...

  4. Standards-based data interoperability in the climate sciences

    Woolf, Andrew; Cramer, Ray; Gutierrez, Marta; Kleese van Dam, Kerstin; Kondapalli, Siva; Latham, Susan; Lawrence, Bryan; Lowry, Roy; O'Neill, Kevin

    2005-03-01

    Emerging developments in geographic information systems and distributed computing offer a roadmap towards an unprecedented spatial data infrastructure in the climate sciences. Key to this are the standards developments for digital geographic information being led by the International Organisation for Standardisation (ISO) technical committee on geographic information/geomatics (TC211) and the Open Geospatial Consortium (OGC). These, coupled with the evolution of standardised web services for applications on the internet by the World Wide Web Consortium (W3C), mean that opportunities for both new applications and increased interoperability exist. These are exemplified by the ability to construct ISO-compliant data models that expose legacy data sources through OGC web services. This paper concentrates on the applicability of these standards to climate data by introducing some examples and outlining the challenges ahead. An abstract data model is developed, based on ISO standards, and applied to a range of climate data both observational and modelled. An OGC Web Map Server interface is constructed for numerical weather prediction (NWP) data stored in legacy data files. A W3C web service for remotely accessing gridded climate data is illustrated. Challenges identified include the following: first, both the ISO and OGC specifications require extensions to support climate data. Secondly, OGC services need to fully comply with W3C web services, and support complex access control. Finally, to achieve real interoperability, broadly accepted community-based semantic data models are required across the range of climate data types. These challenges are being actively pursued, and broad data interoperability for the climate sciences appears within reach.
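    As an illustration of the standards-based access the paper describes, an OGC Web Map Server GetMap request can be assembled from the required parameters of WMS 1.3.0. The endpoint and layer name below are hypothetical.

```python
# Sketch: build a WMS 1.3.0 GetMap URL from its required parameters
# (SERVICE, VERSION, REQUEST, LAYERS, STYLES, CRS, BBOX, WIDTH, HEIGHT,
# FORMAT). The server endpoint and layer name are made up for the example.
from urllib.parse import urlencode

def getmap_url(endpoint, layer, bbox, width=256, height=256):
    """Return a GetMap request URL for one layer over the given bbox."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": "",
        "CRS": "EPSG:4326", "BBOX": ",".join(map(str, bbox)),
        "WIDTH": width, "HEIGHT": height, "FORMAT": "image/png",
    }
    return f"{endpoint}?{urlencode(params)}"

url = getmap_url("https://example.org/wms", "nwp_temperature", (-90, -180, 90, 180))
```

    Exposing legacy NWP files through such an interface is exactly the kind of ISO/OGC-compliant wrapping of legacy data sources the abstract argues for.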

  5. The Semantic SPASE

    Hughes, S.; Crichton, D.; Thieman, J.; Ramirez, P.; King, T.; Weiss, M.

    2005-12-01

    The Semantic SPASE (Space Physics Archive Search and Extract) prototype demonstrates the use of semantic web technologies to capture, document, and manage the SPASE data model, support facet- and text-based search, and provide flexible and intuitive user interfaces. The SPASE data model, under development since late 2003 by a consortium of space physics domain experts, is intended to serve as the basis for interoperability between independent data systems. To develop the Semantic SPASE prototype, the data model was first analyzed to determine the inherit object classes and their attributes. These were entered into Stanford Medical Informatics' Protege ontology tool and annotated using definitions from the SPASE documentation. Further analysis of the data model resulted in the addition of class relationships. Finally attributes and relationships that support broad-scope interoperability were added from research associated with the Object-Oriented Data Technology task. To validate the ontology and produce a knowledge base, example data products were ingested. The capture of the data model as an ontology results in a more formal specification of the model. The Protege software is also a powerful management tool and supports plug-ins that produce several graphical notations as output. The stated purpose of the semantic web is to support machine understanding of web-based information. Protege provides an export capability to RDF/XML and RDFS/XML for this purpose. Several research efforts use RDF/XML knowledge bases to provide semantic search. MIT's Simile/Longwell project provides both facet- and text-based search using a suite of metadata browsers and the text-based search engine Lucene. Using the Protege generated RDF knowledge-base a semantic search application was easily built and deployed to run as a web application. Configuration files specify the object attributes and values to be designated as facets (i.e. search) constraints. 
Semantic web technologies provide

  6. Supporting spatial data harmonization process with the use of ontologies and Semantic Web technologies

    Strzelecki, M.; Iwaniak, A.; Łukowicz, J.; Kaczmarek, I.

    2013-10-01

    Nowadays, spatial information is used not only by professionals, but also by ordinary citizens in their daily activities. The Open Data initiative states that data should be freely and unreservedly available to all users; this also applies to spatial data. As spatial data becomes widely available, it is essential to publish it in a form which guarantees the possibility of integrating it with other, heterogeneous data sources. Interoperability is the possibility of combining spatial data sets from different sources in a consistent way, as well as providing access to them. Providing syntactic interoperability based on well-known data formats is relatively simple, unlike providing semantic interoperability, due to the multiple possible interpretations of data. One of the issues connected with the problem of achieving interoperability is data harmonization. It is the process of providing access to spatial data in a representation that allows combining it with other harmonized data in a coherent way, using a common set of data product specifications. Spatial data harmonization is performed by creating definitions of reclassification and transformation rules (a mapping schema) for the source application schema. Creation of these rules is a very demanding task which requires wide domain knowledge and a detailed look into application schemas. The paper focuses on proposing methods for supporting the data harmonization process through automated or supervised creation of mapping schemas with the use of ontologies, ontology matching methods and Semantic Web technologies.
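    The reclassification and transformation rules that make up a mapping schema can be sketched as simple data-driven transforms. The rules and attribute names below are hypothetical examples, not drawn from any real data product specification.

```python
# Sketch of applying a mapping schema during harmonization: each rule
# renames a source attribute to its target name and optionally
# reclassifies its values onto the target code list.
RULES = [
    {"source": "rd_type", "target": "roadClass",
     "reclassify": {"hwy": "highway", "loc": "localRoad"}},  # hypothetical codes
    {"source": "name", "target": "roadName", "reclassify": None},
]

def harmonize(record, rules=RULES):
    """Transform one source record into the target representation."""
    out = {}
    for rule in rules:
        if rule["source"] in record:
            value = record[rule["source"]]
            if rule["reclassify"]:
                value = rule["reclassify"].get(value, value)
            out[rule["target"]] = value
    return out

harmonized = harmonize({"rd_type": "hwy", "name": "A4"})
```

    Authoring such rule sets by hand is the demanding step the paper targets; ontology matching would propose the source-to-target correspondences automatically or semi-automatically.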

  7. Towards Semantic e-Science for Traditional Chinese Medicine

    Zhou Chunying

    2007-05-01

    Full Text Available Abstract Background Recent advances in Web and information technologies, together with the increasing decentralization of organizational structures, have resulted in massive amounts of information resources and domain-specific services in Traditional Chinese Medicine (TCM). The massive volume and diversity of available information and services have made it difficult to achieve seamless and interoperable e-Science for knowledge-intensive disciplines like TCM. Therefore, information integration and service coordination are two major challenges in e-Science for TCM. We still lack sophisticated approaches to integrate scientific data and services for TCM e-Science. Results We present a comprehensive approach to building dynamic and extendable e-Science applications for knowledge-intensive disciplines like TCM based on semantic and knowledge-based techniques. The semantic e-Science infrastructure for TCM supports large-scale database integration and service coordination in a virtual organization. We use domain ontologies to integrate TCM database resources and services in a semantic cyberspace and deliver a semantically superior experience, including browsing, searching, querying and knowledge discovery, to users. We have developed a collection of semantic-based toolkits to facilitate information sharing and collaborative research among TCM scientists and researchers. Conclusion Semantic and knowledge-based techniques are suitable for knowledge-intensive disciplines like TCM, and it is possible to build an on-demand e-Science system for TCM based on existing semantic and knowledge-based techniques. The approach presented in this paper integrates heterogeneous distributed TCM databases and services, and provides scientists with a semantically superior experience to support collaborative research in the TCM discipline.

  8. Interoperability for electronic ID

    Zygadlo, Zuzanna

    2009-01-01

    Electronic business, including eBanking, eCommerce and eGovernmental services, is today based on a large variety of security solutions, comprising electronic IDs provided by a broad community of Public Key Infrastructure (PKI) vendors. Significant differences in the implementations of those solutions introduce a lack of interoperability in electronic business, which has not yet been resolved by standardization and interoperability initiatives based on existing PKI trust models. It i...

  9. Role of semantics in Autonomic and Adaptive Web Services & Processes

    Sheth, Amit P.

    2007-01-01

    The emergence of Service Oriented Architectures (SOA) has created a new paradigm of loosely coupled distributed systems. In the METEOR-S project, we have studied the comprehensive role of semantics in all stages of the life cycle of services and processes, including annotation, publication, discovery, interoperability/data mediation, and composition. In 2002-2003, we offered a broad framework of semantics consisting of four types: 1) data semantics, 2) functional semantics...

  10. Semantic Web

    Anna Lamandini

    2011-06-01

    Full Text Available The semantic Web is a technology at the service of knowledge, aimed at accessibility and the sharing of content, and facilitating interoperability between different systems; as such it is one of the nine key technological pillars of TIC (technologies for information and communication) within the third theme, specific cooperation programme, of the seventh framework programme for research and development (7°PQRS, 2007-2013). As a system it seeks to overcome the overload or excess of irrelevant information on the Internet, in order to facilitate specific or pertinent research. It is an extension of the existing Web in which the aim is cooperation between computers and people (the dream of Sir Tim Berners-Lee), where machines can give more support to people in integrating and elaborating data in order to obtain inferences and a global sharing of data. It is a technology able to favour the development of a "data web", in other words the creation of a space of interconnected and shared data sets (Linked Data) which allows users to link different types of data coming from different sources. It is a technology that will have a great effect on everyday life since it will permit the planning of "intelligent applications" in various sectors such as education and training, research, the business world, public information, tourism, health, and e-government. It is an innovative technology that activates a social transformation (the socio-semantic Web) on a world level, since it redefines the cognitive universe of users and enables the sharing not only of information but of meaning (collective and connected intelligence).

  11. Buildings Interoperability Landscape

    Hardin, Dave [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stephan, Eric G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wang, Weimin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Corbin, Charles D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Widergren, Steven E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-12-31

    Through its Building Technologies Office (BTO), the United States Department of Energy’s Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO’s mission to enhance energy efficiency and save energy for economic and environmental purposes. For connected buildings ecosystems of products and services from various manufacturers to flourish, the ICT aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.

  12. The caCORE Software Development Kit: Streamlining construction of interoperable biomedical information services

    Warzel Denise

    2006-01-01

    Full Text Available Abstract Background Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs. The National Cancer Institute (NCI developed the cancer common ontologic representation environment (caCORE to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. Results The caCORE SDK requires a Unified Modeling Language (UML tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG program, to create compatible data services. 
caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has

  13. Semantic Matching of Web Services Capabilities

    Paolucci, Massimo; Kawamura, Takahiro; Payne, Terry R.; Sycara, Katia

    2002-01-01

    The Web is moving from being a collection of pages toward a collection of services that interoperate through the Internet. The first step towards this interoperation is the location of other services that can help towards the solution of a problem. In this paper we claim that location of web services should be based on the semantic match between a declarative description of the service being sought, and a description of the service being offered. Furthermore, we claim that this match is outsi...
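The degree-of-match idea described here can be sketched with a toy concept taxonomy. The concept names, and the exact/plugin/subsumes/fail labels, follow one common matchmaker convention and are illustrative assumptions, not the paper's implementation:

```python
# Toy subsumption hierarchy: child -> parent relations. The concepts are
# made up; a real matchmaker would query an ontology reasoner instead.
TAXONOMY = {
    "Sedan": "Car",
    "Car": "Vehicle",
    "Truck": "Vehicle",
}

def ancestors(concept):
    """All concepts that subsume `concept`, nearest first."""
    result = []
    while concept in TAXONOMY:
        concept = TAXONOMY[concept]
        result.append(concept)
    return result

def degree_of_match(requested, advertised):
    """Semantic match between a requested and an advertised concept."""
    if requested == advertised:
        return "exact"
    if advertised in ancestors(requested):
        return "plugin"     # advertised is more general, can stand in
    if requested in ancestors(advertised):
        return "subsumes"   # request is more general than the offer
    return "fail"

print(degree_of_match("Car", "Car"))        # exact
print(degree_of_match("Car", "Vehicle"))    # plugin
print(degree_of_match("Vehicle", "Sedan"))  # subsumes
```

A matchmaker would rank candidate services by the best degree achieved across their inputs and outputs, preferring exact over plugin over subsumes.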

  14. AVO interoperability demonstration

    Genova, Francois; Allen, Mark; Ochsenbein, Francois; Wicenec, Andreas J.; Arviset, Christophe; Micol, Alberto; Mann, Robert G.; Rixon, Guy T.; Didelon, Pierre; Garrington, Simon T.; Richards, Anita M. S.

    2002-12-01

    AVO Work Area 2 consists of deployment and demonstration of an interoperability prototype. Access to archives of all the partners (ESO, ESA, AstroGrid, Terapix, Jodrell Bank) is implemented via the CDS data federation and integration tools: VizieR and Aladin. The prototype is available for science usage and more functionalities, based in particular on the usage of Uniform Content Descriptors (UCDs) for data mining, will be developed. Case by case discussion with data providers will help to establish a set of practical recommendations for interoperability. Science requirements and new technologies studied by the other AVO work Areas will also be tested. Discussions on standards are ongoing among all VO projects.

  15. Semantic Advertising

    Zamanzadeh, Ben; Ashish, Naveen; Ramakrishnan, Cartic; Zimmerman, John

    2013-01-01

    We present the concept of Semantic Advertising which we see as the future of online advertising. Semantic Advertising is online advertising powered by semantic technology which essentially enables us to represent and reason with concepts and the meaning of things. This paper aims to 1) Define semantic advertising, 2) Place it in the context of broader and more widely used concepts such as the Semantic Web and Semantic Search, 3) Provide a survey of work in related areas such as context matchi...

  16. Empowering open systems through cross-platform interoperability

    Lyke, James C.

    2014-06-01

    Most of the motivations for open systems lie in the expectation of interoperability, sometimes referred to as "plug-and-play". Nothing in the notion of "open-ness", however, guarantees this outcome, which makes the increased interest in open architecture more perplexing. In this paper, we explore certain themes of open architecture. We introduce the concept of "windows of interoperability", which can be used to align disparate portions of architecture. Such "windows of interoperability", which concentrate on a reduced set of protocol and interface features, might achieve many of the broader purposes assigned as benefits in open architecture. Since it is possible to engineer proprietary systems that interoperate effectively, this nuanced definition of interoperability may in fact be a more important concept to understand and nurture for effective systems engineering and maintenance.

  17. Interoperability for Global Observation Data by Ontological Information

    Masahiko Nagai; Masafumi Ono; Ryosuke Shibasaki

    2008-01-01

    The Ontology registry system is developed to collect, manage, and compare ontological information for integrating global observation data. Data sharing and data services such as support of metadata design, structuring of data contents, and support of text mining are applied for better use of data as data interoperability. A semantic network dictionary and gazetteers are constructed as a trans-disciplinary dictionary. Ontological information is added to the system by digitalizing text-based dictionaries, developing a "knowledge writing tool" for experts, and extracting semantic relations from authoritative documents with natural language processing techniques. The system is developed to collect lexicographic ontology and geographic ontology.

  18. Semantic modelling of learning objects and instruction

    Pahl, Claus; Melia, Mark

    2006-01-01

    We introduce an ontology-based semantic modelling framework that addresses subject domain modelling, instruction modelling, and interoperability aspects in the development of complex reusable learning objects. Ontologies are knowledge representation frameworks, ideally suited to support knowledge-based modelling of these learning objects. We illustrate the benefits of semantic modelling for learning object assemblies within the context of standards such as SCORM Sequencing and Navigation and ...

  19. Semantic tags for generative multiview product breakdown

    Paviot, Thomas; Cheutet, Vincent; Lamouri, Samir

    2010-01-01

    The interoperability of IT systems that drive engineering and production processes (i.e. Product Data Management and Enterprise Resource Planning systems) is still an issue. The semantic meaning of product information has to be explicit in order to be able to exchange information between these systems. However, the product breakdown activity generates many disconnected product views over which the product semantics is disseminated and mostly implicit. This paper introduces a methodology allow...

  20. Connecting Archaeological Data and Grey Literature via Semantic Cross Search

    Douglas Tudhope

    2011-07-01

    Full Text Available Differing terminology and database structure hinders meaningful cross search of excavation datasets. Matching free text grey literature reports with datasets poses yet more challenges. Conventional search techniques are unable to cross search between archaeological datasets and Web-based grey literature. Results are reported from two AHRC funded research projects that investigated the use of semantic techniques to link digital archive databases, vocabularies and associated grey literature. STAR (Semantic Technologies for Archaeological Resources) was a collaboration between the University of Glamorgan, Hypermedia Research Unit and English Heritage (EH). The main outcome is a research Demonstrator (available online), which cross searches over excavation datasets from different database schemas, including Raunds Roman, Raunds Prehistoric, Museum of London, Silchester Roman and Stanwick sampling. The system additionally cross searches over an extract of excavation reports from the OASIS index of grey literature, operated by the Archaeology Data Service (ADS). A conceptual framework provided by the CIDOC Conceptual Reference Model (CRM) integrates the different database structures and the metadata automatically generated from the OASIS reports by natural language processing techniques. The methods employed for extracting semantic RDF representations from the datasets and the information extraction from grey literature are described. The STELLAR project provides freely available tools to reduce the costs of mapping and extracting data to semantic search systems such as the Demonstrator and to linked data representation generally. Detailed use scenarios (and a screen capture video) provide a basis for a discussion of key issues, including cost-benefits, ontology modelling, mapping, terminology control, semantic implementation and information extraction issues. The scenarios show that semantic interoperability can be achieved by mapping and extracting
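A minimal sketch of the dataset-to-RDF extraction step described above: one relational excavation record becomes a set of triples under a shared conceptual vocabulary. The namespaces, property names, and mapping are made-up placeholders, not the STAR/STELLAR mapping or actual CIDOC CRM terms:

```python
# Map one 'context' row from an excavation database into RDF-style triples.
# BASE and CRM are hypothetical namespaces standing in for a real CRM mapping.
BASE = "http://example.org/excavation/"
CRM = "http://example.org/crm/"

def record_to_triples(record):
    """Turn one database row into (subject, predicate, object) triples."""
    s = f"<{BASE}context/{record['id']}>"
    triples = [
        (s, f"<{CRM}type>", f'"{record["type"]}"'),
        (s, f"<{CRM}found_at>",
         f"<{BASE}site/{record['site'].replace(' ', '_')}>"),
    ]
    for find in record.get("finds", []):
        triples.append((s, f"<{CRM}contains_find>", f"<{BASE}find/{find}>"))
    return triples

row = {"id": 117, "type": "ditch fill", "site": "Raunds Roman",
       "finds": ["F201", "F202"]}
for s, p, o in record_to_triples(row):
    print(f"{s} {p} {o} .")
```

Once two databases with different schemas are mapped onto the same target vocabulary, a single query over the triples cross-searches both.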

  1. Model and Interoperability using Meta Data Annotations

    David, O.

    2011-12-01

    Software frameworks and architectures are in need of metadata to efficiently support model integration. Modelers have to know the context of a model, often stepping into modeling semantics and auxiliary information usually not provided in a concise structure and universal format consumable by a range of (modeling) tools. XML often seems the obvious solution for capturing metadata, but its wide adoption to facilitate model interoperability is limited by XML schema fragmentation, complexity, and verbosity outside of a data-automation process. Ontologies seem to overcome those shortcomings; however, the practical significance of their use remains to be demonstrated. OMS version 3 took a different approach to metadata representation. The fundamental building block of a modular model in OMS is a software component representing a single physical process, calibration method, or data access approach. Here, programming language features known as Annotations or Attributes were adopted. Within other (non-modeling) frameworks it has been observed that annotations lead to cleaner and leaner application code. Framework-supported model integration, traditionally accomplished using Application Programming Interface (API) calls, is now achieved using descriptive code annotations. Fully annotated components for various hydrological and Ag-system models now provide information directly for (i) model assembly and building, (ii) data flow analysis for implicit multi-threading or visualization, (iii) automated and comprehensive model documentation of component dependencies and physical data properties, (iv) automated model and component testing, calibration, and optimization, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Such a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework but a strong reference to its originating code. Since models and
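OMS 3 uses Java-style annotations; the same idea can be sketched in Python with decorators that attach metadata to a component class, so a framework reads descriptions and units instead of requiring bespoke API calls. The decorator names and attributes below are illustrative assumptions, not the OMS API:

```python
# Decorator-based metadata annotation, analogous to code annotations in a
# modeling framework. '_meta_*' attribute names are hypothetical.
def description(text):
    def wrap(cls):
        cls._meta_description = text
        return cls
    return wrap

def inputs(**units):
    def wrap(cls):
        cls._meta_inputs = units  # input name -> physical unit
        return cls
    return wrap

@description("Simple degree-day snowmelt component")
@inputs(temperature="degC", melt_factor="mm/degC/day")
class Snowmelt:
    def run(self, temperature, melt_factor=2.0):
        """Melt in mm/day under a degree-day assumption."""
        return max(0.0, temperature) * melt_factor

# A framework can now assemble, document, and test the component from its
# metadata alone:
print(Snowmelt._meta_description)
print(Snowmelt._meta_inputs["temperature"])   # degC
print(Snowmelt().run(temperature=5.0))        # 10.0
```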

  2. Product-driven Enterprise Interoperability for Manufacturing Systems Integration

    Dassisti, Michele; Panetto, Hervé; Tursi, Angela

    2006-01-01

    International audience The "Babel tower effect", induced by the heterogeneity of applications available in the operation of enterprises, leads to a consistent lack of "exchangeability" and a risk of semantic loss whenever cooperation has to take place within the same enterprise. Generally speaking, this kind of problem falls under the umbrella of interoperability between local reference information models. This position paper discusses some ideas in this field and traces a research roadmap to ...

  3. Interoperability does matter

    Manfred Goepel

    2006-04-01

    Full Text Available In companies, the historically developed IT systems are mostly application islands. They always produce good results if the system's requirements and surroundings are not changed and as long as a system interface is not needed. With the ever increasing dynamics and globalization of the market, however, these IT islands are certain to collapse. Interoperability (IO) is the bid of the hour, assuming the integration of users, data, applications and processes. In the following, important IO enablers such as ETL, EAI, and SOA will be examined on the basis of practicability. It will be shown that especially SOA produces a surge of interoperability that could rightly be referred to as IT evolution.

  4. Driving Innovation Through Interoperability

    John Weigelt

    2008-12-01

    Full Text Available Today's difficult economic environment provides a time of change where information technology matters more than ever. As business and service delivery leaders look to become even more effective and efficient in meeting their client's expectations, they are increasingly looking to electronic channels as an integral element of their business strategies. Regrettably, the ever increasing pace of technological change often disconnects the technology from the business requirements. This disconnection hides technology innovations from the business and has a broader impact of preventing business innovation. This article discusses the role service oriented architecture and interoperability can play in keeping an organization innovative and competitive. We also discuss Microsoft's interoperability principles, its commitment to its open source community, and the benefits of embracing openness as part of an organization's business strategy.

  5. Maturity model for enterprise interoperability

    Guédria, Wided; Naudet, Yannick; Chen, David

    2015-01-01

    Historically, progress occurs when entities communicate, share information and together create something that no one individually could do alone. Moving beyond people to machines and systems, interoperability is becoming a key factor of success in all domains. In particular, interoperability has become a challenge for enterprises, to exploit market opportunities, to meet their own objectives of cooperation or simply to survive in a growing competitive world where the networked enterprise is becoming a standard. Within this context, many research works have been conducted over the past few years and enterprise interoperability has become an important area of research, ensuring the competitiveness and growth of European enterprises. Among others, enterprises have to control their interoperability strategy and enhance their ability to interoperate. This is the purpose of the interoperability assessment. Assessing interoperability maturity allows a company to know its strengths and weaknesses in terms of interoperability with its current and potential partners, and to prioritise actions for improvement. The objective of this paper is to define a maturity model for enterprise interoperability that takes into account existing maturity models while extending the coverage of the interoperability domain. The assessment methodology is also presented. Both are demonstrated with a real case study.

  6. National Flood Interoperability Experiment

    Maidment, D. R.

    2014-12-01

    The National Flood Interoperability Experiment is led by the academic community in collaboration with the National Weather Service through the new National Water Center recently opened on the Tuscaloosa campus of the University of Alabama. The experiment will also involve the partners in IWRSS (Integrated Water Resources Science and Services), which include the USGS, the Corps of Engineers and FEMA. The experiment will address the following questions: (1) How can near-real-time hydrologic forecasting at high spatial resolution, covering the nation, be carried out using the NHDPlus or next generation geofabric (e.g. hillslope, watershed scales)? (2) How can this lead to improved emergency response and community resilience? (3) How can an improved interoperability framework support the first two goals and lead to sustained innovation in the research-to-operations process? The experiment will run from September 2014 through August 2015, in two phases. The mobilization phase from September 2014 until May 2015 will assemble the components of the interoperability framework. A Summer Institute to integrate the components will be held from June to August 2015 at the National Water Center involving faculty and students from the University of Alabama and other institutions coordinated by CUAHSI. It is intended that the insight that arises from this experiment will help lay the foundation for a new national scale, high spatial resolution, near-real-time hydrologic simulation system for the United States.

  7. Bringing Semantics to Web Services: The OWL-S Approach

    Martin, David; Paolucci, Massimo; McIlraith, Sheila; Burnstein, Mark; McDermott, Drew; McGuinness, Deborah; Parsia, Bijan; Payne, Terry R.; Sabou, Marta; Solanki, Monika; Srinivasan, Naveen; Sycara, Katia

    2004-01-01

    Service interface description languages such as WSDL, and related standards, are evolving rapidly to provide a foundation for interoperation between Web services. At the same time, Semantic Web service technologies, such as the Ontology Web Language for Services (OWL-S), are developing the means by which services can be given richer semantic specifications. Richer semantics can enable fuller, more flexible automation of service provision and use, and support the construction of more powerful ...

  8. ASP-SSN: An Effective Approach for Linking Semantic Social Networks

    Sanaa Kaddoura

    2012-11-01

    Full Text Available The dramatic increase of social networking sites forced web users to duplicate their identity on many of them. But the lack of interoperability and linkage between these social networks allowed users' information to be disseminated within walled-garden data islands. Achieving interoperability will contribute to the creation of a rich knowledge base that can be used for querying social networks and discovering some facts about social connections. This paper presents a new approach for linking semantic social networks (SSN). This approach is based on the Answer Set Programming (ASP) paradigm and Fuzzy Logic. An ASP-SSN reasoner is developed using the DLV answer set solver and tested on data sets exported from seven different semantic social networks. Fuzzy logic is used to assign a degree of truth to every discovered link. The proposed approach is simple, generic and intuitive.
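The fuzzy degree-of-truth idea can be sketched as a weighted aggregation of attribute similarities between two profiles on different networks. The similarity measures, weights, and profile fields are illustrative assumptions, not the ASP-SSN rules:

```python
# Assign a fuzzy truth degree in [0, 1] to a candidate identity link
# between two social-network profiles. Fields and weights are made up.
def jaccard(a, b):
    """Set overlap of two friend lists."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def link_degree(p, q):
    """Aggregate attribute similarities into one fuzzy degree of truth."""
    name_sim = 1.0 if p["name"].lower() == q["name"].lower() else 0.0
    friend_sim = jaccard(p["friends"], q["friends"])
    return 0.5 * name_sim + 0.5 * friend_sim

p = {"name": "Ana K.", "friends": {"bob", "carol", "dave"}}
q = {"name": "ana k.", "friends": {"bob", "carol", "erin"}}
print(round(link_degree(p, q), 2))  # 0.75
```

A reasoner would then keep links whose degree exceeds a threshold, carrying the degree along as the confidence of the discovered `sameAs` relation.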

  9. Principles of data integration and interoperability in the GEO Biodiversity Observation Network

    Saarenmaa, Hannu; Ó Tuama, Éamonn

    2010-05-01

    The goal of the Global Earth Observation System of Systems (GEOSS) is to link existing information systems into a global and flexible network to address nine areas of critical importance to society. One of these "societal benefit areas" is biodiversity and it will be supported by a GEOSS sub-system known as the GEO Biodiversity Observation Network (GEO BON). In planning the GEO BON, it was soon recognised that there are already a multitude of existing networks and initiatives in place worldwide. What has been lacking is a coordinated framework that allows for information sharing and exchange between the networks. Traversing across the various scales of biodiversity, in particular from the individual and species levels to the ecosystems level has long been a challenge. Furthermore, some of the major regions of the world have already taken steps to coordinate their efforts, but links between the regions have not been a priority until now. Linking biodiversity data to that of the other GEO societal benefit areas, in particular ecosystems, climate, and agriculture to produce useful information for the UN Conventions and other policy-making bodies is another need that calls for integration of information. Integration and interoperability are therefore a major theme of GEO BON, and a "system of systems" is very much needed. There are several approaches to integration that need to be considered. Data integration requires harmonising concepts, agreeing on vocabularies, and building ontologies. Semantic mediation of data using these building blocks is still not easy to achieve. Agreements on, or mappings between, the metadata standards that will be used across the networks is a major requirement that will need to be addressed early on. With interoperable metadata, service integration will be possible through registry of registries systems such as GBIF's forthcoming GBDRS and the GEO Clearinghouse. Chaining various services that build intermediate products using workflow

  10. Leveraging the Semantic Web for Adaptive Education

    Kravcik, Milos; Gasevic, Dragan

    2007-01-01

    In the area of technology-enhanced learning reusability and interoperability issues essentially influence the productivity and efficiency of learning and authoring solutions. There are two basic approaches how to overcome these problems--one attempts to do it via standards and the other by means of the Semantic Web. In practice, these approaches…

  11. K-Means Based Fingerprint Segmentation with Sensor Interoperability

    Yang Xiukun

    2010-01-01

    Full Text Available A critical step in an automatic fingerprint recognition system is the segmentation of fingerprint images. Existing methods are usually designed to segment fingerprint images originating from a certain sensor. Thus their performance is significantly affected when dealing with fingerprints collected by different sensors. This work studies the sensor interoperability of fingerprint segmentation algorithms, which refers to an algorithm's ability to adapt to the raw fingerprints obtained from different sensors. We empirically analyze the sensor interoperability problem, and effectively address the issue by proposing a k-means based segmentation method called SKI. SKI clusters foreground and background blocks of a fingerprint image based on the k-means algorithm, where a fingerprint block is represented by a 3-dimensional feature vector consisting of block-wise coherence, mean, and variance (abbreviated as CMV). SKI also employs morphological postprocessing to achieve favorable segmentation results. We perform SKI on each fingerprint to ensure sensor interoperability. The interoperability and robustness of our method are validated by experiments performed on a number of fingerprint databases obtained from various sensors.
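A pure-Python sketch of the clustering step: 2-means over block-wise (coherence, mean, variance) features separates ridge blocks from background. The feature values are made up, the deterministic seeding is a simplification, and the real SKI method adds morphological post-processing:

```python
# Cluster fingerprint blocks into foreground/background with 2-means over
# CMV feature vectors. Seeding with the first and last block keeps the
# sketch deterministic; real k-means would use random restarts.
def kmeans2(points, iters=20):
    centers = [points[0], points[-1]]
    for _ in range(iters):
        clusters = [[], []]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            clusters[d.index(min(d))].append(p)
        centers = [tuple(sum(x) / len(c) for x in zip(*c)) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# (coherence, mean, variance) per block: high coherence and variance
# suggest a ridge (foreground) area; values below are illustrative.
blocks = [(0.90, 120, 900), (0.85, 110, 850), (0.88, 115, 880),
          (0.10, 240, 30), (0.15, 235, 40), (0.12, 238, 35)]
centers, clusters = kmeans2(blocks)
print([len(c) for c in clusters])  # [3, 3]
```

Because the clustering adapts to each image's own feature statistics rather than to fixed sensor-specific thresholds, the same procedure carries over across sensors.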

  12. Interoperability between .Net framework and Python in Component way

    M. K. Pawar; Ravindra Patel; Dr. N. S. Chaudhari

    2013-01-01

    The objective of this work is to achieve interoperability of distributed objects based on CORBA middleware technology and standards. The distributed objects for the client-server technology are implemented in the C#.Net framework and the Python language. The interoperability results show the possibility of applications in which objects can communicate across different environments and different languages. It also analyses how to achieve client-server communication in heterogeneous environmen...

  13. Testing Virtual Private Network (VPN) Interoperability

    Tahir, Jemal

    2015-01-01

    While corporations are growing their businesses, they may demand additional remote branch offices in disparate locations. These remote offices need a connection to their central corporate network so as to get access to resources and services securely over the public network. To meet this demand, deploying Virtual Private Networks (VPNs) is an alternative technology. The primary objective of this final year project was to test secure VPN interoperability between two different vend...

  14. An Interoperable Cartographic Database

    Slobodanka Ključanin

    2007-05-01

    Full Text Available The concept of producing a prototype of an interoperable cartographic database is explored in this paper, including the possibilities of integrating different geospatial data into the database management system and visualizing them on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relational database, spatial analysis, and definition and visualization of the database content in the form of a map on the Internet.

  15. Inter-operability

    Building an internal gas market implies establishing harmonized rules for cross border trading between operators. To that effect, the European association EASEE-gas is carrying out standards and procedures, commonly called 'inter-operability'. Set up in 2002, the Association brings together all segments of the gas industry: producers, transporters, distributors, traders and shippers, suppliers, consumers and service providers. This workshop presents the latest status on issues such as barriers to gas trade in Europe, rules and procedures under preparation by EASEE-gas, and the implementation schedule of these rules by operators. This article gathers 5 presentations about this topic given at the gas conference

  16. Semantic Extraction for Multi-Enterprise Business Collaboration

    SUN Hongjun; FAN Yushun

    2009-01-01

    Semantic extraction is essential for semantic interoperability in multi-enterprise business collaboration environments. Although many studies on semantic extraction have been carried out, few have focused on how to precisely and effectively extract semantics from multiple heterogeneous data schemas. This paper presents a semi-automatic semantic extraction method based on a neutral representation format (NRF) for acquiring semantics from heterogeneous data schemas. As a unified syntax-independent model, NRF removes all the contingencies of heterogeneous data schemas from the original data environment. Conceptual extraction and keyword extraction are used to acquire the semantics from the NRF. Conceptual extraction entails constructing a conceptual model, while keyword extraction seeks to obtain the metadata. An industrial case is given to validate the approach. This method has good extensibility and flexibility. The results show that the method provides simple, accurate, and effective semantic interoperability in multi-enterprise business collaboration environments.

  17. Interoperability of heterogeneous distributed systems

    Zaschke, C.; Essendorfer, B.; Kerth, C.

    2016-05-01

    To achieve knowledge superiority in today's operations, interoperability is the key. Budget restrictions as well as the complexity and multiplicity of threats, combined with the fact that not single nations but whole areas are subject to attacks, force nations to collaborate and share information as appropriate. Multiple data and information sources produce different kinds of data, real time and non-real time, in different formats that are disseminated to the respective command and control level for further distribution. The data is most of the time highly sensitive and restricted in terms of sharing. The question is how to make this data available to the right people at the right time with the right granularity. The Coalition Shared Data concept aims to provide a solution to these questions. It has been developed within several multinational projects and evolved over time. A continuous improvement process was established and resulted in the adaptation of the architecture as well as the technical solution and the processes it supports. Starting from the idea of making use of existing standards, basing the concept on sharing of data through standardized interfaces and formats, and enabling metadata-based queries, the concept merged with a more sophisticated service-based approach. The paper addresses concepts for information sharing to facilitate interoperability between heterogeneous distributed systems. It introduces the methods that were used and the challenges that had to be overcome. Furthermore, the paper gives a perspective on how the concept could be used in the future and what measures have to be taken to successfully bring it into operations.

  18. A semantically rich and standardised approach enhancing discovery of sensor data and metadata

    Kokkinaki, Alexandra; Buck, Justin; Darroch, Louise

    2016-04-01

    The marine environment plays an essential role in the earth's climate. To enhance the ability to monitor the health of this important system, innovative sensors are being produced and combined with state of the art sensor technology. As the number of sensors deployed is continually increasing, it is a challenge for data users to find the data that meet their specific needs. Furthermore, users need to integrate diverse ocean datasets originating from the same or even different systems. Standards provide a solution to the above mentioned challenges. The Open Geospatial Consortium (OGC) has created Sensor Web Enablement (SWE) standards that enable different sensor networks to establish syntactic interoperability. When combined with widely accepted controlled vocabularies, they become semantically rich and semantic interoperability is achievable. In addition, Linked Data is the recommended best practice for exposing, sharing and connecting information on the Semantic Web using Uniform Resource Identifiers (URIs), Resource Description Framework (RDF) and RDF Query Language (SPARQL). As part of the EU-funded SenseOCEAN project, the British Oceanographic Data Centre (BODC) is working on the standardisation of sensor metadata enabling 'plug and play' sensor integration. Our approach combines standards, controlled vocabularies and persistent URIs to publish sensor descriptions, their data and associated metadata as 5 star Linked Data and OGC SWE (SensorML, Observations & Measurements) standard. Thus sensors become readily discoverable, accessible and useable via the web. Content and context based searching is also enabled since sensor descriptions are understood by machines. Additionally, sensor data can be combined with other sensor or Linked Data datasets to form knowledge. This presentation will describe the work done in BODC to achieve syntactic and semantic interoperability in the sensor domain. It will illustrate the reuse and extension of the Semantic Sensor
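The Linked Data publication step can be sketched by serializing a sensor description as N-Triples: a persistent URI for the sensor, a type statement, and a link to a controlled-vocabulary concept. The sensor URI and vocabulary concept below are placeholders, and the class/property names only approximate W3C SSN/SOSA terms rather than BODC's actual output:

```python
# Serialize a minimal sensor description as N-Triples. SENSOR and VOCAB
# are hypothetical URIs; SSN is the W3C Semantic Sensor Network namespace.
SENSOR = "<http://example.org/sensor/ctd-042>"
SSN = "http://www.w3.org/ns/ssn/"
VOCAB = "<http://example.org/vocab/PSAL>"   # stand-in for a vocabulary term

triples = [
    (SENSOR, "<http://www.w3.org/1999/02/22-rdf-syntax-ns#type>",
     f"<{SSN}System>"),
    (SENSOR, f"<{SSN}observes>", VOCAB),
    (SENSOR, "<http://www.w3.org/2000/01/rdf-schema#label>",
     '"CTD salinity sensor 042"'),
]

ntriples = "\n".join(f"{s} {p} {o} ." for s, p, o in triples)
print(ntriples)
```

Because the observed property is a URI into a shared vocabulary rather than free text, a SPARQL query can find every sensor observing the same concept regardless of which system published it.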

  19. The EuroGEOSS Brokering Framework for Multidisciplinary Interoperability

    Santoro, M.; Nativi, S.; Craglia, M.; Boldrini, E.; Vaccari, L.; Papeschi, F.; Bigagli, L.

    2011-12-01

    The Global Earth Observation System of Systems (GEOSS), envisioned by the group of eight most industrialized countries (G-8) in 2003, provides the indispensable framework to integrate Earth observation efforts at a global level. The European Commission also contributes to the implementation of the GEOSS through research projects funded from its Framework Programme for Research & Development. The EuroGEOSS (A European Approach to GEOSS) project was launched in May 2009 for a three-year period with the aim of supporting existing Earth Observing systems and applications interoperability and use within the GEOSS and INSPIRE frameworks. EuroGEOSS developed a multidisciplinary interoperability infrastructure for the three strategic areas of Drought, Forestry and Biodiversity; this operating capacity is currently being extended to other scientific domains (e.g. Climate Change, Water, Ocean, Weather, etc.). Central to the multidisciplinary infrastructure is the "EuroGEOSS Brokering Framework", which is based on a Brokered SOA (Service Oriented Architecture) approach. This approach extends the typical SOA archetype by introducing "expert" components: the Brokers. The Brokers provide the mediation and distribution functionalities needed to interconnect the distributed and heterogeneous resources characterizing a System of Systems (SoS) environment. Such a solution addresses significant shortcomings of present SOA implementations for global frameworks, such as multiple protocols and data models interoperability. Currently, the EuroGEOSS multidisciplinary infrastructure is composed of the following brokering components: 1. The Discovery Broker: providing harmonized discovery functionalities by mediating and distributing user queries against tens of heterogeneous services. 2. The Semantic Discovery Augmentation Component: enhancing the capabilities of the discovery broker with semantic query-expansion. 3. The Data Access Broker: enabling users to seamlessly

  20. Extending the GI Brokering Suite to Support New Interoperability Specifications

    Boldrini, E.; Papeschi, F.; Santoro, M.; Nativi, S.

    2014-12-01

    The GI brokering suite provides the discovery, access, and semantic Brokers (i.e. GI-cat, GI-axe, GI-sem) that empower a Brokering framework for multi-disciplinary and multi-organizational interoperability. The GI suite has been successfully deployed in the framework of several programmes and initiatives, such as European Union funded projects, NSF BCube, and the intergovernmental coordinated effort Global Earth Observation System of Systems (GEOSS). Each GI suite Broker facilitates interoperability for a particular functionality (i.e. discovery, access, semantic extension) among a set of brokered resources published by autonomous providers (e.g. data repositories, web services, semantic assets) and a set of heterogeneous consumers (e.g. client applications, portals, apps). A wide set of data models, encoding formats, and service protocols are already supported by the GI suite, such as the ones defined by international standardizing organizations like OGC and ISO (e.g. WxS, CSW, SWE, GML, netCDF) and by Community specifications (e.g. THREDDS, OpenSearch, OPeNDAP, ESRI APIs). Using the GI suite, resources published by a particular Community or organization through their specific technology (e.g. OPeNDAP/netCDF) can be transparently discovered, accessed, and used by different Communities utilizing their preferred tools (e.g. a GIS visualizing WMS layers). Since Information Technology is a moving target, new standards and technologies continuously emerge and are adopted in the Earth Science context too. Therefore, the GI Brokering suite was conceived to be flexible and accommodate new interoperability protocols and data models. For example, the GI suite has recently added support for widely used specifications introduced to implement Linked Data, the Semantic Web, and precise community needs. Among others, these include: DCAT: an RDF vocabulary designed to facilitate interoperability between Web data catalogs. 
CKAN: a data management system for data distribution, particularly used by

  1. Evaluation of Multistrategy Classifiers for Heterogeneous Ontology Matching On the Semantic Web

    PAN Le-yun; LIU Xiao-qiang; MA Fan-yuan

    2005-01-01

    On the Semantic Web, data interoperability and ontology heterogeneity are becoming ever more important issues. To resolve these problems, multiple classification methods can be used to learn the matching between ontologies. The paper uses a general statistical classification method to discover category features in data instances and uses the first-order learning algorithm FOIL to exploit the semantic relations among data instances. When using a multistrategy learning approach, a central problem is the evaluation of the multistrategy classifiers. The goal and the conditions of using multistrategy classifiers within ontology matching differ from those of general text classification. This paper describes a combination rule for multiple classifiers, called the Best Outstanding Champion, which is suitable for heterogeneous ontology mapping. Applied to the prediction results of the individual methods, the rule effectively accumulates the correct matchings of each single classifier. The experiments show that the approach achieves high accuracy on real-world domains.
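
    The combination rule itself is not spelled out in the record. As a rough illustration only, the sketch below (all names and scores are hypothetical) combines several matching classifiers by keeping the single prediction whose top score stands out most from its runner-up:

    ```python
    # Hypothetical sketch of combining several ontology-matching classifiers.
    # Each classifier scores candidate target concepts for one source concept;
    # the combination keeps the most "outstanding" prediction, i.e. the one
    # with the largest margin over its runner-up across all classifiers.

    def combine_champion(predictions):
        """predictions: one {target concept: score} dict per classifier."""
        best = None  # (margin, concept)
        for scores in predictions:
            ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
            top = ranked[0]
            runner = ranked[1] if len(ranked) > 1 else ("", 0.0)
            margin = top[1] - runner[1]
            if best is None or margin > best[0]:
                best = (margin, top[0])
        return best[1]

    # Toy scores: a statistical classifier and a FOIL-style rule learner.
    statistical = {"Journal": 0.62, "Article": 0.58}
    foil_rules  = {"Journal": 0.91, "Article": 0.30}
    print(combine_champion([statistical, foil_rules]))  # → Journal
    ```
    
    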

  2. An Approach towards Enterprise Interoperability Assessment

    Razavi, Mahsa; Aliee, Fereidoon Shams

    Enterprise Architecture (EA), as a discipline with numerous enterprise-wide models, can support decision making on enterprise-wide issues. In order to provide such support, EA models should be amenable to analysis of various utilities and quality attributes. This paper provides a method for EA interoperability analysis. The approach is based on the Analytical Hierarchy Process (AHP) and considers the situation of the enterprise when weighting the different criteria and sub-criteria of each utility. It proposes a quantitative method of assessing the interoperability achievement of different scenarios using AHP, based on the knowledge and experience of EA experts and domain experts, and helps in deciding between them. The applicability of the proposed approach is demonstrated using a practical case study.
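
    As a hedged illustration of the AHP step described above, the following sketch derives criteria weights from a pairwise-comparison matrix by power iteration; the criteria names and judgment values are invented for the example, not taken from the paper:

    ```python
    # Minimal AHP weighting sketch (criteria and judgments are illustrative).
    # Weights are the normalized principal eigenvector of the pairwise matrix,
    # computed here by plain power iteration.

    def ahp_weights(matrix, iters=100):
        n = len(matrix)
        w = [1.0 / n] * n
        for _ in range(iters):
            w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
            s = sum(w)
            w = [x / s for x in w]
        return w

    # Pairwise judgments on Saaty's 1-9 scale for three assumed criteria:
    # technical vs semantic vs organizational interoperability.
    pairwise = [
        [1.0,     3.0,     5.0],
        [1 / 3.0, 1.0,     3.0],
        [1 / 5.0, 1 / 3.0, 1.0],
    ]
    weights = ahp_weights(pairwise)
    print([round(w, 3) for w in weights])
    ```

    A scenario's interoperability score would then be the weight-weighted sum of its per-criterion ratings.
    
    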

  3. Flexible Language Interoperability

    Ekman, Torbjörn; Mechlenborg, Peter; Schultz, Ulrik Pagh

    2007-01-01

    Virtual machines raise the abstraction level of the execution environment at the cost of restricting the set of supported languages. Moreover, the ability of a language implementation to integrate with other languages hosted on the same virtual machine typically constrains the features of the language. In this paper, we present a highly flexible yet efficient approach to hosting multiple programming languages on an object-oriented virtual machine. Our approach is based on extending the interface of each class with language-specific wrapper methods, offering each language a tailored view of the Smalltalk object model, which provides interoperability for embedded versions of the Smalltalk, Java, and BETA programming languages.

  4. Neuro-Semantics and Semantics.

    Holmes, Stewart W.

    1987-01-01

    Draws distinctions between the terms semantics (dealing with such verbal parameters as dictionaries and "laws" of logic and rhetoric), general semantics (semantics, plus the complex, dynamic, organismal properties of human beings and their physical environment), and neurosemantics (names for relations-based input from the neurosensory system, and…

  5. Fusion is possible only with interoperability agreements; the GEOSS experience

    Percivall, G.

    2008-12-01

    Data fusion is defined for this session as the merging of disparate data sources for multidisciplinary study. Implicit in this definition is that the data consumer may not be intimately familiar with the data sources. In order to achieve fusion of the data, there must be generalized concepts that apply to both the data sources and the consumer, and those concepts must be implemented in our information systems. The successes of GEOSS depend on data and information providers accepting and implementing a set of interoperability arrangements, including technical specifications for collecting, processing, storing, and disseminating shared data, metadata, and products. GEOSS interoperability is based on non-proprietary standards, with preference to formal international standards. GEOSS requires a scientific basis for the collection, processing and interpretation of the data. Use of standards is a hallmark of a sound scientific basis. In order to communicate effectively to achieve data fusion, interoperability arrangements must be based upon sound scientific principles that have been implemented in efficient and effective tools. Establishing such interoperability arrangements depends upon social processes and technology. Through the use of interoperability arrangements based upon standards, GEOSS achieves data fusion in order to answer humanity's critical questions. Decision making in support of societal benefit areas depends upon data fusion in multidisciplinary settings.

  6. Interoperability of clinical decision-support systems and electronic health records using archetypes: a case study in clinical trial eligibility.

    Marcos, Mar; Maldonado, Jose A; Martínez-Salvador, Begoña; Boscá, Diego; Robles, Montserrat

    2013-08-01

    patient recruitment in the framework of a clinical trial for colorectal cancer screening. The utilisation of archetypes not only has proved satisfactory to achieve interoperability between CDSSs and EHRs but also offers various advantages, in particular from a data model perspective. First, the VHR/data models we work with are of a high level of abstraction and can incorporate semantic descriptions. Second, archetypes can potentially deal with different EHR architectures, due to their deliberate independence of the reference model. Third, the archetype instances we obtain are valid instances of the underlying reference model, which would enable e.g. feeding back the EHR with data derived by abstraction mechanisms. Lastly, the medical and technical validity of archetype models would be assured, since in principle clinicians should be the main actors in their development. PMID:23707417

  7. Semantic Web

    O'Hara, Kieron; Hall, Wendy

    2009-01-01

    The Semantic Web is a vision of a web of linked data, allowing querying, integration and sharing of data from distributed sources in heterogeneous formats, using ontologies to provide an associated and explicit semantic interpretation. The article describes the series of layered formalisms and standards that underlie this vision, and chronicles their historical and ongoing development. A number of applications, scientific and otherwise, academic and commercial, are reviewed. The Semantic Web ...

  8. Modeling and formal representation of geospatial knowledge for the Geospatial Semantic Web

    Huang, Hong; Gong, Jianya

    2008-12-01

    GML can only achieve geospatial interoperation at the syntactic level. However, in most occasions it is necessary to resolve differences of spatial cognition in the first place, so ontology was introduced to describe geospatial information and services. But it is obviously difficult and improper to let users find, match and compose services, especially when complicated business logics are involved. Currently, with the gradual introduction of Semantic Web technology (e.g., OWL, SWRL), the focus of the interoperation of geospatial information has shifted from the syntactic level to the semantic and even the automatic, intelligent level. In this way, the Geospatial Semantic Web (GSM) can be put forward as an augmentation to the Semantic Web that additionally includes geospatial abstractions as well as related reasoning, representation and query mechanisms. To advance the implementation of the GSM, we first attempt to construct a mechanism for the modeling and formal representation of geospatial knowledge, which are also the two most foundational phases in knowledge engineering (KE). Our attitude in this paper is quite pragmatic: we argue that geospatial context is a formal model of the discriminating environmental characteristics of geospatial knowledge, and that the derivation, understanding and use of geospatial knowledge are situated in geospatial context. Therefore, first, we put forward a primitive hierarchy of geospatial knowledge referencing first-order logic, formal ontologies, rules and GML. Second, a metamodel of geospatial context is proposed, and we use the modeling methods and representation languages of formal ontologies to process geospatial context. Third, we extend the Web Processing Service (WPS) to be compatible with local DLLs for geoprocessing and to possess inference capability based on OWL.

  9. Heterogeneous software system interoperability through computer-aided resolution of modeling differences

    Young, Paul E.

    2002-01-01

    Approved for public release; distribution is unlimited Meeting future system requirements by integrating existing stand-alone systems is attracting renewed interest. Computer communications advances, functional similarities in related systems, and enhanced information description mechanisms suggest that improved capabilities may be possible; but full realization of this potential can only be achieved if stand-alone systems are fully interoperable. Interoperability among independently devel...

  10. Approach for ontological modeling of database schema for the generation of semantic knowledge on the web

    Rozeva, Anna

    2015-11-01

    Currently there is a large quantity of content on web pages that is generated from relational databases. Conceptual domain models provide for the integration of heterogeneous content on the semantic level. The use of an ontology as the conceptual model of relational data sources makes them available to web agents and services and provides for the employment of ontological techniques for data access, navigation and reasoning. The achievement of interoperability between relational databases and ontologies enriches the web with semantic knowledge. The establishment of a semantic database conceptual model based on ontology facilitates the development of data integration systems that use the ontology as a unified global view. An approach for the generation of an ontologically based conceptual model is presented. The ontology representing the database schema is obtained by matching schema elements to ontology concepts. An algorithm for the matching process is designed. An infrastructure for the inclusion of mediation between database and ontology, bridging legacy data with formal semantic meaning, is presented. An implementation of the knowledge modeling approach on a sample database is performed.
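
    The matching algorithm itself is only summarized in the record. A minimal sketch of the element-matching step, assuming a simple name-similarity criterion (difflib's `ratio`) and hypothetical schema and concept names, might look like:

    ```python
    # Hypothetical name-based matcher: each schema element is mapped to the
    # ontology concept with the highest string similarity, if it clears a
    # threshold. The paper's actual algorithm is richer; this only
    # illustrates the element-to-concept matching step.
    from difflib import SequenceMatcher

    def match_schema_to_ontology(schema_elements, ontology_concepts, threshold=0.6):
        def sim(a, b):
            return SequenceMatcher(None, a.lower(), b.lower()).ratio()
        mapping = {}
        for elem in schema_elements:
            best = max(ontology_concepts, key=lambda c: sim(elem, c))
            if sim(elem, best) >= threshold:
                mapping[elem] = best
        return mapping

    result = match_schema_to_ontology(
        ["customer_name", "order_date"],
        ["CustomerName", "OrderDate", "Product"])
    print(result)  # → {'customer_name': 'CustomerName', 'order_date': 'OrderDate'}
    ```
    
    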

  11. A Research on E-learning Resources Construction Based on Semantic Web

    Rui, Liu; Maode, Deng

    Traditional e-learning platforms have the flaws that resources are usually difficult to query or locate, and that cross-platform sharing and interoperability are hard to realize. In the paper, the Semantic Web and metadata standards are discussed, and an e-learning system framework based on the Semantic Web is put forward to address the flaws of traditional e-learning platforms.

  12. Semantic Enterprise Optimizer and Coexistence of Data Models

    P. A. Sundararajan; Anupama Nithyanand; Subrahmanya, S. V.

    2012-01-01

    The authors propose a semantic ontology–driven enterprise data–model architecture for interoperability, integration, and adaptability for evolution, by autonomic agent-driven intelligent design of logical as well as physical data models in a heterogeneous distributed enterprise through its life cycle. An enterprise-standard ontology (in Web Ontology Language [OWL] and Semantic Web Rule Language [SWRL]) for data is required to enable an automated data platform that adds life-cycle activities t...

  13. Semantic Web,Agent and Network-Virtual Society%Semantic Web、Agent和网络虚拟社会

    戴欣; 申瑞民; 张同珍

    2003-01-01

    This paper tries to discuss one realizable mode of the SW (Semantic Web), called NVS (Network-Virtual Society). The SW is regarded as the next-generation Web. By adding semantics to the Web, the SW provides interoperability between applications and facilities to enable automated processing of Web resources. Agents will be the executors in this automated process. After analyzing related theories and technologies, we put forward the concept and mode of NVS, and give our reasons.

  14. On Coreference and the Semantic Web

    Glaser, Hugh; Lewy, Tim; Millard, Ian; Dowling, Ben

    2007-01-01

    Much of the Semantic Web relies upon open and unhindered interoperability between diverse systems; the successful convergence of multiple ontologies and referencing schemes is key. However, this is hampered by the difficult problem of coreference, which is the occurrence of multiple or inconsistent identifiers for a single resource. This paper investigates the origins of this phenomenon and how it is resolved in other fields. With this in mind, we have developed and tested an effective method...

  15. Semantic Web technologies in software engineering

    Gall, H.C.; Reif, G

    2008-01-01

    Over the years, the software engineering community has developed various tools to support the specification, development, and maintenance of software. Many of these tools use proprietary data formats to store artifacts, which hampers interoperability. However, the Semantic Web provides a common framework that allows data to be shared and reused across application, enterprise, and community boundaries. Ontologies are used to define the concepts in the domain of discourse and their relationships an...

  16. Semantic resource management and interoperability between distributed computing platforms

    Ejarque Artigas, Jorge

    2015-01-01

    Distributed Computing is the paradigm where the application execution is distributed across different computers connected by a communication network. Distributed Computing platforms have evolved very fast during the last decades: starting from Clusters, where a set of computers were working together in a single location; then evolving to the Grids, where computing resources are shared by different entities, creating a global computing infrastructure which is available to different user commun...

  17. Semantic interoperability in sensor applications : Making sense of sensor data

    Brandt, P.; Basten, T.; Stuijk, S.; Bui, V.; Clercq, P. de; Ferreira Pires, L.; Sinderen, M. van

    2013-01-01

    Much effort has been spent on the optimization of sensor networks, mainly concerning their performance and power efficiency. Furthermore, open communication protocols for the exchange of sensor data have been developed and widely adopted, making sensor data widely available for software applications

  18. Semantic Interoperability in Biomedicine and Healthcare III. Editorial

    Svačina, Š.; Zvárová, Jana

    2012-01-01

    Roč. 8, č. 5 (2012), s. 2-2. ISSN 1801-5603 Institutional support: RVO:67985807 Keywords : editorial Subject RIV: IN - Informatics, Computer Science http://www.ejbi.org/img/ejbi/2012/5/Editorial_en.pdf

  19. D-ATM, a working example of health care interoperability: From dirt path to gravel road.

    DeClaris, John-William

    2009-01-01

    For many years, there have been calls for interoperability within health care systems. The technology currently exists and is being used in business areas like banking and commerce, to name a few. Yet the question remains, why has interoperability not been achieved in health care? This paper examines issues encountered and success achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government funded interoperable patient management system. The goal of this paper is to provide lessons learned and propose one possible road map for health care interoperability within private industry and how government can help. PMID:19963614

  20. An ontological system for interoperable spatial generalisation in biodiversity monitoring

    Nieland, Simon; Moran, Niklas; Kleinschmit, Birgit; Förster, Michael

    2015-11-01

    Semantic heterogeneity remains a barrier to data comparability and standardisation of results in different fields of spatial research. Because of its thematic complexity, differing acquisition methods and national nomenclatures, interoperability of biodiversity monitoring information is especially difficult. Since data collection methods and interpretation manuals vary broadly, there is a need for automatised, objective methodologies for the generation of comparable data-sets. Ontology-based applications offer vast opportunities in data management and standardisation. This study examines two data-sets of protected heathlands in Germany and Belgium which are based on remote sensing image classification and semantically formalised in an OWL2 ontology. The proposed methodology uses semantic relations of the two data-sets, which are (semi-)automatically derived from remote sensing imagery, to generate objective and comparable information about the status of protected areas by utilising kernel-based spatial reclassification. This automatised method suggests a generalisation approach which is able to generate delineations of Special Areas of Conservation (SAC) of the European biodiversity network Natura 2000. Furthermore, it is able to transfer generalisation rules between areas surveyed with varying acquisition methods in different countries by taking into account automated inference of the underlying semantics. The generalisation results were compared with the manual delineation of terrestrial monitoring. For the different habitats in the two sites an accuracy of above 70% was detected. However, it has to be highlighted that the delineation of the ground-truth data carries a high degree of uncertainty, which is discussed in this study.
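
    The record does not give the kernel details. A minimal sketch of kernel-based spatial reclassification, assuming the simplest possible kernel (a 3x3 majority filter over hypothetical habitat classes), is:

    ```python
    # Illustrative 3x3 majority-filter reclassification on a tiny class raster.
    # The study's actual kernels and reclassification rules are more elaborate;
    # the habitat labels here are invented for the example.
    from collections import Counter

    def majority_filter(grid):
        rows, cols = len(grid), len(grid[0])
        out = [row[:] for row in grid]
        for r in range(rows):
            for c in range(cols):
                # Collect the 3x3 neighbourhood, clipped at the raster edges.
                neigh = [grid[i][j]
                         for i in range(max(0, r - 1), min(rows, r + 2))
                         for j in range(max(0, c - 1), min(cols, c + 2))]
                out[r][c] = Counter(neigh).most_common(1)[0][0]
        return out

    raster = [
        ["heath", "heath", "forest"],
        ["heath", "forest", "forest"],
        ["heath", "heath", "heath"],
    ]
    smoothed = majority_filter(raster)
    print(smoothed[1][1], smoothed[0][2])  # → heath forest
    ```

    Repeated application of such a kernel generalises speckled per-pixel classifications into coherent delineations.
    
    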

  1. The Effect of Adjunct Post-Questions, Metacognitive Process Prompts, Cognitive Feedback and Training in Facilitating Student Achievement from Semantic Maps

    Yamashiro, Kelly Ann C.; Dwyer, Francis

    2006-01-01

    The purpose of this study was to examine the instructional effectiveness of adjunct post-questions, metacognitive process prompts, cognitive feedback and training in complementing semantic maps. Two hundred seventy Taiwanese subjects were randomly assigned to eight treatments. After interacting with their respective treatments each completed three…

  2. Semantic Web for Manufacturing Web Services

    Kulvatunyou, Boonserm [ORNL; Ivezic, Nenad [ORNL

    2002-06-01

    As markets become unexpectedly turbulent with a shortened product life cycle and a power shift towards buyers, the need for methods to rapidly and cost-effectively develop products, production facilities and supporting software is becoming urgent. The use of a virtual enterprise plays a vital role in surviving turbulent markets. However, its success requires reliable and large-scale interoperation among trading partners via a semantic web of trading partners' services whose properties, capabilities, and interfaces are encoded in an unambiguous as well as computer-understandable form. This paper demonstrates a promising approach to integration and interoperation between a design house and a manufacturer by developing semantic web services for business and engineering transactions. To this end, detailed activity and information flow diagrams are developed, in which the two trading partners exchange messages and documents. The properties and capabilities of the manufacturer sites are defined using DARPA Agent Markup Language (DAML) ontology definition language. The prototype development of semantic webs shows that enterprises can widely interoperate in an unambiguous and autonomous manner; hence, virtual enterprise is realizable at a low cost.

  3. Potential interoperability problems facing multi-site radiation oncology centers in The Netherlands

    Scheurleer, J.; Koken, Ph; Wessel, R.

    2014-03-01

    Aim: To identify potential interoperability problems facing multi-site Radiation Oncology (RO) departments in the Netherlands and solutions for unambiguous multi-system workflows. Specific challenges confronting the RO department of VUmc (RO-VUmc), which is soon to open a satellite department, were characterized. Methods: A nationwide questionnaire survey was conducted to identify possible interoperability problems and solutions. Further detailed information was obtained by in-depth interviews at 3 Dutch RO institutes that already operate in more than one site. Results: The survey had a 100% response rate (n=21). Altogether 95 interoperability problems were described. Most reported problems were on a strategic and semantic level. The majority were DICOM(-RT) and HL7 related (n=65), primarily between treatment planning and verification systems or between departmental and hospital systems. Seven were identified as being relevant for RO-VUmc. Departments have overcome interoperability problems with their own, or with tailor-made vendor solutions. There was little knowledge about or utilization of solutions developed by Integrating the Healthcare Enterprise Radiation Oncology (IHE-RO). Conclusions: Although interoperability problems are still common, solutions have been identified. Awareness of IHE-RO needs to be raised. No major new interoperability problems are predicted as RO-VUmc develops into a multi-site department.

  5. Federated Spatial Databases and Interoperability

    2001-01-01

    It is a period of information explosion. Especially in spatial information science, information can be acquired in many ways, such as from man-made satellites, aeroplanes, laser scanning, digital photogrammetry and so on. Spatial data sources are usually distributed and heterogeneous. A federated database is the best resolution for the sharing and interoperation of spatial databases. In this paper, the concepts of federated database and interoperability are introduced. Three heterogeneous kinds of spatial data, vector, image and DEM, are used to create an integrated database. A data model of federated spatial databases is given

  6. CCP interoperability and system stability

    Feng, Xiaobing; Hu, Haibo

    2016-09-01

    To control counterparty risk, financial regulations such as the Dodd-Frank Act are increasingly requiring standardized derivatives trades to be cleared by central counterparties (CCPs). It is anticipated that in the near-term future, CCPs across the world will be linked through interoperability agreements that facilitate risk sharing but also serve as a conduit for transmitting shocks. This paper theoretically studies a network of CCPs that are linked through interoperability arrangements. The major finding is that different configurations of the CCP network give rise to different properties of the cascading failures.
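
    The paper's model is not reproduced in the record. As an illustrative sketch only, the following toy contagion model (the threshold rule and topology are invented for the example) shows how the configuration of interoperability links shapes cascading failures:

    ```python
    # Toy threshold-contagion sketch: a CCP fails once the fraction of its
    # failed interoperability partners reaches a threshold. This is NOT the
    # paper's model, only a generic illustration of cascades on a network.

    def cascade(links, seed, threshold=0.5):
        """links: {ccp: [partner, ...]}; returns the set of failed CCPs."""
        failed = {seed}
        changed = True
        while changed:
            changed = False
            for ccp, partners in links.items():
                if ccp in failed or not partners:
                    continue
                if sum(p in failed for p in partners) / len(partners) >= threshold:
                    failed.add(ccp)
                    changed = True
        return failed

    # A ring of four interoperable CCPs: one failure propagates all the way round.
    ring = {"A": ["B", "D"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C", "A"]}
    failed = cascade(ring, "A")
    print(sorted(failed))  # → ['A', 'B', 'C', 'D']
    ```
    
    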

  7. Some Notes on Interoperability of GNSS

    YANG Yuanxi

    2016-03-01

    Compatibility and interoperability of GNSS are hot research issues in the international satellite navigation field, and a requirement for integrated multi-GNSS navigation and positioning. The basic concepts of compatibility and interoperability are introduced and the trend towards interoperability among the GNSS providers is discussed. The status and problems of the frequency interoperability of GPS, BeiDou (BDS), GLONASS and Galileo are analyzed. It is pointed out that frequency interoperability problems will affect manufacturers and multi-GNSS users. The interoperability problems of the reference coordinate systems result not only from the definitions and realizations of the reference coordinate systems but also from their maintenance and update strategies. The effects of time datum interoperability and corresponding resolving strategies are also discussed. The influences of the interoperability problems of GNSS are summarized.

  8. Intercloud Architecture for Interoperability and Integration

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.J.; Ngo, C.

    2011-01-01

    This paper presents on-going research to develop the Intercloud Architecture (ICA) Framework that should address problems in multi-provider multi-domain heterogeneous Cloud based infrastructure services and applications integration and interoperability, including integration and interoperability wit

  9. Interoperability between .Net framework and Python in Component way

    M. K. Pawar

    2013-01-01

    The objective of this work is to achieve interoperability of distributed objects based on CORBA middleware technology and standards. The distributed objects for the client-server technology are implemented in the C#.Net framework and the Python language. The interoperability result shows the possibility of applications in which objects can communicate across different environments and different languages. It also analyses how to achieve client-server communication in a heterogeneous environment using the OmniORBpy IDL compiler and the IIOP.NET IDLtoCLS mapping. The results obtained demonstrate the interoperability between the .Net Framework and the Python language. This paper also summarizes a set of fairly simple examples using some reasonably complex software tools.

  10. A Review of Ontologies with the Semantic Web in View.

    Ding, Ying

    2001-01-01

    Discusses the movement of the World Wide Web from the first generation to the second, called the Semantic Web. Provides an overview of ontology, a philosophical theory about the nature of existence being applied to artificial intelligence that will have a crucial role in enabling content-based access, interoperability, and communication across the…

  11. The interoperability force in the ERP field

    Boza Garcia, Andres; Cuenca, L.; Poler Escoto, Raúl; Michaelides, Zenon

    2015-01-01

    Enterprise resource planning (ERP) systems participate in interoperability projects, and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To this end, ERP systems have first been identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements have been identified from two perspectives: technological...

  12. Semantic Desktop

    Sauermann, Leo; Kiesel, Malte; Schumacher, Kinga; Bernardi, Ansgar

    This contribution shows what the workplace of the future could look like and where the Semantic Web opens up new possibilities. To this end, approaches from the areas of the Semantic Web, knowledge representation, desktop applications and visualization are presented that enable us to reinterpret and reuse a user's existing data. The combination of the Semantic Web and desktop computers brings particular advantages, a paradigm known under the title Semantic Desktop. The described possibilities of application integration are, however, not limited to the desktop, but can equally be employed in web applications.

  13. Understanding semantics

    Thrane, Torben

    1997-01-01

    Understanding natural language is a cognitive, information-driven process. Discussing some of the consequences of this fact, the paper offers a novel look at the semantic effect of lexical nouns and the identification of reference types.

  14. Comparison Latent Semantic and WordNet Approach for Semantic Similarity Calculation

    Wicaksana, I Wayan Simri

    2011-01-01

    Information exchange among the many sources on the Internet is increasingly autonomous, dynamic and free. This situation drives differing views of concepts among sources. For example, the word 'bank' means an economic institution in the economy domain, but in the ecology domain it is defined as the slope of a river or lake. In this paper, we evaluate latent semantic and WordNet approaches to calculating semantic similarity. The evaluation is run for several concepts from different domains, with reference judgments by experts. The results of the evaluation can contribute to concept mapping, query rewriting, interoperability, etc.
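
    As a hedged sketch of the vector-space side of such a comparison (real latent semantic analysis would first apply SVD to a term-document matrix; the toy term vectors here are invented), cosine similarity can disambiguate the two senses of 'bank' against a query context:

    ```python
    # Toy cosine-similarity sketch over sparse term-frequency vectors.
    # The vocabulary and counts are illustrative, not from the paper.
    from math import sqrt

    def cosine(u, v):
        dot = sum(u[t] * v.get(t, 0) for t in u)
        norm = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
        return dot / norm if norm else 0.0

    # Two domain profiles of "bank" and a query context mentioning loans.
    economy = {"money": 3, "loan": 2, "bank": 4}
    ecology = {"river": 3, "slope": 2, "bank": 4}
    query   = {"loan": 1, "money": 1}
    print(cosine(query, economy) > cosine(query, ecology))  # → True
    ```
    
    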

  15. Auto-Generated Semantic Processing Services

    Davis, Rodney; Hupf, Greg

    2009-01-01

    Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating- system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer- based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: In effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms is replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.
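
    AGSP generates adapter programs from metadata specifications. As a simplified runtime analogue (the field names, units, and spec format below are hypothetical, not AGSP's actual model), a declarative mapping can drive message translation between two formats:

    ```python
    # Hypothetical sketch of a "semantic adapter" driven by a declarative
    # mapping spec: target field -> (source field, conversion function).
    # AGSP generates such adapters as code; this builds one at runtime.

    def make_adapter(spec):
        def adapt(message):
            return {dst: conv(message[src]) for dst, (src, conv) in spec.items()}
        return adapt

    # Map a legacy Fahrenheit message onto a modern Celsius schema.
    legacy_to_modern = {
        "temperature_c": ("tempF", lambda f: round((f - 32) * 5 / 9, 1)),
        "station_id":    ("stn",   str),
    }
    adapter = make_adapter(legacy_to_modern)
    converted = adapter({"tempF": 68.0, "stn": 42})
    print(converted)  # → {'temperature_c': 20.0, 'station_id': '42'}
    ```
    
    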

  16. The MMI Semantic Framework: Rosetta Stones for Earth Sciences

    Rueda, C.; Bermudez, L. E.; Graybeal, J.; Alexander, P.

    2009-12-01

    Semantic interoperability—the exchange of meaning among computer systems—is needed to successfully share data in Ocean Science and across all Earth sciences. The best approach toward semantic interoperability requires a designed framework, and operationally tested tools and infrastructure within that framework. Currently available technologies make a scientific semantic framework feasible, but its development requires sustainable architectural vision and development processes. This presentation outlines the MMI Semantic Framework, including recent progress on it and its client applications. The MMI Semantic Framework consists of tools, infrastructure, and operational and community procedures and best practices, to meet short-term and long-term semantic interoperability goals. The design and prioritization of the semantic framework capabilities are based on real-world scenarios in Earth observation systems. We describe some key use cases, as well as the associated requirements for building the overall infrastructure, which is realized through the MMI Ontology Registry and Repository. This system includes support for community creation and sharing of semantic content, ontology registration, version management, and seamless integration of user-friendly tools and application programming interfaces. The presentation describes the architectural components for semantic mediation, and the registry and repository for vocabularies, ontologies, and term mappings. We show how the technologies and approaches in the framework can address community needs for managing and exchanging semantic information. We will demonstrate how different types of users and client applications exploit the tools and services for data aggregation, visualization, archiving, and integration. Specific examples from OOSTethys (http://www.oostethys.org) and the Ocean Observatories Initiative Cyberinfrastructure (http://www.oceanobservatories.org) will be cited. Finally, we show how semantic augmentation of web

  17. Open Health Tools: Tooling for Interoperable Healthcare

    Skip McGaughey

    2008-11-01

    Full Text Available The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will 'raise the interoperability bar' as a result of having tools that just work. To achieve these lofty goals, careful consideration must be given to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT's impact on these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase 'code is king' underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  18. Automated testing of healthcare document transformations in the PICASSO interoperability platform

    Pascale, Massimo; Roselli, Marcello; Rugani, Umberto; Bartolini, Cesare; Bertolino, Antonia; Lonetti, Francesca; Marchetti, Eda; Polini, Andrea

    2009-01-01

    In every application domain, achieving interoperability among heterogeneous information systems is a crucial challenge, and alliances are formed to standardize data-exchange formats. In the healthcare sector, HL7-V3 provides the current international reference models for clinical and administrative documents. Codices, an Italian company, provides the PICASSO platform, which uses HL7-V3 as the pivot format to rapidly achieve a high degree of interoperability among health-related applicat...

  19. The interoperability force in the ERP field

    Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon

    2015-04-01

    Enterprise resource planning (ERP) systems participate in interoperability projects, and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To this end, ERP systems have first been identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements have been identified from two perspectives: technological and business. The ERP field is evolving from classical ERP as information-system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way business is run, and ERP systems are changing to adapt to the current stream of interoperability.

  20. Maintaining Interoperability in a Target-Rich Environment

    Ng, Mei Ling Venessa

    2012-01-01

    Achieving interoperability in a net-centric environment is fundamental to maximizing the potential of information sharing and effective use of resources in military operations. With the increasing reliance on unmanned platforms worldwide, there is a need to study the limitations of existing Command and Control (C2) Systems in dealing with the increasing number of objects. More processing power would be required to achieve or maintain a certain level of efficiency and effectiveness of the C2 s...

  1. Smart Spaces and Smart Objects interoperability Architecture (S3OiA)

    Vega Barbas, Mario; Valero Duboy, Miguel Ángel; Casado Mansilla, Diego; López de Ipiña, Diego; Bravo Rodríguez, José; Flórez Revuelta, Francisco

    2012-01-01

    The presented work aims to contribute towards the standardization and the interoperability of the Future Internet through an open and scalable architecture design. We present S³OiA as a syntactic/semantic Service-Oriented Architecture that allows the integration of any type of object or device, regardless of its nature, on the Internet of Things. Moreover, the architecture makes possible the use of underlying heterogeneous resources as a substrate for the automatic composition of complex a...

  2. Opportunities for the Mashup of Heterogeneous Data Server via Semantic Web Technology

    Ritschel, Bernd; Seelus, Christoph; Neher, Günther; Iyemori, Toshihiko; Koyama, Yukinobu; Yatagai, Akiyo; Murayama, Yasuhiro; King, Todd; Hughes, John; Fung, Shing; Galkin, Ivan; Hapgood, Michael; Belehaki, Anna

    2015-04-01

    Opportunities for the Mashup of Heterogeneous Data Server via Semantic Web Technology. The European Union ESPAS, Japanese IUGONET and GFZ ISDC data servers were developed for the ingestion, archiving and distribution of geo- and space-science domain data. The main parts of the data managed by these servers are related to near-Earth space and geomagnetic field data. A smart mashup of the data servers would allow seamless browsing of, and access to, data and related context information. However, achieving a high level of interoperability is a challenge because the data servers are based on different data models and software frameworks. This paper focuses on the latest experiments and results for the mashup of the data servers using the Semantic Web approach. Besides the mashup of domain and terminological ontologies, especially the options to connect data managed by relational databases using D2R Server and SPARQL technology will be addressed. A successful realization of the data server mashup will not only have a positive impact on the data users of the specific scientific domain but also on related projects, such as the development of a new interoperable version of NASA's Planetary Data System (PDS) or ICSU's World Data System alliance. ESPAS data server: https://www.espas-fp7.eu/portal/ IUGONET data server: http://search.iugonet.org/iugonet/ GFZ ISDC data server (semantic Web based prototype): http://rz-vm30.gfz-potsdam.de/drupal-7.9/ NASA PDS: http://pds.nasa.gov ICSU-WDS: https://www.icsu-wds.org
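
    The mediation that a terminological ontology performs in such a mashup can be sketched as term normalization into a shared pivot vocabulary; the records and the term mapping below are invented stand-ins for what the real servers would expose via D2R and SPARQL.

```python
# Hypothetical records from two data servers that label the same quantity
# with different vocabulary terms.
espas_like = [{"parameter": "geomagnetic_field", "station": "KAK", "value": 45123}]
iugonet_like = [{"obs_type": "mag_field", "site": "KAK", "data": 45130}]

# A minimal term mapping playing the role of the mediating ontology.
term_map = {"obs_type": "parameter", "site": "station", "data": "value",
            "mag_field": "geomagnetic_field"}

def normalize(record, mapping):
    """Rewrite keys and string values into the common (pivot) vocabulary."""
    out = {}
    for k, v in record.items():
        out[mapping.get(k, k)] = mapping.get(v, v) if isinstance(v, str) else v
    return out

# The "mashup": one homogeneous record list, browsable with a single query.
merged = espas_like + [normalize(r, term_map) for r in iugonet_like]
print(merged[1]["parameter"])  # geomagnetic_field
```

    After normalization, both servers' records answer the same query, which is the seamless browse-and-access goal the abstract describes.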

  3. Developing Interoperable Online Backup Software

    Nida, Dawit

    2011-01-01

    With ever-increasing amounts of digital data, various data storage techniques can be applied to overcome and minimize the risk of losing a single file or the whole system's data. Data can be stored using different mechanisms, including online backup. The main objective of this project was to design and implement interoperable online backup software, initiated by the Green Spot Media Farm company residing in Helsinki, Finland. In addition, this documentation focuses on establishing a fundamental...

  4. Semantic SenseLab: implementing the vision of the Semantic Web in neuroscience

    Samwald, Matthias; Chen, Huajun; Ruttenberg, Alan; Lim, Ernest; Marenco, Luis; Miller, Perry; Shepherd, Gordon; Cheung, Kei-Hoi

    2011-01-01

    Summary Objective Integrative neuroscience research needs a scalable informatics framework that enables semantic integration of diverse types of neuroscience data. This paper describes the use of the Web Ontology Language (OWL) and other Semantic Web technologies for the representation and integration of molecular-level data provided by several databases of the SenseLab suite of neuroscience databases. Methods Based on the original database structure, we semi-automatically translated the databases into OWL ontologies with manual addition of semantic enrichment. The SenseLab ontologies are extensively linked to other biomedical Semantic Web resources, including the Subcellular Anatomy Ontology, the Brain Architecture Management System, the Gene Ontology, BIRNLex and UniProt. The SenseLab ontologies have also been mapped to the Basic Formal Ontology and the Relation Ontology, which helps ease interoperability with many other existing and future biomedical ontologies for the Semantic Web. In addition, approaches to representing contradictory research statements are described. The SenseLab ontologies are designed for use on the Semantic Web, which enables their integration into a growing collection of biomedical information resources. Conclusion We demonstrate that our approach can yield significant potential benefits and that the Semantic Web is rapidly becoming mature enough to realize its anticipated promises. The ontologies are available online at http://neuroweb.med.yale.edu/senselab/ PMID:20006477

  5. PROPOSED CONCEPTUAL DEVELOPMENT LEVELS FOR IDEAL INTEROPERABILITY AND SECURITY IN MODERN DIGITAL GOVERNMENT

    Md.Headayetullah

    2010-06-01

    protocol and ideal interoperability are considered the imperative issues for achieving a sophisticated phase of modern digital government.

  6. Vocabulary services to support scientific data interoperability

    Cox, Simon; Mills, Katie; Tan, Florence

    2013-04-01

    Shared vocabularies are a core element in interoperable systems. Vocabularies need to be available at run-time, and where the vocabularies are shared by a distributed community this implies the use of web technology to provide vocabulary services. Given the ubiquity of vocabularies or classifiers in systems, vocabulary services are effectively the base of the interoperability stack. In contemporary knowledge organization systems, a vocabulary item is considered a concept, with the "terms" denoting it appearing as labels. The Simple Knowledge Organization System (SKOS) formalizes this as an RDF Schema (RDFS) application, with a bridge to formal logic in Web Ontology Language (OWL). For maximum utility, a vocabulary should be made available through the following interfaces: * the vocabulary as a whole - at an ontology URI corresponding to a vocabulary document * each item in the vocabulary - at the item URI * summaries, subsets, and resources derived by transformation * through the standard RDF web API - i.e. a SPARQL endpoint * through a query form for human users. However, the vocabulary data model may be leveraged directly in a standard vocabulary API that uses the semantics provided by SKOS. SISSvoc3 [1] accomplishes this as a standard set of URI templates for a vocabulary. Any URI conforming to the template selects a vocabulary subset based on the SKOS properties, including labels (skos:prefLabel, skos:altLabel, rdfs:label) and a subset of the semantic relations (skos:broader, skos:narrower, etc). SISSvoc3 thus provides a RESTful SKOS API to query a vocabulary, while hiding the complexity of SPARQL. It has been implemented using the Linked Data API (LDA) [2], which connects to a SPARQL endpoint. By using LDA, we also get content-negotiation, alternative views, paging, metadata and other functionality provided in a standard way. A number of vocabularies have been formalized in SKOS and deployed by CSIRO, the Australian Bureau of Meteorology (BOM) and their
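
    The kind of SKOS-based lookup a SISSvoc endpoint answers can be sketched with a toy in-memory vocabulary; the URIs and labels below are invented, and a real deployment resolves the equivalent URI-template queries via SPARQL over the vocabulary graph.

```python
# Toy SKOS-like vocabulary: concept URI -> labels and broader relations.
VOCAB = {
    "http://example.org/voc/rock/basalt": {
        "prefLabel": "basalt", "altLabel": [],
        "broader": ["http://example.org/voc/rock/igneous"]},
    "http://example.org/voc/rock/igneous": {
        "prefLabel": "igneous rock", "altLabel": ["magmatic rock"],
        "broader": []},
}

def concepts_with_label(label):
    """Mimic a .../concept?anylabel=... template: match prefLabel or altLabel."""
    label = label.lower()
    return [uri for uri, c in VOCAB.items()
            if c["prefLabel"].lower() == label
            or label in (a.lower() for a in c["altLabel"])]

def broader(uri):
    """Follow skos:broader one step up the concept hierarchy."""
    return VOCAB[uri]["broader"]

print(concepts_with_label("magmatic rock"))  # matched via skos:altLabel
print(broader("http://example.org/voc/rock/basalt"))
```

    A client never writes SPARQL: it asks for concepts by label or relation, which is the complexity-hiding point of the SISSvoc URI templates.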

  7. A state-of-the-art review of interoperability amongst heterogeneous software systems

    Carlos Mario Zapata Jaramillo

    2010-05-01

    Full Text Available Information systems are sets of interacting elements aimed at supporting entrepreneurial or business activities; they thus cannot coexist in isolation but require their data to be shared so as to increase their productivity. Such systems’ interoperability is normally accomplished through mark-up standards, query languages and web services. The literature contains work related to software system interoperability; however, it presents some difficulties, such as the need for using the same platforms and different programming languages, the use of read-only languages and the deficiencies in the formalism used for achieving it. This paper presents a critical review of the advances made regarding heterogeneous software systems’ interoperability.

  8. OGC Geographic Information Service Deductive Semantic Reasoning Based on Description Vocabularies Reduction

    MIAO Lizhi; Xu, Jie; Zhou, Ya; CHENG Wenchao

    2015-01-01

    As geographic information interoperability and sharing develop, more and more interoperable OGC (Open Geospatial Consortium) Web services (OWS) are generated and published through the internet. These services can facilitate the integration of different scientific applications by searching, finding, and utilizing the large number of scientific data and Web services. However, these services are widely dispersed and hard to find and utilize via effective semantic retrieval. This is espe...

  9. K-Means Based Fingerprint Segmentation with Sensor Interoperability

    Xiukun Yang

    2010-01-01

    Full Text Available A critical step in an automatic fingerprint recognition system is the segmentation of fingerprint images. Existing methods are usually designed to segment fingerprint images originated from a certain sensor. Thus their performances are significantly affected when dealing with fingerprints collected by different sensors. This work studies the sensor interoperability of fingerprint segmentation algorithms, which refers to the algorithm's ability to adapt to the raw fingerprints obtained from different sensors. We empirically analyze the sensor interoperability problem, and effectively address the issue by proposing a k-means based segmentation method called SKI. SKI clusters foreground and background blocks of a fingerprint image based on the k-means algorithm, where a fingerprint block is represented by a 3-dimensional feature vector consisting of block-wise coherence, mean, and variance (abbreviated as CMV). SKI also employs morphological postprocessing to achieve favorable segmentation results. We perform SKI on each fingerprint to ensure sensor interoperability. The interoperability and robustness of our method are validated by experiments performed on a number of fingerprint databases which are obtained from various sensors.
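
    The clustering step can be sketched with a plain k-means over toy (coherence, mean, variance) feature values; the numbers are invented, and the real SKI method additionally applies the morphological post-processing mentioned above, which is omitted here.

```python
import random

def dist2(p, q):
    """Squared Euclidean distance between two feature vectors."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans(points, k=2, iters=20, seed=0):
    """Plain k-means: returns a cluster label for each point."""
    centers = random.Random(seed).sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest center.
        labels = [min(range(k), key=lambda c: dist2(p, centers[c])) for p in points]
        # Update step: recompute each center as the mean of its members.
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centers[c] = tuple(sum(x) / len(members) for x in zip(*members))
    return labels

# Hypothetical (coherence, mean, variance) blocks: foreground blocks have high
# coherence and variance; background blocks are flat.
blocks = [(0.90, 120, 900), (0.85, 110, 850), (0.88, 130, 870),  # foreground-like
          (0.10, 240, 20), (0.15, 235, 25), (0.12, 250, 15)]     # background-like
labels = kmeans(blocks, k=2)
print(labels)
```

    The two block populations are well separated in the variance dimension, so k-means recovers the foreground/background split without sensor-specific thresholds, which is the interoperability argument of the paper.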

  10. HEALTH SYSTEMS INTEROPERABILITY: ANALYSIS AND COMPARISON

    Guedria, Wided; Lamine, Elyes; Pingaud, Hervé

    2014-01-01

    Promoting eHealth interoperability is a priority in Europe to enhance the quality and safety of patient care. However, this priority is very difficult to establish. Developing an interoperable system, or controlling systems interoperation, has been approached from multiple points of view, with many dimensions and under various types of approaches. Several studies and initiatives have been proposed in the fi...

  11. Interoperability Issues for VPN IPsec Solutions

    Iulian Danalachi

    2011-03-01

    Full Text Available An issue of testing that should be taken into consideration is the compatibility and interoperability of the IPsec components when implementing an IPsec solution. This article will guide us through some key introductory notions involved in the interoperability problem; we will see a short overview of some of these problems and afterwards discuss some of the testing solutions for IPsec interoperability that should be taken into consideration.

  12. HTML5 microdata as a semantic container for medical information exchange.

    Kimura, Eizen; Kobayashi, Shinji; Ishihara, Ken

    2014-01-01

    Achieving interoperability between clinical electronic medical records (EMR) systems and cloud computing systems is challenging because of the lack of a universal reference method as a standard for information exchange with a secure connection. Here we describe an information exchange scheme using HTML5 microdata, where the standard semantic container is an HTML document. We embed HL7 messages describing laboratory test results in the microdata. We also annotate items in the clinical research report with the microdata. We mapped the laboratory test result data into the clinical research report using an HL7 selector specified in the microdata. This scheme can provide secure cooperation between the cloud-based service and the EMR system. PMID:25160218
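
    The microdata mechanism itself can be illustrated with Python's stdlib html.parser; the item type URL and property names below are invented for illustration, not the HL7 annotations used by the authors.

```python
from html.parser import HTMLParser

# A toy document in the spirit of the scheme: a lab result annotated with
# HTML5 microdata (itemscope/itemtype/itemprop).
DOC = """
<div itemscope itemtype="http://example.org/LabResult">
  <span itemprop="test">Hemoglobin</span>
  <span itemprop="value">13.5</span>
  <span itemprop="unit">g/dL</span>
</div>
"""

class MicrodataParser(HTMLParser):
    """Collect itemprop name/text pairs from a single microdata item."""
    def __init__(self):
        super().__init__()
        self.current = None   # itemprop name awaiting its text content
        self.item = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if "itemprop" in attrs:
            self.current = attrs["itemprop"]

    def handle_data(self, data):
        if self.current and data.strip():
            self.item[self.current] = data.strip()
            self.current = None

parser = MicrodataParser()
parser.feed(DOC)
print(parser.item)
```

    Because the container is plain HTML, the same document stays human-readable in a browser while a machine extracts the structured payload, which is the interoperability point of the scheme.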

  13. A SEMANTICALLY DISTRIBUTED APPROACH TO MAP IP TRAFFIC MEASUREMENTS TO A STANDARDIZED ONTOLOGY

    Alfredo Salvador

    2010-01-01

    Full Text Available Traffic monitoring in IP networks is a key issue for operators to guarantee Service Level Agreements both to their clients and with regard to other connectivity providers. Thus, having efficient solutions for traffic measurement and monitoring supports a good deal of their business and is essential to the fair development of the Internet. However, even if service management is well recognized, QoS strategies must evolve from the circuit-switching technological framework towards next-generation networks and convergent-services concepts. Standardizing IP traffic measurement is a requirement for interoperable service-aware management systems upon which future Internet business would be based. A few projects have recently tackled the task of building rich infrastructures to provide IP traffic measurements. The European project MOMENT approach combines SOA and semantic search concepts: a mediator between clients and measurement tools has been designed in order to offer integrated access to the infrastructures, regardless of their specific details, with the possibility of achieving complex queries. Pervasiveness of ontologies has been used for various purposes in the project. As such, one ontology deals with traffic measurement data, another one describes metadata that is used instead of data for practical reasons, a third one focuses on anonymization required for ethical (and legal) restrictions, and the last one describes general concepts from the field. This paper outlines the role of these ontologies and presents the process to achieve them from a set of traffic measurement databases, as well as the integration of specific modules in the mediator to achieve the semantic queries.

  14. Semantic Annotation: The Mainstay of Semantic Web

    Slimani, Thabet

    2013-01-01

    Given that Semantic Web realization is based on a critical mass of metadata accessibility and the representation of data with formal knowledge, it needs to generate metadata that is specific, easy to understand and well-defined. Accordingly, semantic annotation of web documents is the most promising way to make the Semantic Web vision a reality. This paper introduces the Semantic Web and its vision (stack layers) with regard to some concept definitions that help the understanding of semantic a...

  15. Interoperability for Space Mission Monitor and Control: Applying Technologies from Manufacturing Automation and Process Control Industries

    Jones, Michael K.

    1998-01-01

    Various issues associated with interoperability for space mission monitor and control are presented in viewgraph form. Specific topics include: 1) Space Project Mission Operations Control Architecture (SuperMOCA) goals and methods for achieving them; 2) Specifics on the architecture: open standards and layering, enhancing interoperability, and promoting commercialization; 3) An advertisement; 4) Status of the task - government/industry cooperation and architecture and technology demonstrations; and 5) Key features of messaging services and virtual devices.

  16. Information interoperability and information standardisation for NATO C2 - a practical approach

    Lasschuyt, E.; Hekken, M.C. van

    2001-01-01

    Interoperability between information systems is usually 'achieved' by enabling connection at network level. Making systems really interoperable, by letting them understand and manipulate the exchanged information, requires a lot more. Above all, information standards are needed in order to gain common understanding about what will be exchanged. Besides that, information standardisation should be considered from a global point of view, taking into account the whole range of systems that will p...

  17. On the use of an Interoperability Framework in Coopetition Context

    Guédria, Wided; Golnam, Arash; Naudet, Yannick; Chen, David; Wegmann, Alain

    2011-01-01

    The simultaneous cooperation and competition between companies, referred to as coopetition in the strategy literature, is becoming a recurring theme in business settings. Companies cooperate with their competitors to gain access to supplementary and complementary resources and capabilities, in order to create more value for customers and to achieve sustainable value creation and distribution. To coopete, companies need to be interoperable. Growing globalization, competitiveness ...

  18. Interoperability Infrastructure and Incremental learning for unreliable heterogeneous communicating Systems

    Haseeb, Abdul

    2009-01-01

    In a broader sense, the main research objective of this thesis (and ongoing research work) is distributed knowledge management for mobile dynamic systems. The primary focus of the presented work, however, is the communication/interoperability of heterogeneous entities in an infrastructure-less paradigm, a distributed resource-manipulation infrastructure, and distributed learning in the absence of global knowledge. The research objectives achieved explore the design aspects of heterogeneous distribu...

  19. Integrated semantics service platform for the Internet of Things: a case study of a smart office.

    Ryu, Minwoo; Kim, Jaeho; Yun, Jaeseok

    2015-01-01

    The Internet of Things (IoT) allows machines and devices in the world to connect with each other and generate a huge amount of data, which has a great potential to provide useful knowledge across service domains. Combining the context of IoT with semantic technologies, we can build integrated semantic systems to support semantic interoperability. In this paper, we propose an integrated semantic service platform (ISSP) to support ontological models in various IoT-based service domains of a smart city. In particular, we address three main problems for providing integrated semantic services together with IoT systems: semantic discovery, dynamic semantic representation, and semantic data repository for IoT resources. To show the feasibility of the ISSP, we develop a prototype service for a smart office using the ISSP, which can provide a preset, personalized office environment by interpreting user text input via a smartphone. We also discuss a scenario to show how the ISSP-based method would help build a smart city, where services in each service domain can discover and exploit IoT resources that are wanted across domains. We expect that our method could eventually contribute to providing people in a smart city with more integrated, comprehensive services based on semantic interoperability. PMID:25608216
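
    The semantic discovery step can be approximated as concept-coverage matching against an annotated resource registry; the registry entries, concept URIs and domains below are invented, not the ISSP's actual models.

```python
# Toy registry: IoT resources annotated with ontology concepts and a domain.
REGISTRY = {
    "urn:dev:temp-1":  {"concepts": {"ex:Sensor", "ex:Temperature"}, "domain": "office"},
    "urn:dev:light-1": {"concepts": {"ex:Actuator", "ex:Lighting"},  "domain": "office"},
    "urn:dev:flow-7":  {"concepts": {"ex:Sensor", "ex:Traffic"},     "domain": "transport"},
}

def discover(required_concepts, domain=None):
    """Return resources whose annotations cover all required concepts."""
    return [urn for urn, meta in REGISTRY.items()
            if required_concepts <= meta["concepts"]
            and (domain is None or meta["domain"] == domain)]

# Cross-domain discovery: any sensor, regardless of service domain.
print(discover({"ex:Sensor"}))
# Domain-scoped discovery: only sensors registered to the office domain.
print(discover({"ex:Sensor"}, domain="office"))
```

    Dropping the domain filter is what lets a smart-city service find and exploit resources registered by another service domain, as the scenario in the abstract describes.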

  20. Analyzing Interoperability of Protocols Using Model Checking

    WUPeng

    2005-01-01

    In practical terms, protocol interoperability testing is still laborious and error-prone with little effect, even for those products that have passed conformance testing. Deadlock and unsymmetrical data communication are familiar in interoperability testing, and it is always very hard to trace their causes. Previous work has not provided a coherent way to analyze why interoperability was broken among protocol implementations under test. In this paper, an alternative approach is presented to analyze these problems from the viewpoint of implementation structures. Sequential and concurrent structures are both representative implementation structures, especially in the event-driven development model. Our research mainly discusses the influence of sequential and concurrent structures on interoperability, with two instructive conclusions: (a) a sequential structure may lead to deadlock; (b) a concurrent structure may lead to unsymmetrical data communication. Therefore, implementation structures carry weight on interoperability, which may not have gained much attention before. To some extent, they are decisive in the result of interoperability testing. Moreover, a concurrent structure with a sound task-scheduling strategy may contribute to the interoperability of a protocol implementation. Herein, model checking is introduced into interoperability analysis for the first time. As the paper shows, it is an effective way to validate developers' selections of implementation structures or strategies.
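
    Conclusion (a), that a sequential structure may lead to deadlock, can be illustrated with the core of a model checker: a reachability search over a composed state space looking for non-final states with no successors. The tiny two-entity protocol below is hand-built for illustration, not taken from the paper.

```python
from collections import deque

# Composed states are (A, B); each transition is one entity's step.
TRANSITIONS = {
    ("idle", "idle"): [("wait", "idle")],   # A sends a request, then blocks
    ("wait", "idle"): [("wait", "busy")],   # B receives it, then also blocks
    ("wait", "busy"): [],                   # both wait on each other: stuck
}
FINAL = {("done", "done")}  # legitimate terminal states

def find_deadlocks(start):
    """BFS over reachable states; report non-final states with no successors."""
    seen, queue, deadlocks = {start}, deque([start]), []
    while queue:
        state = queue.popleft()
        succs = TRANSITIONS.get(state, [])
        if not succs and state not in FINAL:
            deadlocks.append(state)
        for nxt in succs:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return deadlocks

print(find_deadlocks(("idle", "idle")))  # [('wait', 'busy')]
```

    A real model checker explores the same kind of graph, only generated from a formal model rather than enumerated by hand, and returns the path to the deadlock as a counterexample trace.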

  1. Interoperability of Web Archives and Digital Libraries

    Kalb, Hendrik; Lazaridou, Paraskevi; Pinsent, Edward;

    2013-01-01

    The interoperability of web archives and digital libraries is crucial to avoid silos of preserved data and content. While various research efforts focus on specific facets of the challenge to interoperate, there is a lack of empirical work about the overall situation of actual challenges. We conduct a...

  2. Model for Trans-sector Digital Interoperability

    Madureira, António; Hartog, den Frank; Silva, Eduardo; Baken, Nico; Zhao, L.; Macaulay, L.

    2009-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks s

  3. Model for Trans-sector Digital Interoperability

    Madureira, A.; Hartog, F.T.H. den; Silva, E.; Baken, N.

    2010-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks s

  4. The Information Systems Interoperability Maturity Model (ISIMM): Towards Standardizing Technical Interoperability and Assessment within Government

    STEFANUS Van Staden

    2012-10-01

    Full Text Available To establish and implement a workable e-Government, all possible and relevant stakeholders’ systems need to be inter-connected in such a way that the hardware, software and data are interoperable. Thus, interoperability is the key to information exchange and sharing among heterogeneous systems. In view of this, the paper introduces the Information Systems Interoperability Maturity Model (ISIMM), which defines the levels and degree of interoperability sophistication that an organisation’s Information Systems will progress through. ISIMM focuses more on detailed technical aspects of interoperability that allow data to be exchanged and shared within an information system environment. In this way, it provides a practical means of assessing technical interoperability between information system pairs, groups or clusters, and it facilitates a model to measure the maturity and compliance levels of interoperable information systems.

  5. Semantic Vector Machines

    Vincent, Etter

    2011-01-01

    We first present our work in machine translation, during which we used aligned sentences to train a neural network to embed n-grams of different languages into a $d$-dimensional space, such that n-grams that are translations of each other are close with respect to some metric. Good n-gram-to-n-gram translation results were achieved, but full-sentence translation is still problematic. We realized that learning the semantics of sentences and documents was the key to solving many natural language processing problems, and thus moved to the second part of our work: sentence compression. We introduce a flexible neural network architecture for learning embeddings of words and sentences that extract their semantics, propose an efficient implementation in the Torch framework and present embedding results comparable to the ones obtained with classical neural language models, while being more powerful.

  6. Innovation in OGC: The Interoperability Program

    George Percivall

    2015-10-01

    Full Text Available The OGC Interoperability Program is a source of innovation in the development of open standards. The approach to innovation is based on hands-on, collaborative engineering leading to more mature standards and implementations. The process of the Interoperability Program engages a community of sponsors and participants based on an economic model that benefits all involved. Each initiative begins with an innovative approach to identify interoperability needs, followed by agile software development to advance the state of technology to the benefit of society. Over eighty initiatives have been conducted in the Interoperability Program since the breakthrough Web Mapping Testbed began the program in 1999. OGC standards that were initiated in the Interoperability Program are the basis of two thirds of the certified compliant products.

  7. Jigsaw Semantics

    Paul J. E. Dekker

    2010-12-01

    Full Text Available In the last decade the enterprise of formal semantics has been under attack from several philosophical and linguistic perspectives, and it has certainly suffered from its own scattered state, which hosts quite a variety of paradigms that may seem to be incompatible. It will not do to try and answer the arguments of the critics, because the arguments are often well taken. The negative conclusions, however, I believe are not. The only adequate reply seems to be a constructive one, which puts several pieces of formal semantics, in particular dynamic semantics, together again. In this paper I will try to sketch an overview of tasks, techniques, and results, which serves to at least suggest that it is possible to develop a coherent overall picture of undeniably important and structural phenomena in the interpretation of natural language. The idea is that the concept of meanings as truth conditions after all provides an excellent start for an integrated study of the meaning and use of natural language, and that an extended notion of goal-directed pragmatics naturally complements this picture. None of the results reported here are really new, but we think it is important to re-collect them.

  8. Semantic web mining

    Stumme, Gerd; Hotho, Andreas; Berendt, Bettina

    2006-01-01

    Semantic Web Mining aims at combining the two fast-developing research areas Semantic Web and Web Mining. This survey analyzes the convergence of trends from both areas: an increasing number of researchers is working on improving the results of Web Mining by exploiting semantic structures in the Web, and they make use of Web Mining techniques for building the Semantic Web. Last but not least, these techniques can be used for mining the Semantic Web itself. The Semantic Web is t...

  9. Inter-Operating Grids Through Delegated MatchMaking

    Alexandru Iosup

    2008-01-01

    Full Text Available The grid vision of a single computing utility has yet to materialize: while many grids with thousands of processors each exist, most work in isolation. An important obstacle for the effective and efficient inter-operation of grids is the problem of resource selection. In this paper we propose a solution to this problem that combines the hierarchical and decentralized approaches for interconnecting grids. In our solution, a hierarchy of grid sites is augmented with peer-to-peer connections between sites under the same administrative control. To operate this architecture, we employ the key concept of delegated matchmaking, which temporarily binds resources from remote sites to the local environment. With trace-based simulations we evaluate our solution under various infrastructural and load conditions, and we show that it outperforms other approaches to inter-operating grids. Specifically, we show that delegated matchmaking achieves up to 60% more goodput and completes 26% more jobs than its best alternative.
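The delegated-matchmaking idea summarized above can be illustrated with a toy scheduler in which a site lacking idle CPUs temporarily binds resources from peer sites into its local pool. This is a minimal sketch for illustration only; the site names, capacities and the greedy borrowing policy are invented here and do not reproduce the paper's actual protocol.

```python
# Toy delegated matchmaking: a site that cannot serve a job locally asks
# peer sites to delegate idle CPUs, which are temporarily bound to the
# local pool. All names and numbers are illustrative.

class Site:
    def __init__(self, name, cpus, peers=()):
        self.name = name
        self.idle = cpus          # idle CPUs owned by this site
        self.borrowed = 0         # CPUs temporarily delegated from elsewhere
        self.peers = list(peers)  # sites reachable for delegation requests

    def request_delegation(self, needed):
        """Greedily borrow up to `needed` CPUs from peers; return CPUs obtained."""
        obtained = 0
        for peer in self.peers:
            take = min(needed - obtained, peer.idle)
            peer.idle -= take      # resource leaves the remote pool...
            self.borrowed += take  # ...and is bound to the local one
            obtained += take
            if obtained == needed:
                break
        return obtained

    def schedule(self, job_cpus):
        """Run a job locally, delegating from remote pools if necessary."""
        if self.idle >= job_cpus:
            self.idle -= job_cpus
            return True
        deficit = job_cpus - self.idle
        if self.request_delegation(deficit) == deficit:
            self.idle = 0
            self.borrowed -= deficit
            return True
        return False  # (sketch keeps any partial delegation; a real system would return it)
```

For example, a site with 2 idle CPUs and a peer with 8 can still schedule a 5-CPU job by borrowing the 3-CPU deficit from the peer.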

  10. Efficient semantic-based IoT service discovery mechanism for dynamic environments

    Ben Fredj, Sameh; Boussard, Mathieu; Kofman, Daniel; Noirie, Ludovic

    2014-01-01

The adoption of Service Oriented Architecture (SOA) and Semantic Web technologies in the Internet of Things (IoT) makes it possible to enhance the interoperability of devices by abstracting their capabilities as services and enriching their descriptions with machine-interpretable semantics. This facilitates the discovery and composition of IoT services. The increasing number of IoT services, their dynamicity and their geographical distribution call for mechanisms to enable scalable and effecti...

  11. INFRAWEBS Semantic Web Service Development on the Base of Knowledge Management Layer

    Nern, Joachim; Agre, Gennady; Atanasova, Tatiana; Marinova, Zlatina; Micsik, András; Kovács, László; Saarela, Janne; Westkaemper, Timo

    2006-01-01

The paper gives an overview of the ongoing FP6-IST INFRAWEBS project and describes the main layers and software components embedded in an application-oriented realisation framework. An important part of INFRAWEBS is the Semantic Web Unit (SWU) – a collaboration platform and interoperable middleware for ontology-based handling and maintenance of SWS. The framework provides knowledge about a specific domain and relies on ontologies to structure and exchange this knowledge to semant...

  12. Semantic-Driven e-Government: Application of Uschold and King Ontology Building Methodology for Semantic Ontology Models Development

    Fonou-Dombeu, Jean Vincent; 10.5121/ijwest.2011.2401

    2011-01-01

    Electronic government (e-government) has been one of the most active areas of ontology development during the past six years. In e-government, ontologies are being used to describe and specify e-government services (e-services) because they enable easy composition, matching, mapping and merging of various e-government services. More importantly, they also facilitate the semantic integration and interoperability of e-government services. However, it is still unclear in the current literature how an existing ontology building methodology can be applied to develop semantic ontology models in a government service domain. In this paper the Uschold and King ontology building methodology is applied to develop semantic ontology models in a government service domain. Firstly, the Uschold and King methodology is presented, discussed and applied to build a government domain ontology. Secondly, the domain ontology is evaluated for semantic consistency using its semi-formal representation in Description Logic. Thirdly, an...

  13. Interoperable Solar Data and Metadata via LISIRD 3

    Wilson, A.; Lindholm, D. M.; Pankratz, C. K.; Snow, M. A.; Woods, T. N.

    2015-12-01

    LISIRD 3 is a major upgrade of the LASP Interactive Solar Irradiance Data Center (LISIRD), which serves several dozen space based solar irradiance and related data products to the public. Through interactive plots, LISIRD 3 provides data browsing supported by data subsetting and aggregation. Incorporating a semantically enabled metadata repository, LISIRD 3 users see current, vetted, consistent information about the datasets offered. Users can now also search for datasets based on metadata fields such as dataset type and/or spectral or temporal range. This semantic database enables metadata browsing, so users can discover the relationships between datasets, instruments, spacecraft, mission and PI. The database also enables creation and publication of metadata records in a variety of formats, such as SPASE or ISO, making these datasets more discoverable. The database also enables the possibility of a public SPARQL endpoint, making the metadata browsable in an automated fashion. LISIRD 3's data access middleware, LaTiS, provides dynamic, on demand reformatting of data and timestamps, subsetting and aggregation, and other server side functionality via a RESTful OPeNDAP compliant API, enabling interoperability between LASP datasets and many common tools. LISIRD 3's templated front end design, coupled with the uniform data interface offered by LaTiS, allows easy integration of new datasets. Consequently the number and variety of datasets offered by LISIRD has grown to encompass several dozen, with many more to come. This poster will discuss design and implementation of LISIRD 3, including tools used, capabilities enabled, and issues encountered.
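As a toy illustration of the kind of cross-linked metadata query such a semantic repository enables (dataset to instrument to spacecraft), consider a minimal in-memory triple store. The vocabulary, dataset and instrument names below are invented for the example and are not LISIRD's actual schema.

```python
# Minimal triple store: metadata triples link datasets to instruments and
# instruments to spacecraft, so a query can traverse the relationships.
# All identifiers are hypothetical.

triples = {
    ("sorce_tsi", "type", "irradiance_dataset"),
    ("sorce_tsi", "measured_by", "TIM"),
    ("TIM", "flies_on", "SORCE"),
    ("sorce_ssi", "type", "irradiance_dataset"),
    ("sorce_ssi", "measured_by", "SIM"),
    ("SIM", "flies_on", "SORCE"),
}

def objects(subject, predicate):
    """All objects o such that (subject, predicate, o) is asserted."""
    return {o for s, p, o in triples if s == subject and p == predicate}

def datasets_on_spacecraft(craft):
    """Follow dataset -> instrument -> spacecraft links."""
    return sorted(
        s for s, p, o in triples
        if p == "measured_by" and craft in objects(o, "flies_on")
    )
```

A SPARQL endpoint over the real metadata would answer the same shape of question with a two-triple-pattern query instead of the hand-rolled joins above.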

  14. Semantic Web

    Hall, Wendy; O'Hara, Kieron

    2009-01-01

    The Semantic Web is a proposed extension to the World Wide Web (WWW) that aims to provide a common framework for sharing and reusing data across applications. The most common interfaces to the World Wide Web present it as a Web of Documents, linked in various ways including hyperlinks. But from the data point of view, each document is a black box – the data are not given independently of their representation in the document. This reduces its power, and also (as most information needs to be ex...

  15. Provenance in Data Interoperability for Multi-Sensor Intercomparison

    Lynnes, Chris; Leptoukh, Greg; Berrick, Steve; Shen, Suhung; Prados, Ana; Fox, Peter; Yang, Wenli; Min, Min; Holloway, Dan; Enloe, Yonsook

    2008-01-01

As our inventory of Earth science data sets grows, the ability to compare, merge and fuse multiple datasets grows in importance. This requires a deeper data interoperability than we have now. Efforts such as the Open Geospatial Consortium and OPeNDAP (Open-source Project for a Network Data Access Protocol) have broken down format barriers to interoperability; the next challenge is the semantic aspects of the data. Consider the issues when satellite data are merged, cross-calibrated, validated, inter-compared and fused. We must match up data sets that are related, yet different in significant ways: the phenomenon being measured, measurement technique, location in space-time or quality of the measurements. If subtle distinctions between similar measurements are not clear to the user, results can be meaningless or lead to an incorrect interpretation of the data. Most of these distinctions trace to how the data came to be: sensors, processing and quality assessment. For example, monthly averages of satellite-based aerosol measurements often show significant discrepancies, which might be due to differences in spatio-temporal aggregation, sampling issues, sensor biases, algorithm differences or calibration issues. Provenance information must be captured in a semantic framework that allows data inter-use tools to incorporate it and aid in the interpretation of comparison or merged products. Semantic web technology allows us to encode our knowledge of measurement characteristics, phenomena measured, space-time representation, and data quality attributes in a well-structured, machine-readable ontology and rulesets. An analysis tool can use this knowledge to show users the provenance-related distinctions between two variables, advising on options for further data processing and analysis. An additional problem for workflows distributed across heterogeneous systems is retrieval and transport of provenance. Provenance may be either embedded within the data payload, or transmitted

  16. Provenance in Data Interoperability for Multi-Sensor Intercomparison

    Lynnes, C.; Leptoukh, G.; Berrick, S.; Shen, S.; Prados, A.; Fox, P.; Yang, W.; Min, M.; Holloway, D.; Enloe, Y.

    2008-12-01

As our inventory of Earth science data sets grows, the ability to compare, merge and fuse multiple datasets grows in importance. This implies a need for deeper data interoperability than we have now. Many efforts (e.g. OPeNDAP, Open Geospatial Consortium) have broken down format barriers to interoperability; the next challenge is the semantic aspects of the data. Consider the issues when satellite data are merged, cross-calibrated, validated, inter-compared and fused. We must determine how to match up data sets that are related, yet different in significant ways: the exact nature of the phenomenon being measured, measurement technique, exact location in space-time, or the quality of the measurements. If subtle distinctions between similar measurements are not clear to the user, the results can be meaningless or even lead to an incorrect interpretation of the data. Most of these distinctions trace back to how the data came to be: sensors, processing, and quality assessment. For example, monthly averages of satellite-based aerosol measurements often show significant discrepancies, which might be due to differences in spatio-temporal aggregation, sampling issues, sensor biases, algorithm differences and/or calibration issues. This provenance information must therefore be captured in a semantic framework that allows sophisticated data inter-use tools to incorporate it, and eventually aid in the interpretation of comparison or merged products. Semantic web technology allows us to encode our knowledge of measurement characteristics, phenomena measured, space-time representations, and data quality representation in a well-structured, machine-readable ontology and rulesets. An analysis tool can use this knowledge to show users the provenance-related distinctions between two variables, advising on options for further data processing and analysis. An additional problem for workflows distributed across heterogeneous systems is retrieval and transport of provenance

  17. Political, policy and social barriers to health system interoperability: emerging opportunities of Web 2.0 and 3.0.

    Juzwishin, Donald W M

    2009-01-01

Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendancy of Web 2.0 and 3.0, although still unproven, signals the opportunity to accelerate patients' access to health information and their health record. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited. PMID:20166516

  18. ICSE 2009 Tutorial - Semantic Web Technologies in Software Engineering

Gall, H.C.; Reif, G.

    2009-01-01

Over the years, the software engineering community has developed various tools to support the specification, development, and maintenance of software. Many of these tools use proprietary data formats to store artifacts, which hampers interoperability. The Semantic Web, on the other hand, provides a common framework that allows data to be shared and reused across application, enterprise, and community boundaries. Ontologies are used to define the concepts in the domain of discourse and their rel...

  19. Before you make the data interoperable you have to make the people interoperable

    Jackson, I.

    2008-12-01

    In February 2006 a deceptively simple concept was put forward. Could we use the International Year of Planet Earth 2008 as a stimulus to begin the creation of a digital geological map of the planet at a target scale of 1:1 million? Could we design and initiate a project that uniquely mobilises geological surveys around the world to act as the drivers and sustainable data providers of this global dataset? Further, could we synergistically use this geoscientist-friendly vehicle of creating a tangible geological map to accelerate progress of an emerging global geoscience data model and interchange standard? Finally, could we use the project to transfer know-how to developing countries and reduce the length and expense of their learning curve, while at the same time producing geoscience maps and data that could attract interest and investment? These aspirations, plus the chance to generate a global digital geological dataset to assist in the understanding of global environmental problems and the opportunity to raise the profile of geoscience as part of IYPE seemed more than enough reasons to take the proposal to the next stage. In March 2007, in Brighton, UK, 81 delegates from 43 countries gathered together to consider the creation of this global interoperable geological map dataset. The participants unanimously agreed the Brighton "Accord" and kicked off "OneGeology", an initiative that now has the support of more than 85 nations. Brighton was never designed to be a scientific or technical meeting: it was overtly about people and their interaction - would these delegates, with their diverse cultural and technical backgrounds, be prepared to work together to achieve something which, while technically challenging, was not complex in the context of leading edge geoscience informatics. Could we scale up what is a simple informatics model at national level, to deliver global coverage and access? The major challenges for OneGeology (and the deployment of interoperability

  20. Benefit quantification of interoperability in coordinate metrology

    Savio, E.; Carmignato, S.; De Chiffre, Leonardo

    2014-01-01

One of the factors contributing to limited reproducibility of coordinate measurements is the use of different inspection software. Time-consuming efforts for translation of part programmes are sometimes needed, and interoperability of inspection equipment has the potential to reduce these inefficiencies. The paper presents a methodology for an economic evaluation of interoperability benefits with respect to the verification of geometrical product specifications. It requires input data from testing and inspection activities, as well as information on training of personnel and licensing of software. The model is illustrated using an automotive case study and the related assessment of an investment in interoperability. © 2014 CIRP.

  1. Semantic-Driven e-Government: Application of Uschold and King Ontology Building Methodology for Semantic Ontology Models Development

    Jean Vincent Fonou-Dombeu; Magda Huisman

    2011-01-01

Electronic government (e-government) has been one of the most active areas of ontology development during the past six years. In e-government, ontologies are being used to describe and specify e-government services (e-services) because they enable easy composition, matching, mapping and merging of various e-government services. More importantly, they also facilitate the semantic integration and interoperability of e-government services. However, it is still unclear in the current literature how an...

  2. From BPMN 2.0 to the Setting-Up on an ESB - Application to an Interoperability Problem

    Lemrabet, Y.; Clin, D.; Bigand, M.; Bourey, J. -P.

    2010-01-01

To solve interoperability problems at the semantic level, we propose to contribute to the orchestration of business processes by implementing a mediation layer based on an Enterprise Service Bus (ESB). We show how to take advantage of the forthcoming version of the Business Process Modeling Notation 2.0 (BPMN 2.0) of the Object Management Group (OMG) within the framework of a Service Oriented Architecture (SOA) development. This new version of BPMN is characterized by the addition of the notion of private/publ...

  3. Semantic Web Service Framework to Intelligent Distributed Manufacturing

    Kulvatunyou, Boonserm [ORNL

    2005-12-01

As markets become unexpectedly turbulent, with shortened product life cycles and a power shift towards buyers, the need for methods to develop products, production facilities, and supporting software rapidly and cost-effectively is becoming urgent. The use of a loosely integrated virtual enterprise based framework holds the potential of surviving changing market needs. However, its success requires reliable and large-scale interoperation among trading partners via a semantic web of trading partners' services whose properties, capabilities, and interfaces are encoded in an unambiguous as well as computer-understandable form. This paper demonstrates a promising approach to integration and interoperation between a design house and a manufacturer that may or may not have a prior relationship by developing semantic web services for business and engineering transactions. To this end, detailed activity and information flow diagrams are developed, in which the two trading partners exchange messages and documents. The properties and capabilities of the manufacturer sites are defined using the DARPA Agent Markup Language (DAML) ontology definition language. The prototype development of semantic webs shows that enterprises can interoperate widely in an unambiguous and autonomous manner. This contributes towards the realization of virtual enterprises at a low cost.
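The capability-based partner discovery described above can be sketched as matching a design house's requirements against machine-readable manufacturer capability records. The records and fields below are invented for illustration and are far simpler than the DAML service descriptions the paper develops.

```python
# Sketch of capability matching between a design house and manufacturers.
# Each record declares processes, materials and batch capacity; a request
# matches when all three are covered. All data are hypothetical.

manufacturers = [
    {"name": "AlphaFab", "processes": {"milling", "turning"},
     "max_batch": 500, "materials": {"aluminium", "steel"}},
    {"name": "BetaWorks", "processes": {"injection_molding"},
     "max_batch": 10000, "materials": {"abs", "nylon"}},
]

def find_partners(process, material, batch):
    """Return names of manufacturers whose declared capabilities cover the request."""
    return [m["name"] for m in manufacturers
            if process in m["processes"]
            and material in m["materials"]
            and batch <= m["max_batch"]]
```

In the semantic-web setting the same matching is done over ontology concepts rather than string literals, so a request for "machining" could also match a site that only declares the narrower concept "milling".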

  4. Enhancing the Interoperability of Multimedia Learning Objects Based on the Ontology Mapping

    Jihad Chaker

    2014-09-01

Full Text Available This article addresses the interoperability between semantic learning platforms and educational resource banks, more precisely between the LOM and MPEG-7 standards. LOM is a set of metadata associated with e-learning content, while MPEG-7 is a standard for describing multimedia content. The use of educational resources has become an essential component of meeting learning needs. Given the multimedia nature of these resources, such use causes problems in the interoperability of multimedia learning objects in e-learning environments and in the indexing and retrieval of digital resources. Faced with these problems, we propose a new approach for multimedia learning objects that uses ontology mapping between the LOM and MPEG-7 ontologies.
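A field-level mapping between the two standards can be sketched as a translation table from LOM elements to MPEG-7 descriptor paths. The correspondences below are simplified assumptions for illustration, not the actual ontology mapping developed in the article.

```python
# Illustrative LOM -> MPEG-7 field mapping (hypothetical correspondences).
LOM_TO_MPEG7 = {
    "general.title":       "CreationInformation.Creation.Title",
    "general.description": "CreationInformation.Creation.Abstract",
    "technical.format":    "MediaFormat.Content",
    "technical.duration":  "MediaTime.MediaDuration",
}

def translate(lom_record):
    """Re-key a flat LOM record into MPEG-7 descriptor paths, keeping
    only the fields the mapping covers."""
    return {LOM_TO_MPEG7[k]: v for k, v in lom_record.items()
            if k in LOM_TO_MPEG7}
```

An ontology-based mapping goes further than such a static table: it can exploit subclass and equivalence relations so that fields with no direct counterpart are still aligned at a more general concept.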

  5. Convergence of Health Level Seven Version 2 Messages to Semantic Web Technologies for Software-Intensive Systems in Telemedicine Trauma Care

    Menezes, Pedro Monteiro; Cook, Timothy Wayne; Cavalini, Luciana Tricai

    2016-01-01

    Objectives To present the technical background and the development of a procedure that enriches the semantics of Health Level Seven version 2 (HL7v2) messages for software-intensive systems in telemedicine trauma care. Methods This study followed a multilevel model-driven approach for the development of semantically interoperable health information systems. The Pre-Hospital Trauma Life Support (PHTLS) ABCDE protocol was adopted as the use case. A prototype application embedded the semantics i...

  6. Interoperability of CAD Standards and Robotics in CIME

    Sørensen, Torben

The research presented in this dissertation concerns the identification of problems and provision of solutions for increasing the degree of interoperability between CAD, CACSD (Computer Aided Control Systems Design) and CAR (Computer Aided Robotics) in Computer Integrated Manufacturing and Engineering (CIME). It comprehends the following work:
· The development of a STEP based interface for general control system data and functions, especially related to robot motion control, for interoperability of CAD, CACSD, and CAR systems and for the extension of the inter-system communication capabilities beyond the stage achieved up to now.
· The definition of the concepts of 'information' and 'information model', and the selection of a proper information modeling methodology within the STEP methodologies.
· The elaboration of a general function model of a generic robot motion controller in IDEF0 for interface...

  7. River Basin Standards Interoperability Pilot

    Pesquer, Lluís; Masó, Joan; Stasch, Christoph

    2016-04-01

There is a lot of water information and many tools in Europe applicable to river basin management, but fragmentation and a lack of coordination between countries still exist. The European Commission and the member states have financed several research and innovation projects in support of the Water Framework Directive. Only a few of them use the recently emerging hydrological standards, such as OGC WaterML 2.0. WaterInnEU is a Horizon 2020 project focused on creating a marketplace to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to water, and to establish suitable conditions for new market opportunities based on these offerings. One of WaterInnEU's main goals is to assess the level of standardization and interoperability of these outcomes as a mechanism to integrate ICT-based tools, incorporate open data platforms and generate a palette of interchangeable components able to use the water data emerging from the recently proposed open data sharing processes and data models stimulated by initiatives such as the INSPIRE directive. As part of the standardization and interoperability activities in the project, the authors are designing an experiment (RIBASE, the present work) to demonstrate how current ICT-based tools and water data can work in combination with geospatial web services in the Scheldt river basin. The main structure of this experiment, which is the core of the present work, comprises the following steps:
- Extraction of information from river gauge data in OGC WaterML 2.0 format using SOS services (preferably compliant with the OGC SOS 2.0 Hydrology Profile Best Practice).
- Modelling of floods using a WPS 2.0, with WaterML 2.0 data and weather forecast models as input.
- Evaluation of the applicability of Sensor Notification Services in water emergencies.
- Open distribution of the input and output data as OGC web services (WaterML, WCS, WFS) and with visualization utilities (WMS).
The architecture
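The gauge-extraction step of such a workflow amounts to pulling time-series values out of a WaterML 2.0-style observation document. The XML below is a heavily reduced, hand-made fragment, not a valid WaterML 2.0 instance; it only borrows the namespace to show the parsing pattern.

```python
# Parse (timestamp, level) pairs from a simplified WaterML 2.0-like
# document. The fragment is illustrative, not schema-valid WaterML.
import xml.etree.ElementTree as ET

DOC = """
<wml2:Collection xmlns:wml2="http://www.opengis.net/waterml/2.0">
  <wml2:point><wml2:time>2016-01-01T00:00:00Z</wml2:time>
    <wml2:value>12.4</wml2:value></wml2:point>
  <wml2:point><wml2:time>2016-01-01T01:00:00Z</wml2:time>
    <wml2:value>13.1</wml2:value></wml2:point>
</wml2:Collection>
"""

NS = {"wml2": "http://www.opengis.net/waterml/2.0"}

def gauge_readings(xml_text):
    """Return (timestamp, water level) pairs from the reduced document."""
    root = ET.fromstring(xml_text)
    return [(p.find("wml2:time", NS).text,
             float(p.find("wml2:value", NS).text))
            for p in root.findall("wml2:point", NS)]
```

A real client would obtain the document from an SOS GetObservation request and navigate the full MeasurementTimeseries structure, but the namespace-aware traversal is the same.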

  8. Provenance-Based Approaches to Semantic Web Service Discovery and Usage

    Narock, Thomas William

    2012-01-01

    The World Wide Web Consortium defines a Web Service as "a software system designed to support interoperable machine-to-machine interaction over a network." Web Services have become increasingly important both within and across organizational boundaries. With the recent advent of the Semantic Web, web services have evolved into semantic…

  9. Interoperability for Entreprise Systems and Applications '12

    Doumeingts, Guy; Katzy, Bernhard; Chalmeta, Ricardo

    2012-01-01

Within a scenario of globalised markets, where the capacity to efficiently cooperate with other firms starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, it can be seen how the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected organisations or extended enterprises, as well as in mergers and acquisitions. Composed of over 40 papers, Enterprise Interoperability V ranges from academic research through case studies to industrial and administrative experience of interoperability. The international nature of the authorship continues to broaden. Many of the papers have examples and illustrations calculated to deepen understanding and generate new ideas. The I-ESA'12 Co...

  10. Requirements for Interoperability in Healthcare Information Systems

    Rita Noumeir

    2012-01-01

Full Text Available Interoperability is a requirement for the successful deployment of Electronic Health Records (EHR). EHR improves the quality of healthcare by enabling access to all relevant information at the diagnostic decision moment, regardless of location. It is a system that results from the cooperation of several heterogeneous distributed subsystems that need to successfully exchange information relative to a specific healthcare process. This paper analyzes interoperability impediments in healthcare by first defining them and providing concrete healthcare examples, followed by discussion of how specifications can be defined and how verification can be conducted to eliminate those impediments and ensure interoperability in healthcare. This paper also analyzes how Integrating the Healthcare Enterprise (IHE) has been successful in enabling interoperability, and identifies some neglected aspects that need attention.

  11. Scalability and interoperability within glideinWMS

    Physicists have access to thousands of CPUs in grid federations such as OSG and EGEE. With the start-up of the LHC, it is essential for individuals or groups of users to wrap together available resources from multiple sites across multiple grids under a higher user-controlled layer in order to provide a homogeneous pool of available resources. One such system is glideinWMS, which is based on the Condor batch system. A general discussion of glideinWMS can be found elsewhere. Here, we focus on recent advances in extending its reach: scalability and integration of heterogeneous compute elements. We demonstrate that the new developments exceed the design goal of over 10,000 simultaneous running jobs under a single Condor schedd, using strong security protocols across global networks, and sustaining a steady-state job completion rate of a few Hz. We also show interoperability across heterogeneous computing elements achieved using client-side methods. We discuss this technique and the challenges in direct access to NorduGrid and CREAM compute elements, in addition to Globus based systems.

  12. Scalability and interoperability within glideinWMS

    Bradley, D.; /Wisconsin U., Madison; Sfiligoi, I.; /Fermilab; Padhi, S.; /UC, San Diego; Frey, J.; /Wisconsin U., Madison; Tannenbaum, T.; /Wisconsin U., Madison

    2010-01-01

    Physicists have access to thousands of CPUs in grid federations such as OSG and EGEE. With the start-up of the LHC, it is essential for individuals or groups of users to wrap together available resources from multiple sites across multiple grids under a higher user-controlled layer in order to provide a homogeneous pool of available resources. One such system is glideinWMS, which is based on the Condor batch system. A general discussion of glideinWMS can be found elsewhere. Here, we focus on recent advances in extending its reach: scalability and integration of heterogeneous compute elements. We demonstrate that the new developments exceed the design goal of over 10,000 simultaneous running jobs under a single Condor schedd, using strong security protocols across global networks, and sustaining a steady-state job completion rate of a few Hz. We also show interoperability across heterogeneous computing elements achieved using client-side methods. We discuss this technique and the challenges in direct access to NorduGrid and CREAM compute elements, in addition to Globus based systems.

  13. GEOSS interoperability for Weather, Ocean and Water

    Richardson, David; Nyenhuis, Michael; Zsoter, Ervin; Pappenberger, Florian

    2013-04-01

"Understanding the Earth system — its weather, climate, oceans, atmosphere, water, land, geodynamics, natural resources, ecosystems, and natural and human-induced hazards — is crucial to enhancing human health, safety and welfare, alleviating human suffering including poverty, protecting the global environment, reducing disaster losses, and achieving sustainable development. Observations of the Earth system constitute critical input for advancing this understanding." With this in mind, the Group on Earth Observations (GEO) started implementing the Global Earth Observation System of Systems (GEOSS). GEOWOW, short for "GEOSS interoperability for Weather, Ocean and Water", is supporting this objective. GEOWOW's main challenge is to improve Earth observation data discovery, accessibility and exploitability, and to evolve GEOSS in terms of interoperability, standardization and functionality. One of the main goals behind the GEOWOW project is to demonstrate the value of the TIGGE archive in interdisciplinary applications, providing a vast amount of useful and easily accessible information to the users through the GEO Common Infrastructure (GCI). GEOWOW aims at developing functionalities that will allow easy discovery, access and use of TIGGE archive data and of in-situ observations, e.g. from the Global Runoff Data Centre (GRDC), to support applications such as river discharge forecasting. TIGGE (THORPEX Interactive Grand Global Ensemble) is a key component of THORPEX: a World Weather Research Programme to accelerate improvements in the accuracy of 1-day to 2-week high-impact weather forecasts for the benefit of humanity. The TIGGE archive consists of ensemble weather forecast data from ten global NWP centres, starting from October 2006, which has been made available for scientific research. The TIGGE archive has been used to analyse hydro-meteorological forecasts of flooding in Europe as well as in China. In general the analysis has been favourable in terms of

  14. Intercloud Architecture Framework for Interoperability and Integration

    Demchenko, Y; Ngo, C.; Makkes, M.X.; Strijkers, R.J.

    2013-01-01

    This report presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses interoperability and integration issues in multi-provider multi-domain heterogeneous Cloud based infrastructure services and applications provisioning, including integration and interoperability with the legacy infrastructure services. Cloud technologies are evolving as a common way of infrastructure services and resources virtualisation and provisioning on-demand. In this way, they b...

  15. The DFG Viewer for Interoperability in Germany

    2010-01-01

    This article deals with the DFG Viewer for Interoperability, a free and open source web-based viewer for digitised books, and assesses its relevance for interoperability in Germany. First the specific situation in Germany is described, including the important role of the Deutsche Forschungsgemeinschaft (German Research Foundation). The article then moves on to the overall concept of the viewer and its technical background. It introduces the data formats and standards used, it briefly illustra...

  16. Towards an Excellence Framework for Business Interoperability

    Legner, Christine; Wende, Kristin

    2006-01-01

    Organisations that wish to establish IT-supported business relationships with business partners face major challenges, among them the need for creating a win-win-situation and the effort to align business processes and link up information systems across company borders. Whereas interoperability has been widely discussed in a technical context, it has not (yet) been explored how interoperability relates to the business strategy and organisational design of the business relationship. This pap...

  17. Interoperability and Standardization of Intercloud Cloud Computing

    Wang, Jingxin K.; Ding, Jianrui; Niu, Tian

    2012-01-01

    Cloud computing is getting mature, but the interoperability and standardization of clouds are still waiting to be solved. This paper discussed the interoperability among clouds regarding message transmission, data transmission and virtual machine transfer. Starting from the IEEE Pioneering Cloud Computing Initiative, this paper discussed the standardization of cloud computing, especially intercloud cloud computing. This paper also discussed standardization from a market-oriented view.

  18. Diabetes Device Interoperability for Improved Diabetes Management

    Silk, Alain D.

    2015-01-01

    Scientific and technological advancements have led to the increasing availability and use of sophisticated devices for diabetes management, with corresponding improvements in public health. These devices are often capable of sharing data with a few other specific devices but are generally not broadly interoperable; they cannot work together with a wide variety of other devices. As a result of limited interoperability, benefits of modern diabetes devices and potential for development of innova...

  19. Data interoperability software solution for emergency reaction in the Europe Union

    Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.

    2015-07-01

    Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision making slower and more difficult. However, the spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite improvements made in recent years, EMSs have still not solved the problems related to cultural, semantic and linguistic differences, which are the real cause of slower decision making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved by any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction), providing a flexible, extensible solution to the mediation issues. Web services have been adopted as the technology to implement this paradigm, as they have the most significant academic and industrial visibility and attraction.
Contributions of this work have been validated through the design and development of a cross-border realistic prototype scenario, actively involving both emergency managers and emergency
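    A minimal sketch of the kind of ontology-based mediation the abstract describes: two EMS vocabularies are mapped onto a shared term (as EMERGEL's common ontology does), so that messages can be translated between systems. All terms and mappings below are invented for illustration; they are not taken from the EMERGEL ontology itself.

```python
# Hypothetical mediation via a shared vocabulary, in the DISASTER style.
# Each local EMS term is mapped to a common concept; translation between
# two systems goes local -> common -> local.

TO_COMMON = {
    "ems_a": {"Feuerwehr": "fire-brigade", "Unfall": "accident"},
    "ems_b": {"bomberos": "fire-brigade", "accidente": "accident"},
}

# Invert the mappings so we can go from the common concept back to
# each system's local term.
FROM_COMMON = {
    system: {common: local for local, common in mapping.items()}
    for system, mapping in TO_COMMON.items()
}

def translate(term, source, target):
    """Translate a local term from one EMS vocabulary to another
    via the shared concept it is mapped to."""
    common = TO_COMMON[source][term]
    return FROM_COMMON[target][common]

print(translate("Feuerwehr", "ems_a", "ems_b"))  # bomberos
```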

  20. A flexible integration framework for a Semantic Geospatial Web application

    Yuan, Ying; Mei, Kun; Bian, Fuling

    2008-10-01

    With the growth of World Wide Web technologies, the access to and use of geospatial information has changed radically in the past decade. Previously, the data processed by a GIS as well as its methods resided locally and contained information that was sufficiently unambiguous in the respective information community. Now, both data and methods may be retrieved and combined from anywhere in the world, escaping their local contexts. The last few years have seen a growing interest in the field of the semantic geospatial web. With the development of semantic web technologies, we have seen the possibility of solving the heterogeneity/interoperation problem in the GIS community. A semantic geospatial web application can support a wide variety of tasks including data integration, interoperability, knowledge reuse, spatial reasoning and many others. This paper proposes a flexible framework called GeoSWF (short for Geospatial Semantic Web Framework), which supports the semantic integration of distributed and heterogeneous geospatial information resources as well as semantic query and spatial relationship reasoning. We design the architecture of GeoSWF by extending the MVC pattern. GeoSWF uses the geo-2007.owl ontology proposed by the W3C as the reference ontology for geospatial information and designs different application ontologies according to the situation of the heterogeneous geospatial information resources. A Geospatial Ontology Creating Algorithm (GOCA) is designed to convert geospatial information into ontology instances represented in RDF/OWL. On top of these ontology instances, GeoSWF carries out semantic reasoning using the rule set stored in the knowledge base to generate new system queries. Query results are ranked by the Euclidean distance of each ontology instance. Finally, the paper presents conclusions and future work.
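    The distance-based ranking step the abstract mentions can be sketched in a few lines: ontology instances carrying a location are ordered by their Euclidean distance from the query point. The instance URIs and field names below are illustrative assumptions, not GeoSWF's actual data model.

```python
import math

def euclidean(p, q):
    """Straight-line distance between two (x, y) points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def rank_instances(instances, query_point):
    """Sort ontology instances (dicts with a 'location' pair) by
    increasing Euclidean distance from the query point."""
    return sorted(instances, key=lambda inst: euclidean(inst["location"], query_point))

# Hypothetical query results, each an ontology instance with a location.
instances = [
    {"uri": "ex:SchoolA", "location": (3.0, 4.0)},
    {"uri": "ex:SchoolB", "location": (1.0, 1.0)},
    {"uri": "ex:SchoolC", "location": (6.0, 8.0)},
]

ranked = rank_instances(instances, (0.0, 0.0))
print([inst["uri"] for inst in ranked])  # nearest instance first
```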

  1. On MDA-SOA based Intercloud Interoperability framework

    Tahereh Nodehi

    2013-01-01

    Cloud computing has been one of the latest technologies which assures reliable delivery of on-demand computing services over the Internet. Cloud service providers have established geographically distributed data centers and computing resources, which are available online as services. Clouds operated by different service providers working together in collaboration can open up many more spaces for innovative scenarios with huge amounts of resources provisioned on demand. However, current cloud systems do not support intercloud interoperability. This paper is thus motivated to address intercloud interoperability by analyzing different methodologies that have been applied to resolve various scenarios of interoperability. Model Driven Architecture (MDA) and Service Oriented Architecture (SOA) methods have been used to address interoperability in various scenarios, which also opens up space to address intercloud interoperability by making use of these well-accepted methodologies. The focus of this document is to show that intercloud interoperability can be supported through a model-driven approach and service-oriented systems. Moreover, the current state of the art in intercloud computing and the concept and benefits of MDA and SOA are discussed in the paper. At the same time, this paper also proposes a generic architecture for an MDA-SOA based framework, which can be useful for developing applications which will require intercloud interoperability. The paper justifies the usability of the framework by a use-case scenario for dynamic workload migration among heterogeneous clouds.

  2. Towards Interoperable Preservation Repositories: TIPR

    Priscilla Caplan

    2010-07-01

    Towards Interoperable Preservation Repositories (TIPR) is a project funded by the Institute of Museum and Library Services to create and test a Repository eXchange Package (RXP). The package will make it possible to transfer complex digital objects between dissimilar preservation repositories. For reasons of redundancy, succession planning and software migration, repositories must be able to exchange copies of archival information packages with each other. Every repository application, however, describes and structures its archival packages differently. Therefore each system produces dissemination packages that are rarely understandable or usable as submission packages by other repositories. The RXP is an answer to that mismatch. Other solutions for transferring packages between repositories focus either on transfers between repositories of the same type, such as DSpace-to-DSpace transfers, or on processes that rely on central translation services. Rather than build translators between many dissimilar repository types, the TIPR project has defined a standards-based package of metadata files that can act as an intermediary information package, the RXP, a lingua franca all repositories can read and write.

  3. Challenges of Interoperability Using HL7 v3 in Czech Healthcare

    Nagy, Miroslav; Přečková, Petra; Seidl, Libor; Zvárová, Jana

    Amsterdam: IOS Press, 2010 - (Blobel, B.; Hvannberg, E.; Gunnarsdóttir, V.), s. 122-128. (Studies in Health Technology and Informatics. 155). ISBN 978-1-60750-562-4. ISSN 0926-9630. [EFMI Special Topic Conference. Reykjavik (IS), 02.06.2010-04.06.2010] R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : shared healthcare * electronic health record * semantic interoperability * classification systems * communication standards Subject RIV: IN - Informatics, Computer Science

  4. A Semantic Web Blackboard System

    McKenzie, Craig; Preece, Alun; Gray, Peter

    In this paper, we propose a Blackboard Architecture as a means for coordinating hybrid reasoning over the Semantic Web. We describe the components of traditional blackboard systems (Knowledge Sources, Blackboard, Controller) and then explain how we have enhanced these by incorporating some of the principles of the Semantic Web to produce our Semantic Web Blackboard. Much of the framework is already in place to facilitate our research: the communication protocol (HTTP); the data representation medium (RDF); a rich expressive description language (OWL); and a method of writing rules (SWRL). We further enhance this by adding our own constraint-based formalism (CIF/SWRL) into the mix. We provide an example walk-through of our test-bed system, the AKTive Workgroup Builder and Blackboard (AWB+B), illustrating the interaction and cooperation of the Knowledge Sources and providing some context as to how the solution is achieved. We conclude with the strengths and weaknesses of the architecture.
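    The three components the abstract names (Knowledge Sources, Blackboard, Controller) can be sketched as a minimal blackboard system: sources contribute facts when their trigger is present, and the controller loops until no source can fire. The class names and facts below are illustrative, not taken from the AWB+B system.

```python
class Blackboard:
    """Shared store of partial solutions (here: a simple set of facts)."""
    def __init__(self):
        self.facts = set()

class KnowledgeSource:
    """Contributes a new fact when its trigger fact is on the blackboard."""
    def __init__(self, trigger, conclusion):
        self.trigger, self.conclusion = trigger, conclusion

    def can_contribute(self, blackboard):
        return (self.trigger in blackboard.facts
                and self.conclusion not in blackboard.facts)

    def contribute(self, blackboard):
        blackboard.facts.add(self.conclusion)

class Controller:
    """Repeatedly activates any knowledge source that can fire,
    until the blackboard reaches a fixed point."""
    def __init__(self, blackboard, sources):
        self.blackboard, self.sources = blackboard, sources

    def run(self):
        progress = True
        while progress:
            progress = False
            for ks in self.sources:
                if ks.can_contribute(self.blackboard):
                    ks.contribute(self.blackboard)
                    progress = True

bb = Blackboard()
bb.facts.add("alice:attends-workshop")
sources = [
    KnowledgeSource("alice:attends-workshop", "alice:is-participant"),
    KnowledgeSource("alice:is-participant", "workgroup:has-member"),
]
Controller(bb, sources).run()
print(sorted(bb.facts))
```

    The second source fires only after the first has run, showing the incremental, cooperative solution-building that blackboard systems are designed for.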

  5. European Interoperability Assets Register and Quality Framework Implementation.

    Moreno-Conde, Alberto; Thienpont, Geert; Lamote, Inge; Coorevits, Pascal; Parra, Carlos; Kalra, Dipak

    2016-01-01

    Interoperability assets is the term applied to refer to any resource that can support the design, implementation and successful adoption of eHealth services that can exchange data meaningfully. Some examples may include functional requirements, specifications, standards, clinical models and term lists, guidance on how standards may be used concurrently, implementation guides, educational resources, and other resources. Unfortunately, these are largely accessible in ad hoc ways and result in scattered fragments of a solution space that urgently need to be brought together. At present, it is well known that new initiatives and projects will reinvent assets of which they were unaware, while those assets which were potentially of great value are forgotten, not maintained and eventually fall into disuse. This research has defined a quality in use model and assessed the suitability of this quality framework based on the feedback and opinion of a representative sample of potential end users. This quality framework covers the following domains of asset development and adoption: (i) Development process, (ii) Maturity level, (iii) Trustworthiness, (iv) Support & skills, (v) Sustainability, (vi) Semantic interoperability, (vii) Cost & effort of adoption, (viii) Maintenance. When participants were asked to evaluate the overall quality in use framework, 70% said they would recommend using the register to their colleagues, 70% felt that it could provide relevant benefits for discovering new assets, and 50% responded that it would support their decision making about the recommended asset to adopt or implement in their organisation. Several European projects have expressed interest in using the register, which will now be sustained and promoted by the European Institute for Innovation through Health Data. PMID:27577473

  6. Semantic Context Detection Using Audio Event Fusion

    Cheng Wen-Huang

    2006-01-01

    Semantic-level content analysis is a crucial issue in achieving efficient content retrieval and management. We propose a hierarchical approach that models audio events over a time series in order to accomplish semantic context detection. Two levels of modeling, audio event and semantic context modeling, are devised to bridge the gap between physical audio features and semantic concepts. In this work, hidden Markov models (HMMs) are used to model four representative audio events, that is, gunshot, explosion, engine, and car braking, in action movies. At the semantic context level, generative (ergodic hidden Markov model) and discriminative (support vector machine, SVM) approaches are investigated to fuse the characteristics and correlations among audio events, which provide cues for detecting gunplay and car-chasing scenes. The experimental results demonstrate the effectiveness of the proposed approaches and provide a preliminary framework for information mining by using audio characteristics.
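    The fusion idea can be illustrated with a toy late-fusion rule: per-event detector scores (which in the paper come from HMMs) are combined into a semantic-context decision. The event names follow the abstract; the additive rule and the scores are simplifying assumptions, not the paper's actual generative or SVM fusion models.

```python
# Toy late fusion over audio-event scores: sum the evidence for each
# semantic context and pick the stronger one. Scores are assumed to be
# detector confidences in [0, 1].

def detect_context(event_scores):
    """Fuse per-event scores into a 'gunplay' vs 'car-chase' decision."""
    gunplay = event_scores.get("gunshot", 0.0) + event_scores.get("explosion", 0.0)
    car_chase = event_scores.get("engine", 0.0) + event_scores.get("car-braking", 0.0)
    return "gunplay" if gunplay >= car_chase else "car-chase"

print(detect_context({"gunshot": 0.9, "explosion": 0.7, "engine": 0.2}))  # gunplay
print(detect_context({"engine": 0.8, "car-braking": 0.6}))                # car-chase
```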

  7. Event-Driven Interoperability Framework For Interoperation In E-Learning Information Systems - Monitored Repository

    Petrov, Milen

    2006-01-01

    M.Petrov "Event-Driven Interoperability Framework For Interoperation In E-Learning Information Systems - Monitored Repository", IADAT-e2006, 3rd International Conference on Education, Barcelona (Spain), July 12-14, 2006, ISBN: 84-933971-9-9, pp.198 - pp.202

  8. Code lists for interoperability - Principles and best practices in INSPIRE

    Lutz, M.; Portele, C.; Cox, S.; Murray, K.

    2012-04-01

    external vocabulary. In the former case, for each value, an external identifier, one or more labels (possibly in different languages), a definition and other metadata should be specified. In the latter case, the external vocabulary should be characterised, e.g. by specifying the version to be used, the format(s) in which the vocabulary is available, possible constraints (e.g. if only a specific part of the external vocabulary is to be used), rules for using values in the encoding of instance data, and the maintenance rules applied to the external vocabulary. This information is crucial for enabling implementation and interoperability in distributed systems (such as SDIs) and should be made available through a code list registry. While the information on allowed code list values is thus usually managed outside the UML application schema, we recommend inclusion of «codeList»-stereotyped classes in the model for semantic clarity. Information on the obligation, extensibility and a reference to the specified values should be provided through tagged values. Acknowledgements: The authors would like to thank the INSPIRE Thematic Working Groups, the Data Specifications Drafting Team and the JRC Contact Points for their contributions to the discussions on code lists in INSPIRE and to this abstract.
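    The per-value metadata the abstract recommends (identifier, multilingual labels, definition) can be sketched as a small registry entry. The field names and the example URI are assumptions for illustration; they do not reproduce the INSPIRE registry schema.

```python
from dataclasses import dataclass, field

@dataclass
class CodeListValue:
    """One allowed value: external identifier, labels per language, definition."""
    identifier: str
    labels: dict       # language code -> label
    definition: str

@dataclass
class CodeList:
    """A code list as it might appear in a registry, with its
    extensibility recorded alongside its values."""
    uri: str
    extensible: bool
    values: list = field(default_factory=list)

    def label(self, identifier, lang="en"):
        """Look up the label of a value in the requested language."""
        for value in self.values:
            if value.identifier == identifier:
                return value.labels.get(lang)
        return None

cl = CodeList(uri="http://example.org/codelist/SurfaceType", extensible=False)
cl.values.append(
    CodeListValue("grass", {"en": "Grass", "de": "Gras"}, "A grass-covered surface.")
)
print(cl.label("grass", "de"))  # Gras
```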

  9. Enterprise interoperability VI : Interoperability for Agility, Resilience and Plasticity of Collaboration

    Bénaben, Frédérick; Poler, Raúl; Bourrières, Jean-Paul

    2014-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VI will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Over 40 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability, show how, in a scenario of globalised markets, where the capacity to cooperate with other firms efficiently starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected...

  10. A web services choreography scenario for interoperating bioinformatics applications

    Cheung David W

    2004-03-01

    Abstract Background Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: (1) the platforms on which the applications run are heterogeneous, (2) their web interface is not machine-friendly, (3) they use non-standard formats for data input and output, (4) they do not exploit standards to define application interfaces and message exchange, and (5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. Results To demonstrate the benefit of using web services over traditional web interfaces, we compare two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH keywords that correlates to the input and is grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard-coded Java application, Collaxa BPEL Server and Taverna Workbench.
The Java program functions as a web services engine and interoperates
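    A document-style message of the kind the abstract describes can be sketched with the standard library: the microarray spot IDs travel as structured XML rather than an HTML page, so a downstream tool can parse them reliably. The element names (`hapiRequest`, `spotId`) and the sample IDs are illustrative assumptions, not the project's actual schema.

```python
import xml.etree.ElementTree as ET

def build_request(spot_ids):
    """Encode a list of spot IDs as a document-style XML message."""
    root = ET.Element("hapiRequest")
    for sid in spot_ids:
        ET.SubElement(root, "spotId").text = sid
    return ET.tostring(root, encoding="unicode")

def parse_request(xml_text):
    """Recover the spot IDs from a received XML message."""
    return [elem.text for elem in ET.fromstring(xml_text).findall("spotId")]

msg = build_request(["SPOT_001", "SPOT_002"])
print(msg)
print(parse_request(msg))
```

    The round trip (build, then parse) is exactly what machine-unfriendly HTML output prevents, which is the motivation the abstract gives for moving to web services.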