WorldWideScience

Sample records for achieving semantic interoperability

  1. Semantic Interoperability in Electronic Business

    Directory of Open Access Journals (Sweden)

    Juha Puustjarvi

    2010-09-01

    Full Text Available E-business refers to the use of information and communication technologies (ICT) in support of all business activities. The standards developed for e-business help facilitate its deployment. In particular, several organizations in the e-business sector have produced standards and representation forms based on XML, which serves as an interchange format for exchanging data between communicating applications. However, XML says nothing about the semantics of the tags used: XML is merely a standard notation for markup languages, which provides a means for structuring documents. Therefore, XML-based e-business software is developed by hard-coding. Hard-coding has proven to be a valuable and powerful technique for exchanging structured and persistent business documents. However, if we use hard-coding with non-persistent documents and non-static environments, we encounter problems in deploying new document types, as each requires a long-lasting standardization process. Replacing existing hard-coded e-business systems by open systems that support semantic interoperability, and which are easily extensible, is the topic of this article. We first consider XML-based technologies and standards developed for B2B interoperation. Then we consider electronic auctions, which represent a form of e-business. In particular, we show how semantic interoperability can be achieved in electronic auctions.
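
    As a minimal illustration of the problem the abstract raises, the following sketch (with invented tag names and a hand-written mapping) shows two XML documents that carry the same business content under different vocabularies, and a tag-to-concept table supplying the semantics that XML itself leaves unspecified:

```python
import xml.etree.ElementTree as ET

# Two trading partners encode the same purchase order with different tags.
DOC_A = "<order><buyer>ACME</buyer><total>100</total></order>"
DOC_B = "<purchase><customer>ACME</customer><amount>100</amount></purchase>"

# Hand-maintained semantic mapping: local tag -> shared concept.
TAG_TO_CONCEPT = {
    "buyer": "party:buyer", "customer": "party:buyer",
    "total": "price:total", "amount": "price:total",
}

def to_concepts(xml_text):
    """Read a document into concept -> value, ignoring local tag names."""
    root = ET.fromstring(xml_text)
    return {TAG_TO_CONCEPT[c.tag]: c.text for c in root if c.tag in TAG_TO_CONCEPT}

# Both documents yield identical semantic content.
assert to_concepts(DOC_A) == to_concepts(DOC_B) == {
    "party:buyer": "ACME", "price:total": "100"}
```

    Without such an explicit mapping, each new document type forces the hard-coding the abstract warns about.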

  2. Semantic and Process Interoperability

    Directory of Open Access Journals (Sweden)

    Félix Oscar Fernández Peña

    2010-05-01

    Full Text Available Knowledge management systems support education at its different levels. This is very important for the process in which Cuban higher education is involved: structural transformations of teaching are focused on supporting the foundation of the information society in the country. This paper describes technical aspects of the design of a model for integrating multiple knowledge management tools that support teaching. The proposal is based on the definition of an ontology for the explicit formal description of the semantics of the motivations of students and teachers in the learning process. Its aim is to facilitate the spreading of knowledge.

  3. MIDST: Interoperability for Semantic Annotations

    Science.gov (United States)

    Atzeni, Paolo; Del Nostro, Pierluigi; Paolozzi, Stefano

    In recent years, interoperability of ontologies and databases has received a lot of attention. However, most of the work has concentrated on specific problems (such as storing an ontology in a database or making database data available to ontologies) and referred to specific models for each of the two. Here, we propose an approach that aims to be more general and model-independent. In fact, it works for different dialects of ontologies and for various data models of databases. It also supports translations in both directions (ontologies to databases and vice versa) and allows for flexibility in the translations, so that customization is possible. The proposal extends recent work on schema and data translation (the MIDST project, which implements the ModelGen operator proposed in model management), which relies on a metamodel approach where data models and variations thereof are described in a common framework and translations are built as compositions of elementary ones.

  4. Review of Semantically Interoperable Electronic Health Records for Ubiquitous Healthcare

    OpenAIRE

    Hwang, Kyung Hoon; Chung, Kyo-IL; Chung, Myung-Ae; Choi, Duckjoo

    2010-01-01

    In order to provide more effective and personalized healthcare services to patients and healthcare professionals, intelligent active knowledge management and reasoning systems with semantic interoperability are needed. Technological developments have made ubiquitous healthcare more semantically interoperable and individual-patient-based; however, these methodologies also have limitations. Based upon an extensive review of the international literature, this paper describes two...

  5. Providing semantic interoperability between clinical care and clinical research domains.

    Science.gov (United States)

    Laleci, Gokce Banu; Yuksel, Mustafa; Dogac, Asuman

    2013-03-01

    Improving the efficiency with which clinical research studies are conducted can lead to faster medication innovation and decreased time to market for new drugs. To increase this efficiency, the parties involved in a regulated clinical research study, namely, the sponsor, the clinical investigator and the regulatory body, each with their own software applications, need to exchange data seamlessly. However, currently, the clinical research and the clinical care domains are quite disconnected because each uses different standards and terminology systems. In this article, we describe an initial implementation of the Semantic Framework developed within the scope of the SALUS project to achieve interoperability between the clinical research and the clinical care domains. In our Semantic Framework, the core ontology developed for semantic mediation is based on the shared conceptual model of both of these domains provided by the BRIDG initiative. The core ontology is then aligned with the extracted semantic models of the existing clinical care and research standards as well as with the ontological representations of the terminology systems to create a model of meaning for enabling semantic mediation. Although SALUS is a research and development effort rather than a product, the current SALUS knowledge base contains around 4.7 million triples representing the BRIDG DAM, the HL7 CDA model, CDISC standards and several terminology ontologies. In order to keep the reasoning process within acceptable limits without sacrificing the quality of mediation, we took an engineering approach by developing a number of heuristic mechanisms. The results indicate that it is possible to build a robust and scalable semantic framework with a solid theoretical foundation for achieving interoperability between the clinical research and clinical care domains. PMID:23008263
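
    The mediation idea can be caricatured in a few lines. In this sketch the terminology codes and the core concept are invented; the point is only that each domain is aligned to the core ontology, so translation becomes mapping composition rather than a direct pairwise mapping:

```python
# Hypothetical alignments of a care-domain term and a research-domain term
# to one shared core-ontology concept.
CARE_TO_CORE = {"care:BP_sys": "core:SystolicBloodPressure"}
RESEARCH_TO_CORE = {"crf:SYSBP": "core:SystolicBloodPressure"}

# Invert the research alignment so we can go core -> research.
CORE_TO_RESEARCH = {core: term for term, core in RESEARCH_TO_CORE.items()}

def mediate(care_term):
    """Translate a care-domain term to the research domain via the core ontology."""
    core = CARE_TO_CORE[care_term]
    return CORE_TO_RESEARCH[core]

assert mediate("care:BP_sys") == "crf:SYSBP"
```

    Adding a third standard then only requires one new alignment to the core, not new pairwise converters to every existing standard.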

  6. Local ontologies for semantic interoperability in supply chain networks

    OpenAIRE

    Zdravković, Milan; Trajanović, Miroslav; Panetto, Hervé

    2011-01-01

    ISBN: 978-989-8425-53-9. Most of the issues in current supply chain management practice are related to the challenges of interoperability between relevant enterprise information systems (EIS). In this paper, we present an ontological framework for the semantic interoperability of EISs in supply chain networks, based on the Supply Chain Operations Reference (SCOR) model, its semantic enrichment, and mappings to relevant enterprise conceptualizations. In order to introduce the...

  7. State of the Art on Semantic IS Standardization, Interoperability & Quality

    NARCIS (Netherlands)

    Folmer, E.J.A.; Verhoosel, J.P.C.

    2011-01-01

    This book contains a broad overview of relevant studies in the area of semantic IS standards. It includes an introduction to the general topic of standardization and introduces the concept of interoperability. The primary focus, however, is on semantic IS standards, their characteristics, and their quality.

  9. Semantics-Based Interoperability Framework for the Geosciences

    Science.gov (United States)

    Sinha, A.; Malik, Z.; Raskin, R.; Barnes, C.; Fox, P.; McGuinness, D.; Lin, K.

    2008-12-01

    Interoperability between heterogeneous data, tools and services is required to transform data into knowledge. To meet geoscience-oriented societal challenges such as the forcing of climate change induced by volcanic eruptions, we suggest the need to develop semantic interoperability for data, services, and processes. Because such scientific endeavors require the integration of multiple databases associated with global enterprises, implicit semantics-based integration is impossible; instead, explicit semantics are needed to facilitate interoperability and integration. Although different types of integration models are available (syntactic or semantic), we suggest that semantic interoperability is likely to be the most successful pathway. Clearly, the geoscience community would benefit from utilizing existing XML-based data models, such as GeoSciML and WaterML, to rapidly advance semantic interoperability and integration. We recognize that such integration will require a "meanings-based search, reasoning and information brokering", which will be facilitated through inter-ontology relationships (ontologies defined for each discipline). We suggest that markup languages (MLs) and ontologies can be seen as "data integration facilitators" working at different abstraction levels. Therefore, we propose an ontology-based data registration and discovery approach to complement markup languages through semantic data enrichment. Ontologies allow the use of formal and descriptive logic statements, which permits expressive query capabilities for data integration through reasoning. We have developed domain ontologies (EPONT) to capture the concepts behind data. The EPONT ontologies are associated with existing ontologies such as SUMO, DOLCE and SWEET. Although significant effort has gone into developing data (object) ontologies, we advance the idea of developing semantic frameworks for additional ontologies that deal with processes and services. This evolutionary step will…

  10. An Ontology Based Methodology for Satellite Data Semantic Interoperability

    Directory of Open Access Journals (Sweden)

    ABBURU, S.

    2015-08-01

    Full Text Available Satellite- and ocean-based observing systems consist of various sensors and configurations. These observing systems transmit data in heterogeneous file formats and with heterogeneous vocabularies from various data centers, which maintain centralized data management systems that disseminate the observations to various research communities. Currently, different data naming conventions are used by existing observing systems, leading to semantic heterogeneity. In this work, sensor data interoperability and the semantics of the data are addressed through ontologies. The present work provides an effective technical solution to semantic heterogeneity through semantic technologies, which provide interoperability, the capability to build a knowledge base, and a framework for semantic information retrieval by developing an effective concept vocabulary through domain ontologies. The paper proposes a new methodology to interlink multidisciplinary and heterogeneous sensor data products. A four-phase methodology has been implemented to address satellite data semantic interoperability. The paper concludes with an evaluation of the methodology, linking and interfacing multiple ontologies to arrive at an ontology vocabulary for sensor observations. Data from the Indian meteorological satellite INSAT-3D are used as a typical example to illustrate the concepts. This work can be extended along similar lines to other sensor observations.
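
    The naming-convention problem is easy to picture. In the sketch below (variable names and the concept identifiers are invented), two data centres publish the same observation under different local names, and a small concept vocabulary harmonizes them:

```python
# Concept vocabulary: local variable name -> shared domain concept.
VOCAB = {
    "sst": "concept:SeaSurfaceTemperature",            # centre A's convention
    "sea_surf_temp": "concept:SeaSurfaceTemperature",  # centre B's convention
    "wspd": "concept:WindSpeed",
}

def harmonize(record):
    """Rename a record's variables to shared concepts, dropping unknown ones."""
    return {VOCAB[k]: v for k, v in record.items() if k in VOCAB}

a = harmonize({"sst": 28.4})            # record from centre A
b = harmonize({"sea_surf_temp": 28.4})  # same observation from centre B
assert a == b == {"concept:SeaSurfaceTemperature": 28.4}
```

    A query phrased against the concept vocabulary then retrieves matching data from every centre, whatever its local naming convention.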

  11. Conceptual Model Formalization in a Semantic Interoperability Service Framework: Transforming Relational Database Schemas to OWL.

    Science.gov (United States)

    Bravo, Carlos; Suarez, Carlos; González, Carolina; López, Diego; Blobel, Bernd

    2014-01-01

    Healthcare information is distributed across multiple heterogeneous and autonomous systems, and accessing and sharing distributed information sources is a challenging task. To contribute to meeting this challenge, this paper presents a formal, complete and semi-automatic transformation service from relational databases to the Web Ontology Language (OWL). The proposed service makes use of an algorithm that can transform several data models of different domains by deploying mainly inheritance rules. The paper emphasizes the relevance of integrating the proposed approach into an ontology-based interoperability service to achieve semantic interoperability.
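
    A toy version of such a transformation, under the commonly used (and here simplified) conventions that tables map to classes, columns to datatype properties, and a primary key that doubles as a foreign key signals inheritance. The schema is invented and the output is a plain triple list rather than a real OWL serialization:

```python
def schema_to_owl(tables):
    """Translate a relational schema description into OWL-style triples."""
    triples = []
    for name, info in tables.items():
        triples.append((name, "rdf:type", "owl:Class"))          # table -> class
        for col in info["columns"]:                              # column -> property
            triples.append((f"{name}.{col}", "rdf:type", "owl:DatatypeProperty"))
        parent = info.get("pk_is_fk_to")  # PK that is also an FK: inheritance rule
        if parent:
            triples.append((name, "rdfs:subClassOf", parent))
    return triples

tables = {
    "Patient": {"columns": ["id", "name"]},
    "Inpatient": {"columns": ["id", "ward"], "pk_is_fk_to": "Patient"},
}
t = schema_to_owl(tables)
assert ("Inpatient", "rdfs:subClassOf", "Patient") in t
assert ("Patient", "rdf:type", "owl:Class") in t
```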

  12. Open PHACTS: semantic interoperability for drug discovery.

    Science.gov (United States)

    Williams, Antony J; Harland, Lee; Groth, Paul; Pettifer, Stephen; Chichester, Christine; Willighagen, Egon L; Evelo, Chris T; Blomberg, Niklas; Ecker, Gerhard; Goble, Carole; Mons, Barend

    2012-11-01

    Open PHACTS is a public-private partnership between academia, publishers, small and medium-sized enterprises and pharmaceutical companies. The goal of the project is to deliver and sustain an 'open pharmacological space' (OPS) by using and enhancing state-of-the-art semantic web standards and technologies. It is focused on practical and robust applications that solve specific questions in drug discovery research. OPS is intended to facilitate improvements in drug discovery in academia and industry and to support open innovation as well as in-house, non-public drug discovery research. This paper lays out the challenges and describes how the Open PHACTS project hopes to address them, both technically and socially. PMID:22683805

  13. An adaptive semantic based mediation system for data interoperability among Health Information Systems.

    Science.gov (United States)

    Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung

    2014-08-01

    Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity stems from the compliance of HISs with different healthcare standards, and resolving it demands a mediation system that accurately interprets data in different heterogeneous formats. We propose an adaptive mediation system, the AdapteR Interoperability ENgine (ARIEN), that arbitrates between HISs compliant with different healthcare standards to achieve accurate and seamless information exchange and thus data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and the Personalized-Detailed Clinical Model (P-DCM) approach to guarantee the accuracy of the mappings. The effectiveness of the mappings stored in the MBO is assessed by evaluating the accuracy of the transformation process between different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved over 90% accuracy in the conversion between the CDA and vMR standards using a pattern-oriented approach based on the MBO. The proposed mediation system improves the overall communication process between HISs, providing accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients. PMID:24964780
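
    The bridge-ontology idea can be sketched with a flat stand-in for the MBO (the field paths and mappings below are invented, not actual CDA or vMR paths): mappings are stored once and applied in either direction, instead of hard-coding one pairwise converter:

```python
MBO = [  # (format A path, format B path) bridge mapping entries
    ("cda:patient/name", "vmr:subject/fullName"),
    ("cda:observation/code", "vmr:finding/key"),
]

def transform(record, direction="a2b"):
    """Rewrite a flat record's field paths using the bridge mappings."""
    i, j = (0, 1) if direction == "a2b" else (1, 0)
    mapping = {m[i]: m[j] for m in MBO}
    return {mapping.get(k, k): v for k, v in record.items()}

cda = {"cda:patient/name": "J. Doe"}
vmr = transform(cda)                 # CDA -> vMR direction
assert vmr == {"vmr:subject/fullName": "J. Doe"}
assert transform(vmr, "b2a") == cda  # and it round-trips back
```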

  15. Database Semantic Interoperability based on Information Flow Theory and Formal Concept Analysis

    Directory of Open Access Journals (Sweden)

    Guanghui Yang

    2012-07-01

    Full Text Available As databases become widely used, there is a growing need to translate information between multiple databases. Semantic interoperability and integration has been a long-standing challenge for the database community and has now become a prominent area of database research. In this paper, we aim to answer the question of how semantic interoperability between two databases can be achieved using the theories of Formal Concept Analysis (FCA) and Information Flow (IF). For our purposes, we first discover knowledge from different databases using FCA, and then align what is discovered using IF and FCA. The development of FCA has led to software systems such as TOSCANA and TUPLEWARE, which can be used as tools for discovering knowledge in databases. A prototype based on IF and FCA has been developed, and our method is tested and verified using this prototype and TUPLEWARE.
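
    The FCA step can be illustrated concretely. The sketch below computes the formal concepts of a tiny invented object/attribute context; two fields from different databases landing in the same formal concept is the kind of evidence the alignment step builds on:

```python
from itertools import chain, combinations

CONTEXT = {  # object -> attributes it has
    "db1.customer": {"has_name", "has_address"},
    "db2.client":   {"has_name", "has_address"},
    "db2.product":  {"has_name"},
}

def intent(objs):
    """Attributes common to all given objects (all attributes for the empty set)."""
    sets = [CONTEXT[o] for o in objs]
    return set.intersection(*sets) if sets else set(chain(*CONTEXT.values()))

def extent(attrs):
    """Objects having every one of the given attributes."""
    return {o for o, a in CONTEXT.items() if attrs <= a}

def concepts():
    """All (extent, intent) pairs closed under the two derivation operators."""
    found = set()
    objs = list(CONTEXT)
    for r in range(len(objs) + 1):
        for subset in combinations(objs, r):
            b = intent(set(subset))  # derive shared attributes
            a = extent(b)            # close back to the full extent
            found.add((frozenset(a), frozenset(b)))
    return found

# "customer" and "client" share a concept: evidence that they align.
assert (frozenset({"db1.customer", "db2.client"}),
        frozenset({"has_name", "has_address"})) in concepts()
```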

  16. An approach to define semantics for BPM systems interoperability

    Science.gov (United States)

    Rico, Mariela; Caliusco, María Laura; Chiotti, Omar; Rosa Galli, María

    2015-04-01

    This article proposes defining semantics for Business Process Management (BPM) systems interoperability through an ontology of the Electronic Business Documents (EBD) used to interchange the information required to perform cross-organizational processes. The generated semantic model allows aligning enterprises' business processes to support cross-organizational processes by matching the business ontology of each business partner with the EBD ontology. The result is a flexible software architecture that allows dynamically defining cross-organizational business processes by reusing the EBD ontology. For developing the semantic model, a method is presented that is based on a strategy for discovering entity features whose interpretation depends on the context, and for representing them to enrich the ontology. The proposed method complements ontology learning techniques, which cannot infer semantic features that are not represented in the data sources. In order to improve the representation of these entity features, the method proposes using widely accepted ontologies for representing time entities and relations, physical quantities, measurement units, official country names, and currencies and funds, among others. When ontology reuse is not possible, the method identifies whether the feature in question is simple or complex, and defines the strategy to be followed. An empirical validation of the approach has been performed through a case study.

  17. A federated semantic metadata registry framework for enabling interoperability across clinical research and care domains.

    Science.gov (United States)

    Sinaci, A Anil; Laleci Erturkmen, Gokce B

    2013-10-01

    In order to enable secondary use of Electronic Health Records (EHRs) by bridging the interoperability gap between the clinical care and research domains, this paper introduces a unified methodology and supporting framework that bring together the power of metadata registries (MDR) and semantic web technologies. We introduce a federated semantic metadata registry framework that extends the ISO/IEC 11179 standard and enables the integration of data element registries through Linked Open Data (LOD) principles, whereby each Common Data Element (CDE) can be uniquely referenced, queried and processed to enable syntactic and semantic interoperability. Each CDE and its components are maintained as LOD resources enabling semantic links with other CDEs, terminology systems and implementation-dependent content models, hence facilitating semantic search, more effective reuse and semantic interoperability across different application domains. There are several important efforts addressing semantic interoperability in the healthcare domain, such as the IHE DEX profile proposal, CDISC SHARE and CDISC2RDF. Our architecture complements these by providing a framework to interlink existing data element registries and repositories, multiplying their potential for semantic interoperability to a greater extent. The open-source implementation of the federated semantic MDR framework presented in this paper is the core of the semantic interoperability layer of the SALUS project, which enables the execution of post-marketing safety analysis studies on top of existing EHR systems. PMID:23751263
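
    The LOD principle behind the registry can be sketched as follows (URIs and the terminology code are invented): each CDE is a uniquely referenced resource, and equivalences across federated registries are followed like links:

```python
REGISTRY = {  # CDE URI -> its linked-data description
    "http://mdr-a.example/cde/birth-date": {
        "label": "Date of Birth",
        "sameAs": ["http://mdr-b.example/cde/dob"],
        "terminology": "code:BirthDate",
    },
    "http://mdr-b.example/cde/dob": {
        "label": "DOB",
        "sameAs": ["http://mdr-a.example/cde/birth-date"],
        "terminology": "code:BirthDate",
    },
}

def equivalents(uri, seen=None):
    """Follow sameAs links across registries, collecting the equivalence set."""
    seen = seen or {uri}
    for other in REGISTRY.get(uri, {}).get("sameAs", []):
        if other not in seen:
            seen.add(other)
            equivalents(other, seen)
    return seen

assert equivalents("http://mdr-a.example/cde/birth-date") == {
    "http://mdr-a.example/cde/birth-date", "http://mdr-b.example/cde/dob"}
```

    A query against either registry can then be widened to all equivalent CDEs, which is what makes federated secondary use of the data elements possible.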

  18. CityGML - Interoperable semantic 3D city models

    Science.gov (United States)

    Gröger, Gerhard; Plümer, Lutz

    2012-07-01

    CityGML is the international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. It defines the three-dimensional geometry, topology, semantics and appearance of the most relevant topographic objects in urban or regional contexts. These definitions are provided in different, well-defined Levels-of-Detail (multiresolution model). The focus of CityGML is on the semantic aspects of 3D city models, their structures, taxonomies and aggregations, allowing users to employ virtual 3D city models for advanced analysis and visualization tasks in a variety of application domains such as urban planning, indoor/outdoor pedestrian navigation, environmental simulation, cultural heritage, or facility management. This is in contrast to purely geometrical/graphical models such as KML, VRML, or X3D, which do not provide sufficient semantics. CityGML is based on the Geography Markup Language (GML), which provides a standardized geometry model. Due to this model and its well-defined semantics and structures, CityGML facilitates interoperable data exchange in the context of geo web services and spatial data infrastructures. Since its standardization in 2008, CityGML has come into use on a worldwide scale: tools from notable companies in the geospatial field provide CityGML interfaces, and many applications and projects use the standard. CityGML also has a strong impact on science: numerous approaches use CityGML, particularly its semantics, for disaster management, emergency response, or energy-related applications as well as for visualization; contribute to CityGML by improving its consistency and validity; or use CityGML, particularly its different Levels-of-Detail, as a source or target for generalization. This paper gives an overview of CityGML, its underlying concepts, its Levels-of-Detail, how to extend it, its applications, its likely future development, and the role it plays in scientific research.
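
    To make the contrast with purely graphical formats concrete, the following sketch parses a trimmed, CityGML-flavoured fragment (illustrative only, not a conformant document) and recovers typed thematic information, a Building with a function and a measured height, rather than bare geometry:

```python
import xml.etree.ElementTree as ET

FRAGMENT = """
<CityModel xmlns:bldg="http://www.opengis.net/citygml/building/2.0">
  <bldg:Building>
    <bldg:function>residential</bldg:function>
    <bldg:measuredHeight>9.5</bldg:measuredHeight>
  </bldg:Building>
</CityModel>
"""

NS = {"bldg": "http://www.opengis.net/citygml/building/2.0"}
root = ET.fromstring(FRAGMENT)

# Each geometry-bearing object is a typed thematic feature, so an application
# can filter and analyse by semantics, not just render shapes.
buildings = [
    {"function": b.findtext("bldg:function", namespaces=NS),
     "height_m": float(b.findtext("bldg:measuredHeight", namespaces=NS))}
    for b in root.findall("bldg:Building", NS)
]
assert buildings == [{"function": "residential", "height_m": 9.5}]
```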

  19. Interoperability of learning objects copyright in the LUISA semantic learning management system

    OpenAIRE

    García González, Roberto; Pariente, Tomas

    2009-01-01

    Semantic Web technology is able to provide the required computational semantics for interoperability of learning resources across different Learning Management Systems (LMS) and Learning Object Repositories (LOR). The EU research project LUISA (Learning Content Management System Using Innovative Semantic Web Services Architecture) addresses the development of a reference semantic architecture for the major challenges in the search, interchange and delivery of learning objects in a service-...

  20. Shape-function-relationship (SFR) framework for semantic interoperability of product model

    OpenAIRE

    Gupta, Ravi Kumar; Gurumoorthy, B

    2009-01-01

    The problem of semantic interoperability arises while integrating applications in different task domains across the product life cycle. A new shape-function-relationship (SFR) framework is proposed as a taxonomy on which an ontology is developed. An ontology based on the SFR framework, capturing explicit definitions of terminology and knowledge relationships in terms of shape, function and relationship descriptors, offers an attractive approach to solving semantic interoperability issues…

  1. Achieving interoperability in critical IT and communication systems

    CERN Document Server

    Desourdis, Robert I

    2009-01-01

    Supported by over 90 illustrations, this unique book provides a detailed examination of the subject, focusing on the use of voice, data, and video systems for public safety and emergency response. This practical resource makes in-depth recommendations spanning technical, planning, and procedural approaches for efficient public safety response performance. It covers the many approaches used to achieve interoperability, including a synopsis of the enabling technologies and systems intended to provide radio interoperability. Featuring specific examples nationwide, the book takes you…

  2. RuleML-Based Learning Object Interoperability on the Semantic Web

    Science.gov (United States)

    Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.

    2008-01-01

    Purpose: The present paper aims to describe an approach for building the Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one of the rule uses: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…

  3. An Approach to Semantic Interoperability for Improved Capability Exchanges in Federations of Systems

    Science.gov (United States)

    Moschoglou, Georgios

    2013-01-01

    This study seeks an affirmative answer to the question whether a knowledge-based approach to system of systems interoperation using semantic web standards and technologies can provide the centralized control of the capability for exchanging data and services lacking in a federation of systems. Given the need to collect and share real-time…

  4. Semantic Interoperability in Body Area Sensor Networks and Applications

    NARCIS (Netherlands)

    Bui, V.T.; Brandt, P.; Liu, H.; Basten, T.; Lukkien, J.

    2014-01-01

    Crucial to the success of Body Area Sensor Networks is the flexibility with which stakeholders can share, extend and adapt the system with respect to sensors, data and functionality. The first step is to develop an interoperable platform with explicit interfaces, which takes care of common management…

  5. Sharing meanings: developing interoperable semantic technologies to enhance reproducibility in earth and environmental science research

    Science.gov (United States)

    Schildhauer, M.

    2015-12-01

    Earth and environmental scientists are familiar with the entities, processes, and theories germane to their field of study, and comfortable collecting and analyzing data in their area of interest. Yet, while there appears to be consistency and agreement as to the scientific "terms" used to describe features in their data and analyses, aside from a few fundamental physical characteristics (such as mass or velocity) there can be broad tolerances, if not considerable ambiguity, in how many earth science "terms" map to the underlying "concepts" that they actually represent. This ambiguity in meanings, or "semantics", creates major problems for scientific reproducibility. It greatly impedes the ability to replicate results, by making it difficult to determine the specifics of the intended meanings of terms such as "deforestation" or "carbon flux" as to scope, composition, magnitude, etc. In addition, semantic ambiguity complicates the assemblage of comparable data for reproducing results, due to ambiguous or idiosyncratic labels for measurements, such as "percent cover of forest", where the term "forest" is undefined, or where a reported output of "total carbon emissions" might include CO2 emissions but not methane emissions. In this talk, we describe how the NSF-funded DataONE repository for earth and environmental science data (http://dataone.org) is using W3C-standard languages (RDF/OWL) to build an ontology for clarifying the concepts embodied in heterogeneous data and model outputs. With an initial focus on carbon-cycling concepts using terrestrial biospheric model outputs and LTER productivity data, we describe how we are achieving interoperability with "semantic vocabularies" (ontologies) from aligned earth and life science domains, including OBO Foundry ontologies such as ENVO and BCO, the ISO/OGC O&M, and the NSF EarthCube GeoLink project. Our talk will also discuss best practices that may be helpful for other groups interested in constructing their own…
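
    The "total carbon emissions" ambiguity can be made concrete in a few lines (dataset names and concept sets below are invented): once each dataset's term is resolved to the concepts it actually covers, incomparable reports become detectable automatically:

```python
# Which gas fluxes each dataset's "total-carbon-emissions" label actually covers.
DATASET_CONCEPTS = {
    "model_X": {"CO2"},          # CO2 only
    "survey_Y": {"CO2", "CH4"},  # CO2 plus methane
}

def comparable(ds_a, ds_b):
    """Two datasets are comparable only if the label resolves to the same concepts."""
    return DATASET_CONCEPTS[ds_a] == DATASET_CONCEPTS[ds_b]

# Same label, different meaning: naively merging these values would be wrong.
assert not comparable("model_X", "survey_Y")
assert comparable("model_X", "model_X")
```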

  6. Interoperability and different ways of knowing: How semantics can aid in cross-cultural understanding

    Science.gov (United States)

    Pulsifer, P. L.; Parsons, M. A.; Duerr, R. E.; Fox, P. A.; Khalsa, S. S.; McCusker, J. P.; McGuinness, D. L.

    2012-12-01

    …differences in its application. Furthermore, it is an analog encoding scheme whose meaning has evolved over time. By semantically modeling the egg code, its subtle variations, and how it connects to other data, we illustrate a mechanism for translating across data formats and representations. But there are limits to what semantically modeling the egg code can achieve. The egg code and common operational sea ice formats do not address community needs, notably the timing and processes of sea ice freeze-up and break-up, which have a profound impact on local hunting, shipping, oil exploration, and safety. We work with local experts from four very different Indigenous communities and with the scientific creators of sea ice forecasts to establish an understanding of concepts and terminology related to fall freeze-up and spring break-up in the regions represented. This helps expand our conceptions of sea ice while also aiding understanding across cultures and communities, and the passing of knowledge to younger generations. This is an early step toward expanding concepts of interoperability to very different ways of knowing, to make data truly relevant and locally useful.

  7. Interoperable cross-domain semantic and geospatial framework for automatic change detection

    Science.gov (United States)

    Kuo, Chiao-Ling; Hong, Jung-Hong

    2016-01-01

    With the increasingly diverse types of geospatial data established over the last few decades, semantic interoperability in integrated applications has attracted much interest in the field of Geographic Information Systems (GIS). This paper proposes a new strategy and framework for processing cross-domain geodata at the semantic level. The framework leverages the semantic equivalence of concepts between domains through a bridge ontology and facilitates the integrated use of data from different domains, which has long been considered an essential strength of GIS but is impeded by the lack of understanding of the semantics implicitly hidden in the data. We choose the task of change detection to demonstrate how the introduction of ontology concepts can effectively make this integration possible. We analyze the common properties of geodata and change detection factors, then construct rules and summarize possible change scenarios for making final decisions. The use of topographic map data to detect changes in land use shows promising success, as far as the improvement of efficiency and level of automation is concerned. We believe the ontology-oriented approach will enable a new way of integrating data across different domains from the perspective of semantic interoperability, and may even open a new dimension for future GIS.
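
    At its simplest, a bridge ontology of this kind can be approximated as an equivalence mapping between domain vocabularies plus rules evaluated over it. The sketch below uses invented class names (they are not from the paper's actual ontology) to show how a change is flagged only when the bridged concepts differ.

```python
# Hypothetical bridge: topographic-map classes -> shared land-use concepts.
BRIDGE = {
    "topo:BroadleafWoodland": "landuse:Forest",
    "topo:ConiferPlantation": "landuse:Forest",
    "topo:ArableField":       "landuse:Agriculture",
    "topo:BuiltUpArea":       "landuse:Urban",
}

def detect_change(old_class: str, new_class: str) -> bool:
    """Flag a semantic change only when the bridged concepts differ."""
    return BRIDGE[old_class] != BRIDGE[new_class]

# Woodland -> plantation is no change at the land-use level;
# woodland -> built-up area is a genuine land-use change.
```

    The point of the bridge is exactly this: two syntactically different source classes can be recognized as the same concept, so spurious "changes" between them are suppressed.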

  8. Semantic modeling and interoperability in product and process engineering a technology for engineering informatics

    CERN Document Server

    2013-01-01

    In the past decade, feature-based design and manufacturing has gained some momentum in various engineering domains as a way to represent and reuse semantic patterns with effective applicability. However, the actual scope of feature application is still very limited. Semantic Modeling and Interoperability in Product and Process Engineering provides a systematic solution for the challenging engineering informatics field, aiming at the enhancement of sustainable knowledge representation, implementation and reuse at an open and yet practically manageable scale. This semantic modeling technology supports uniform, multi-facet and multi-level collaborative system engineering with heterogeneous computer-aided tools such as CAD/CAM, CAE, and ERP. The presented unified feature model can be applied to product and process representation, development, implementation and management. Practical case studies and test samples are provided to illustrate applications which can be implemented by the readers in real-world scenarios. ...

  9. Interoperability Between Coastal Web Atlases Using Semantic Mediation: A Case Study of the International Coastal Atlas Network (ICAN)

    Science.gov (United States)

    Wright, D. J.; Lassoued, Y.; Dwyer, N.; Haddad, T.; Bermudez, L. E.; Dunne, D.

    2009-12-01

    Coastal mapping plays an important role in informing marine spatial planning, resource management, maritime safety, hazard assessment and even national sovereignty. As such, there is now a plethora of data/metadata catalogs, pre-made maps, tabular and text information on resource availability and exploitation, and decision-making tools. A recent trend has been to encapsulate these in a special class of web-enabled geographic information systems called a coastal web atlas (CWA). While multiple benefits are derived from tailor-made atlases, there is great value added from the integration of disparate CWAs. CWAs linked to one another can be queried more successfully to optimize planning and decision-making. If a dataset is missing in one atlas, it may be immediately located in another. Similar datasets in two atlases may be combined to enhance study in either region. *But how best to achieve semantic interoperability to mitigate vague data queries, concepts or natural language semantics when retrieving and integrating data and information?* We report on the development of a new prototype seeking to interoperate between two initial CWAs: the Marine Irish Digital Atlas (MIDA) and the Oregon Coastal Atlas (OCA). These two mature atlases are used as a testbed for more regional connections, with the intent for the OCA to use lessons learned to develop a regional network of CWAs along the west coast, and for MIDA to do the same in building and strengthening atlas networks with the UK, Belgium, and other parts of Europe. Our prototype uses semantic interoperability via services harmonization and ontology mediation, allowing local atlases to use their own data structures and vocabularies (ontologies). We use standard technologies such as OGC Web Map Services (WMS) for delivering maps, and OGC Catalogue Service for the Web (CSW) for delivering and querying ISO-19139 metadata. The metadata records of a given CWA use a given ontology of terms called the local ontology.
Human or machine
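
    The catalogue side of such a prototype rests on standard request formats. A CSW GetRecords request body, for example, is plain XML and can be assembled as sketched below (the namespace follows the OGC CSW 2.0.2 schema; the choice of element set is illustrative).

```python
import xml.etree.ElementTree as ET

CSW = "http://www.opengis.net/cat/csw/2.0.2"

def get_records_request(type_names: str = "csw:Record") -> bytes:
    """Build a minimal OGC CSW 2.0.2 GetRecords request body."""
    root = ET.Element(
        f"{{{CSW}}}GetRecords",
        {"service": "CSW", "version": "2.0.2", "resultType": "results"},
    )
    query = ET.SubElement(root, f"{{{CSW}}}Query", {"typeNames": type_names})
    # Ask for summary-level metadata elements in the response.
    ET.SubElement(query, f"{{{CSW}}}ElementSetName").text = "summary"
    return ET.tostring(root)

body = get_records_request()
```

    A real deployment would POST this body to the atlas's CSW endpoint and add an OGC Filter expression; the mediation layer then maps terms in that filter from the local ontology to the shared one.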

  10. A Reusable and Interoperable Semantic Classification Tool which Integrates Owl Ontology

    Directory of Open Access Journals (Sweden)

    Saadia Lgarch

    2012-11-01

    Full Text Available In e-learning systems, the tutor plays a very important role in supporting learners and guaranteeing quality learning. A successful collaboration between learners and their tutor requires the use of communication tools. Thanks to their flexibility in terms of time, asynchronous tools such as discussion forums are the most used. However, this type of tool generates a great mass of messages, making tutoring a complex operation to manage; hence the need for a message classification tool. As a first step we proposed a semantic classification tool based on LSA and a thesaurus. The possibility that an ontology offers to overcome the limitations of the thesaurus encouraged us to use one to control our vocabulary. By way of our proposed selection algorithm, the OWL ontology is queried to generate new terms which are used to build the LSA matrix. The integration of a formal OWL ontology provides a highly relevant semantic classification of messages, and the reuse of the ontological knowledge base by other applications is also guaranteed. Interoperability and knowledge exchange between systems are likewise ensured by the integrated ontology. In order to ensure its reuse and interoperability with systems requesting its classification service, our semantic classifier tool is implemented on a service-oriented architecture (SOA), which is explained and tested in this work.
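
    Full LSA requires a singular value decomposition, but the role the ontology-derived terms play, restricting and enriching the term space in which messages are compared, can be shown with a much simpler bag-of-words cosine similarity. All terms and the sample category below are invented for illustration.

```python
import math

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def vectorize(text: str, vocabulary: set) -> dict:
    """Keep only controlled-vocabulary terms, as the ontology step would."""
    counts = {}
    for word in text.lower().split():
        if word in vocabulary:
            counts[word] = counts.get(word, 0) + 1
    return counts

VOCAB = {"deadline", "exercise", "grade", "forum"}   # ontology-derived terms
msg = vectorize("when is the exercise deadline", VOCAB)
category = vectorize("exercise deadline grade", VOCAB)
score = cosine(msg, category)
```

    In the actual tool the vectors would be projected into the LSA space before comparison; the controlled vocabulary step is what the OWL ontology query supplies.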

  11. Implementation of a metadata architecture and knowledge collection to support semantic interoperability in an enterprise data warehouse.

    Science.gov (United States)

    Dhaval, Rakesh; Borlawsky, Tara; Ostrander, Michael; Santangelo, Jennifer; Kamal, Jyoti; Payne, Philip R O

    2008-11-06

    In order to enhance interoperability between enterprise systems, and improve data validity and reliability throughout The Ohio State University Medical Center (OSUMC), we have initiated the development of an ontology-anchored metadata architecture and knowledge collection for our enterprise data warehouse. The metadata and corresponding semantic relationships stored in the OSUMC knowledge collection are intended to promote consistency and interoperability across the heterogeneous clinical, research, business and education information managed within the data warehouse.

  12. Cohort Selection and Management Application Leveraging Standards-based Semantic Interoperability and a Groovy DSL.

    Science.gov (United States)

    Bucur, Anca; van Leeuwen, Jasper; Chen, Njin-Zu; Claerhout, Brecht; de Schepper, Kristof; Perez-Rey, David; Paraiso-Medina, Sergio; Alonso-Calvo, Raul; Mehta, Keyur; Krykwinski, Cyril

    2016-01-01

    This paper describes a new Cohort Selection application implemented to support streamlining the definition phase of multi-centric clinical research in oncology. Our approach aims at both ease of use and precision in defining the selection filters expressing the characteristics of the desired population. The application leverages our standards-based Semantic Interoperability Solution and a Groovy DSL to provide high expressiveness in the definition of filters and flexibility in their composition into complex selection graphs including splits and merges. Widely-adopted ontologies such as SNOMED-CT are used to represent the semantics of the data and to express concepts in the application filters, facilitating data sharing and collaboration on joint research questions in large communities of clinical users. The application supports patient data exploration and efficient collaboration in multi-site, heterogeneous and distributed data environments. PMID:27570644
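
    The filter-composition idea, elementary predicates combined into a selection graph with splits and merges, can be sketched without the Groovy DSL itself. The patient fields and filter names below are invented for illustration, not the application's actual schema.

```python
# Elementary filters are predicates over a patient record; all_of/any_of
# compose them, mirroring the merge/split nodes of a selection graph.

def age_over(years):
    return lambda p: p["age"] > years

def diagnosis(code):
    return lambda p: code in p["diagnoses"]

def all_of(*filters):    # merge: the patient must pass every branch
    return lambda p: all(f(p) for f in filters)

def any_of(*filters):    # split/union: any branch suffices
    return lambda p: any(f(p) for f in filters)

# Over 50 AND (breast cancer OR prostate cancer), using illustrative codes.
cohort_filter = all_of(age_over(50),
                       any_of(diagnosis("C50"), diagnosis("C61")))

patients = [
    {"age": 63, "diagnoses": {"C50"}},
    {"age": 41, "diagnoses": {"C50"}},
    {"age": 70, "diagnoses": {"I10"}},
]
selected = [p for p in patients if cohort_filter(p)]
```

    In the real application the codes would be SNOMED-CT concepts resolved through the semantic interoperability layer, so the same filter runs unchanged across heterogeneous sites.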

  13. Case Study for Integration of an Oncology Clinical Site in a Semantic Interoperability Solution based on HL7 v3 and SNOMED-CT: Data Transformation Needs.

    Science.gov (United States)

    Ibrahim, Ahmed; Bucur, Anca; Perez-Rey, David; Alonso, Enrique; de Hoog, Matthy; Dekker, Andre; Marshall, M Scott

    2015-01-01

    This paper describes the data transformation pipeline defined to support the integration of a new clinical site in a standards-based semantic interoperability environment. The available datasets combined structured and free-text patient data in Dutch, collected in the context of radiation therapy in several cancer types. Our approach aims at both efficiency and data quality. We combine custom-developed scripts, standard tools and manual validation by clinical and knowledge experts. We identified key challenges emerging from the several sources of heterogeneity in our case study (systems, language, data structure, clinical domain) and implemented solutions that we will further generalize for the integration of new sites. We conclude that the required effort for data transformation is manageable which supports the feasibility of our semantic interoperability solution. The achieved semantic interoperability will be leveraged for the deployment and evaluation at the clinical site of applications enabling secondary use of care data for research. This work has been funded by the European Commission through the INTEGRATE (FP7-ICT-2009-6-270253) and EURECA (FP7-ICT-2011-288048) projects. PMID:26306242

  14. A Joint Initiative to Support the Semantic Interoperability within the GIIDA Project

    CERN Document Server

    Plini, Paolo; De Santis, Valentina; Uricchio, Vito F; De Carlo, Dario; D'Arpa, Stefania; De Martino, Monica; Albertoni, Riccardo

    2010-01-01

    The GIIDA project aims to develop a digital infrastructure for spatial information within CNR. It is foreseen to use semantics-oriented technologies to ease information modeling and linking, according to international standards such as ISO/IEC 11179. Complex information management systems like GIIDA will benefit from the use of terminological tools like thesauri, which provide a reference lexicon for the indexing and retrieval of information. Within GIIDA the goal is to make available the EARTh thesaurus (Environmental Applications Reference Thesaurus), developed by the CNR-IIA-EKOLab. A web-based application, developed by the CNR Water Research Institute (IRSA), was implemented to allow consultation and use of the thesaurus through the web. This service is a useful tool for ensuring interoperability between the thesaurus and other indexing systems, with the idea of cooperating to develop a comprehensive system of knowledge organization that could be defined as integrated, open, multi-functi...

  15. A Proof-of-Concept for Semantically Interoperable Federation of IoT Experimentation Facilities

    Science.gov (United States)

    Lanza, Jorge; Sanchez, Luis; Gomez, David; Elsaleh, Tarek; Steinke, Ronald; Cirillo, Flavio

    2016-01-01

    The Internet-of-Things (IoT) is unanimously identified as one of the main pillars of future smart scenarios. The potential of IoT technologies and deployments has been already demonstrated in a number of different application areas, including transport, energy, safety and healthcare. However, despite the growing number of IoT deployments, the majority of IoT applications tend to be self-contained, thereby forming application silos. A lightweight data centric integration and combination of these silos presents several challenges that still need to be addressed. Indeed, the ability to combine and synthesize data streams and services from diverse IoT platforms and testbeds, holds the promise to increase the potentiality of smart applications in terms of size, scope and targeted business context. In this article, a proof-of-concept implementation that federates two different IoT experimentation facilities by means of semantic-based technologies will be described. The specification and design of the implemented system and information models will be described together with the practical details of the developments carried out and its integration with the existing IoT platforms supporting the aforementioned testbeds. Overall, the system described in this paper demonstrates that it is possible to open new horizons in the development of IoT applications and experiments at a global scale, that transcend the (silo) boundaries of individual deployments, based on the semantic interconnection and interoperability of diverse IoT platforms and testbeds. PMID:27367695

  17. A Proof-of-Concept for Semantically Interoperable Federation of IoT Experimentation Facilities

    Directory of Open Access Journals (Sweden)

    Jorge Lanza

    2016-06-01

    Full Text Available The Internet-of-Things (IoT) is unanimously identified as one of the main pillars of future smart scenarios. The potential of IoT technologies and deployments has been already demonstrated in a number of different application areas, including transport, energy, safety and healthcare. However, despite the growing number of IoT deployments, the majority of IoT applications tend to be self-contained, thereby forming application silos. A lightweight data centric integration and combination of these silos presents several challenges that still need to be addressed. Indeed, the ability to combine and synthesize data streams and services from diverse IoT platforms and testbeds, holds the promise to increase the potentiality of smart applications in terms of size, scope and targeted business context. In this article, a proof-of-concept implementation that federates two different IoT experimentation facilities by means of semantic-based technologies will be described. The specification and design of the implemented system and information models will be described together with the practical details of the developments carried out and its integration with the existing IoT platforms supporting the aforementioned testbeds. Overall, the system described in this paper demonstrates that it is possible to open new horizons in the development of IoT applications and experiments at a global scale, that transcend the (silo) boundaries of individual deployments, based on the semantic interconnection and interoperability of diverse IoT platforms and testbeds.

  19. Interoperability

    DEFF Research Database (Denmark)

    Savin, Andrej

    The European Commission recently proposed a General Data Protection Regulation,1 which is meant to replace the EU Data Protection Directive2 and to thoroughly reform and modernize the EU privacy regulatory framework. The Regulation, if adopted, would introduce a number of changes, several of which would considerably alter the current privacy setting.3 First, the current Directive would be replaced with a Regulation, achieving EU-wide harmonization. Second, the scope of the instrument would be widened and the provisions made more precise. Third, the use of consent for data processing would be limited. Fourth, data protection "by design" would be distinguished from data protection "by default". Fifth, new fundamental rights would be introduced and the old ones clarified. Sixth, new rules on controllers' and processors' duties, on supervisory authorities and on sanctions would be introduced...

  20. Proposed Information Sharing Security Approach for Security Personnels, Vertical Integration, Semantic Interoperability Architecture and Framework for Digital Government

    CERN Document Server

    Headayetullah, Md; Biswas, Sanjay; Puthal, B

    2011-01-01

    This paper presents a conceptual overview of vertical integration and semantic interoperability architectures, such as the Educational Sector Architectural Framework (ESAF) for the New Zealand government, as well as different interoperability framework solutions for digital government. In this paper, we develop a secure information sharing approach for digital government to improve homeland security. This approach is a role- and cooperation-based approach for security personnel of different government departments. For any country in the world to run a successful digital government, it is necessary to interact with its citizens and to share secure information among citizens or other governments via different networks. Consequently, in order to enable users to cooperate and share information seamlessly and transparently across different networks and databases universally, a safe and trusted information-sharing environment has been recognized as a very important requirement and t...

  1. Semantic Interoperability for Computational Mineralogy: Experiences of the eMinerals Consortium

    Science.gov (United States)

    Walker, A. M.; White, T. O.; Dove, M. T.; Bruin, R. P.; Couch, P. A.; Tyer, R. P.

    2006-12-01

    The use of atomic scale computer simulation of minerals to obtain information for geophysics and environmental science has grown enormously over the past couple of decades. It is now routine to probe mineral behavior in the Earth's deep interior and in the surface environment by borrowing methods and simulation codes from computational chemistry and physics. It is becoming increasingly important to use methods embodied in more than one of these codes to solve any single scientific problem. However, scientific codes are rarely designed for easy interoperability and data exchange; data formats are often code-specific, poorly documented and fragile, liable to frequent change between software versions, and even compiler versions. This means that the scientist's simple desire to use the methodological approaches offered by multiple codes is frustrated, and even the sharing of data between collaborators becomes fraught with difficulties. The eMinerals consortium was formed in the early stages of the UK eScience program with the aim of developing the tools needed to apply atomic scale simulation to environmental problems in a grid-enabled world, and to harness the computational power offered by grid technologies to address some outstanding mineralogical problems. One example of the kind of problem we can tackle is the origin of the compressibility anomaly in silica glass. By passing data directly between simulation and analysis tools we were able to probe this effect in more detail than has previously been possible and have shown how the anomaly is related to the details of the amorphous structure. In order to approach this kind of problem we have constructed a mini-grid, a small scale and extensible combined compute- and data-grid that allows the execution of many calculations in parallel, and the transparent storage of semantically-rich marked-up result data. Importantly, we automatically capture multiple kinds of metadata and key results from each calculation. We

  2. An Integrated Framework to Achieve Interoperability in Person-Centric Health Management

    Directory of Open Access Journals (Sweden)

    Fabio Vergari

    2011-01-01

    Full Text Available The need for high-quality out-of-hospital healthcare is a known socioeconomic problem. Exploiting ICT's evolution, ad-hoc telemedicine solutions have been proposed in the past. Integrating such ad-hoc solutions in order to cost-effectively support the entire healthcare cycle is still a research challenge. In order to handle the heterogeneity of relevant information and to overcome the fragmentation of out-of-hospital instrumentation in person-centric healthcare systems, a shared and open source interoperability component can be adopted, which is ontology driven and based on the semantic web data model. The feasibility and the advantages of the proposed approach are demonstrated by presenting the use case of real-time monitoring of patients' health and their environmental context.

  3. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran.

    Science.gov (United States)

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E

    2014-01-01

    Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) are essential to making use of immunization CDS systems. Service-oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical across systems. CDS-generated reports were in accordance with immunization guidelines, and the calculated next-visit times were accurate. Interoperability between IISs is rare or nonexistent. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information. PMID:25954452
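
    The next-visit calculation mentioned above amounts to adding a guideline interval to the date of the last administered dose. A minimal sketch follows; the vaccine name and interval table are invented placeholders, not an actual immunization schedule.

```python
from datetime import date, timedelta

# Hypothetical minimum intervals to the next dose, in weeks, keyed by
# (vaccine, number of doses already given). NOT a real clinical schedule.
MIN_INTERVAL_WEEKS = {("DTP", 1): 4, ("DTP", 2): 4, ("DTP", 3): 26}

def next_visit(vaccine: str, doses_given: int, last_dose: date) -> date:
    """Earliest date the next dose of `vaccine` is due after `last_dose`."""
    weeks = MIN_INTERVAL_WEEKS[(vaccine, doses_given)]
    return last_dose + timedelta(weeks=weeks)

due = next_visit("DTP", 1, date(2014, 3, 1))
```

    In the deployed system the dose history feeding this calculation is assembled from multiple IISs over the HL7 v3 exchange, which is why identical retrieved histories across systems matter.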

  6. The Semantic Management of Environmental Resources within the Interoperable Context of the EuroGEOSS: Alignment of GEMET and the GEOSS SBAs

    Science.gov (United States)

    Cialone, Claudia; Stock, Kristin

    2010-05-01

    EuroGEOSS is a European Commission funded project. It aims at improving a scientific understanding of the complex mechanisms which drive changes affecting our planet, identifying and establishing interoperable arrangements between environmental information systems. These systems would be sustained and operated by organizations with a clear mandate and resources and rendered available following the specifications of already existent frameworks such as GEOSS (the Global Earth Observation System of systems)1 and INSPIRE (the Infrastructure for Spatial Information in the European Community)2. The EuroGEOSS project's infrastructure focuses on three thematic areas: forestry, drought and biodiversity. One of the important activities in the project is the retrieval, parsing and harmonization of the large amount of heterogeneous environmental data available at local, regional and global levels between these strategic areas. The challenge is to render it semantically and technically interoperable in a simple way. An initial step in achieving this semantic and technical interoperability involves the selection of appropriate classification schemes (for example, thesauri, ontologies and controlled vocabularies) to describe the resources in the EuroGEOSS framework. These classifications become a crucial part of the interoperable framework scaffolding because they allow data providers to describe their resources and thus support resource discovery, execution and orchestration of varying levels of complexity. However, at present, given the diverse range of environmental thesauri, controlled vocabularies and ontologies and the large number of resources provided by project participants, the selection of appropriate classification schemes involves a number of considerations. First of all, there is the semantic difficulty of selecting classification schemes that contain concepts that are relevant to each thematic area. Secondly, EuroGEOSS is intended to accommodate a number of

  7. Quality measurement of semantic standards

    OpenAIRE

    Folmer, E.J.A.; Oude Luttighuis, P.H.W.M.; Hillegersberg, van, R.

    2010-01-01

    The quality of semantic standards is unaddressed in current research, while there is an explicit need from standards developers. The business importance is evident, since the quality of standards will have an impact on their diffusion and on the interoperability achieved in practice. An instrument to measure the quality of semantic standards is designed to contribute to the knowledge domain and to standards developers, and may ultimately lead to improved interoperability. This instrument is iteratively designed with multiple...

  8. Community-Driven Initiatives to Achieve Interoperability for Ecological and Environmental Data

    Science.gov (United States)

    Madin, J.; Bowers, S.; Jones, M.; Schildhauer, M.

    2007-12-01

    interoperability by describing the semantics of data at the level of observation and measurement (rather than the traditional focus at the level of the data set) and will define the necessary specifications and technologies to facilitate semantic interpretation and integration of observational data for the environmental sciences. As such, this initiative will focus on unifying the various existing approaches for representing and describing observation data (e.g., SEEK's Observation Ontology, CUAHSI's Observation Data Model, NatureServe's Observation Data Standard, to name a few). Products of this initiative will be compatible with existing standards and build upon recent advances in knowledge representation (e.g., W3C's recommended Web Ontology Language, OWL) that have demonstrated practical utility in enhancing scientific communication and data interoperability in other communities (e.g., the genomics community). A community-sanctioned, extensible, and unified model for observational data will support metadata standards such as EML while reducing the "babel" of scientific dialects that currently impede effective data integration, which will in turn provide a strong foundation for enabling cross-disciplinary synthetic research in the ecological and environmental sciences.
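
    Describing semantics at the level of the observation rather than the data set means each measured value carries its entity, characteristic, and unit explicitly. A minimal sketch of such a record follows; the class and field names are illustrative, not the initiative's actual observation model.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Measurement:
    entity: str           # what was observed, e.g. a taxon or a plot
    characteristic: str   # the property that was measured
    value: float
    unit: str

def integrable(a: Measurement, b: Measurement) -> bool:
    """Two measurements are candidates for integration only when they
    describe the same characteristic of the same kind of entity
    (unit conversion can then be handled separately)."""
    return (a.entity, a.characteristic) == (b.entity, b.characteristic)

m1 = Measurement("Quercus alba", "stem height", 12.3, "m")
m2 = Measurement("Quercus alba", "stem height", 40.0, "ft")
m3 = Measurement("Quercus alba", "stem diameter", 0.4, "m")
```

    Data-set-level metadata alone cannot support this test; it is the per-observation semantics that let tools decide automatically which columns from two studies can be merged.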

  9. The Semantic Management of Environmental Resources within the Interoperable Context of the EuroGEOSS: Alignment of GEMET and the GEOSS SBAs

    Science.gov (United States)

    Cialone, Claudia; Stock, Kristin

    2010-05-01

    EuroGEOSS is a European Commission funded project. It aims to improve scientific understanding of the complex mechanisms that drive changes affecting our planet, and to identify and establish interoperable arrangements between environmental information systems. These systems would be sustained and operated by organizations with a clear mandate and resources, and made available following the specifications of existing frameworks such as GEOSS (the Global Earth Observation System of Systems) and INSPIRE (the Infrastructure for Spatial Information in the European Community). The EuroGEOSS project's infrastructure focuses on three thematic areas: forestry, drought and biodiversity. One of the important activities in the project is the retrieval, parsing and harmonization of the large amount of heterogeneous environmental data available at local, regional and global levels across these thematic areas. The challenge is to render it semantically and technically interoperable in a simple way. An initial step in achieving this semantic and technical interoperability involves the selection of appropriate classification schemes (for example, thesauri, ontologies and controlled vocabularies) to describe the resources in the EuroGEOSS framework. These classifications become a crucial part of the interoperable framework scaffolding because they allow data providers to describe their resources and thus support resource discovery, execution and orchestration at varying levels of complexity. However, given the diverse range of environmental thesauri, controlled vocabularies and ontologies and the large number of resources provided by project participants, the selection of appropriate classification schemes involves several considerations. First, there is the semantic difficulty of selecting classification schemes that contain concepts relevant to each thematic area. Second, EuroGEOSS is intended to accommodate a number of

  10. Using architectures for semantic interoperability to create journal clubs for emergency response

    Energy Technology Data Exchange (ETDEWEB)

    Powell, James E [Los Alamos National Laboratory; Collins, Linn M [Los Alamos National Laboratory; Martinez, Mark L B [Los Alamos National Laboratory

    2009-01-01

    In certain types of 'slow burn' emergencies, careful accumulation and evaluation of information can offer a crucial advantage. The SARS outbreak in the first decade of the 21st century was such an event, and ad hoc journal clubs played a critical role in helping scientific and technical responders identify and develop strategies for halting what could have become a dangerous pandemic. This research-in-progress paper describes a process for leveraging emerging semantic web and digital library architectures and standards to (1) create a focused collection of bibliographic metadata, (2) extract semantic information, (3) convert it to the Resource Description Framework/Extensible Markup Language (RDF/XML), and (4) integrate it so that scientific and technical responders can share and explore critical information in the collections.
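Step (3) of the pipeline above, converting extracted bibliographic metadata into RDF/XML, can be sketched in a few lines. The field names and the record URI below are illustrative assumptions (Dublin Core elements are a common choice for bibliographic predicates), not the actual Los Alamos implementation:

```python
# Minimal sketch: serialize one extracted bibliographic record as RDF/XML.
# The URI scheme and field values are hypothetical; Dublin Core supplies
# the predicate vocabulary.
from xml.sax.saxutils import escape

RDF_NS = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
DC_NS = "http://purl.org/dc/elements/1.1/"

def record_to_rdfxml(uri, fields):
    """Render a bibliographic record (dict of DC field -> value) as RDF/XML."""
    props = "\n".join(
        f"    <dc:{name}>{escape(value)}</dc:{name}>" for name, value in fields.items()
    )
    return (
        f'<rdf:RDF xmlns:rdf="{RDF_NS}" xmlns:dc="{DC_NS}">\n'
        f'  <rdf:Description rdf:about="{uri}">\n'
        f"{props}\n"
        f"  </rdf:Description>\n"
        f"</rdf:RDF>"
    )

rdf_doc = record_to_rdfxml(
    "urn:example:sars-record-1",  # hypothetical identifier
    {"title": "SARS coronavirus genome analysis",
     "creator": "Doe, J.",
     "date": "2003"},
)
print(rdf_doc)
```

Once records share a common RDF vocabulary like this, step (4), merging collections from different sources, reduces to a graph union.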

  11. PROPOSED INFORMATION SHARING SECURITY APPROACH FOR SECURITY PERSONNELS, VERTICAL INTEGRATION, SEMANTIC INTEROPERABILITY ARCHITECTURE AND FRAMEWORK FOR DIGITAL GOVERNMENT

    Directory of Open Access Journals (Sweden)

    Md.Headayetullah

    2011-06-01

    This paper gives a conceptual overview of vertical integration and semantic interoperability architecture, including the Educational Sector Architectural Framework (ESAF) for the New Zealand government, and of different interoperability framework solutions for digital government. It develops a secure information sharing approach for digital government intended to improve homeland security: a role- and cooperation-based approach for the security personnel of different government departments. To run a successful digital government, it is necessary to interact with citizens and to share secure information across different networks, with citizens or with other governments. Consequently, to allow users to cooperate and share information seamlessly and transparently across different networks and databases, a safe and trusted information-sharing environment has been recognized as a vital requirement for advancing homeland security. The key motivation behind this research is to build a secure and trusted information-sharing approach for government departments. The paper presents an efficient role- and cooperation-based approach for the secure exchange of confidential and privileged information among security personnel and government departments within national boundaries by means of public key cryptography. The approach makes use of a cryptographic hash function, a public key cryptosystem, and a unique and complex mapping function for securely exchanging secret information. Moreover, it facilitates privacy-preserving information sharing, with possible restrictions based on the rank of the security personnel. The proposed role- and cooperation-based information sharing approach ensures protected and updated information sharing between security personnel and government

  12. Geospatial Semantic Interoperability Based on Ontology

    Institute of Scientific and Technical Information of China (English)

    王艳东; 龚健雅; 吴小凰

    2007-01-01

    In the GIS field, a great variety of information from different domains is involved in solving real-world problems. But spatial information is usually stored in diverse spatial databases and manipulated by different GIS platforms. Semantic heterogeneity arises from the differing interpretations of concepts among the various GIS implementations, creating gaps in obtaining and understanding information for spatial data sharing and use. After a comprehensive review of progress in ontology theory, methodology and application research in the GIS domain, an ontology-based model for spatial information semantic interoperability is put forward.

  13. An HL7-CDA wrapper for facilitating semantic interoperability to rule-based Clinical Decision Support Systems.

    Science.gov (United States)

    Sáez, Carlos; Bresó, Adrián; Vicente, Javier; Robles, Montserrat; García-Gómez, Juan Miguel

    2013-03-01

    The success of Clinical Decision Support Systems (CDSS) greatly depends on their capability of being integrated into Health Information Systems (HIS). Several proposals have been published to date to allow a CDSS to gather patient data from an HIS. Some base the CDSS data input on the HL7 reference model; however, they are tailored to specific CDSS or clinical guideline technologies, or do not focus on standardizing the CDSS's resultant knowledge. We propose a solution for facilitating semantic interoperability for rule-based CDSS, focusing on standardized input and output documents conforming to an HL7-CDA wrapper. We define the HL7-CDA restrictions in an HL7-CDA implementation guide. Patient data and rule inference results are mapped to and from the CDSS by means of a binding method based on an XML binding file. As an independent clinical document, the results of a CDSS can have clinical and legal validity. The proposed solution is being applied in a CDSS that provides patient-specific recommendations for the care management of outpatients with diabetes mellitus. PMID:23199936
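The binding idea described above, a file that maps each CDSS input variable to a location in the clinical document, can be illustrated with a toy example. The element names, codes, and document shape below are invented for illustration and are far simpler than a real HL7-CDA document:

```python
# Sketch of the XML-binding approach: map CDSS input variables to locations
# in a (greatly simplified, hypothetical) CDA-like patient document. The
# element names and observation codes are illustrative, not the HL7-CDA schema.
import xml.etree.ElementTree as ET

CDA_DOC = """
<ClinicalDocument>
  <observation code="GLU"><value unit="mg/dL">142</value></observation>
  <observation code="HBA1C"><value unit="%">7.4</value></observation>
</ClinicalDocument>
"""

# The "binding file", here as a dict: CDSS variable -> path into the document.
BINDING = {
    "glucose": ".//observation[@code='GLU']/value",
    "hba1c": ".//observation[@code='HBA1C']/value",
}

def bind_inputs(doc_xml, binding):
    """Extract the CDSS input vector from a patient document via the binding."""
    root = ET.fromstring(doc_xml)
    return {var: float(root.find(path).text) for var, path in binding.items()}

inputs = bind_inputs(CDA_DOC, BINDING)
print(inputs)  # e.g. {'glucose': 142.0, 'hba1c': 7.4}
```

The same binding can be applied in reverse to write rule inference results back into a standardized output document, which is what gives the CDSS output its independent clinical validity.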

  14. Quality model for semantic IS standards

    NARCIS (Netherlands)

    Folmer, E.J.A.

    2011-01-01

    Semantic IS (Information Systems) standards are essential for achieving interoperability between organizations. However, a recent survey suggests that the full benefits of standards are not achieved, due to quality issues. This paper presents a quality model for semantic IS standards, that should

  15. Research on the Application of Semantic Interoperability in E-Government

    Institute of Scientific and Technical Information of China (English)

    田浩; 段丽君

    2011-01-01

    Semantic interoperability can overcome the influence of heterogeneous information to realize automated and intelligent information processing, and can eliminate information islands in electronic government affairs so as to promote e-government's overall business synergy. This paper analyzes the application status of semantic interoperability in electronic government affairs in China, summarizes problems in the construction of e-government, and puts forward suggestions for constructing an e-government knowledge coordination system based on semantic interoperability. The article provides useful references for Chinese e-government construction.

  16. Achieving control and interoperability through unified model-based systems and software engineering

    Science.gov (United States)

    Rasmussen, Robert; Ingham, Michel; Dvorak, Daniel

    2005-01-01

    Control and interoperation of complex systems is one of the most difficult challenges facing NASA's Exploration Systems Mission Directorate. An integrated but diverse array of vehicles, habitats, and supporting facilities, evolving over the long course of the enterprise, must perform ever more complex tasks while moving steadily away from the sphere of ground support and intervention.

  17. Commonality based interoperability

    Science.gov (United States)

    Moulton, Christine L.; Hepp, Jared J.; Harrell, John

    2016-05-01

    What interoperability is and why the Army wants it between systems is easily understood. Enabling multiple systems to work together and share data across boundaries in a co-operative manner will benefit the warfighter by allowing for easy access to previously hard-to-reach capabilities. How to achieve interoperability is not as easy to understand due to the numerous different approaches that accomplish the goal. Commonality Based Interoperability (CBI) helps establish how to achieve the goal by extending the existing interoperability definition. CBI is not an implementation, nor is it an architecture; it is a definition of interoperability with a foundation of establishing commonality between systems.

  18. Geospatial semantic web

    CERN Document Server

    Zhang, Chuanrong; Li, Weidong

    2015-01-01

    This book covers key issues related to Geospatial Semantic Web, including geospatial web services for spatial data interoperability; geospatial ontology for semantic interoperability; ontology creation, sharing, and integration; querying knowledge and information from heterogeneous data source; interfaces for Geospatial Semantic Web, VGI (Volunteered Geographic Information) and Geospatial Semantic Web; challenges of Geospatial Semantic Web; and development of Geospatial Semantic Web applications. This book also describes state-of-the-art technologies that attempt to solve these problems such

  19. Achieving semantic interoperability in multi-agent systems: A dialogue-based approach

    NARCIS (Netherlands)

    Diggelen, J. van

    2007-01-01

    Software agents sharing the same ontology can exchange their knowledge fluently as their knowledge representations are compatible with respect to the concepts regarded as relevant and with respect to the names given to these concepts. However, in open heterogeneous multi-agent systems, this scenario

  20. A Formal Approach to Protocol Interoperability Testing

    Institute of Scientific and Technical Information of China (English)

    郝瑞兵; 吴建平

    1998-01-01

    Protocol interoperability testing is an important means to ensure the interconnection and interoperation of protocol products. In this paper, we propose a formal approach to protocol interoperability testing based on the operational semantics of Concurrent TTCN. We define Concurrent TTCN's operational semantics using a Labeled Transition System, and describe interoperability test execution and test verdicts based on Concurrent TTCN. This approach is helpful for the formation of a formal interoperability testing theory and the construction of a general interoperability testing system.
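The labeled-transition-system view underlying this kind of formal testing can be illustrated with a toy model: an interoperability test passes if the composed trace of messages exchanged between two endpoints reaches an accepting state. This is a generic LTS sketch with invented states and labels, not the Concurrent TTCN semantics defined in the paper:

```python
# Toy labeled transition system: follow a trace of labels from a start state;
# an undefined (state, label) pair means the trace is rejected.

def run_lts(transitions, start, trace):
    """Follow a trace of labels through an LTS; return the final state or None."""
    state = start
    for label in trace:
        state = transitions.get((state, label))
        if state is None:
            return None  # no such transition: trace not accepted
    return state

# Two protocol endpoints modeled as one composed LTS over message labels
# (hypothetical connection-protocol states).
composed = {
    ("idle", "connect_req"): "connecting",
    ("connecting", "connect_ack"): "open",
    ("open", "data"): "open",
    ("open", "close"): "idle",
}

trace = ["connect_req", "connect_ack", "data", "close"]
verdict = "pass" if run_lts(composed, "idle", trace) == "idle" else "fail"
print(verdict)
```

A trace that attempts to send data before the connection is established (e.g. `["data"]`) has no defined transition and would yield a fail verdict.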

  1. Quality measurement of semantic standards

    NARCIS (Netherlands)

    Folmer, E.J.A.; Oude Luttighuis, P.H.W.M.; Hillegersberg, J. van

    2010-01-01

    The quality of semantic standards is unaddressed in current research, even though standards developers have an explicit need for it. Its business importance is evident, since the quality of a standard affects its diffusion and the interoperability achieved in practice. An instrument to measure the quality of

  2. Tuning Ontology Interoperability

    OpenAIRE

    Giunchiglia, Fausto; Pan, Jeff Z.; Serafini, Luciano

    2005-01-01

    The main contribution of this paper is the notion of ontology space, which allows us to move from an ontology-centric vision to a constellation-centric vision of the Web, where multiple ontologies and their interactions can be explicitly modeled and studied. This, in turn, allows us to study how OWL ontologies can interoperate, and, in particular, to provide two main results. The first is a formalization of the intended semantics of the OWL importing operator as opaque semantics. This result ...

  3. Benchmarking Semantic Web technology

    OpenAIRE

    García-Castro, Raúl

    2008-01-01

    Semantic Web technologies need to interchange ontologies for further use. Due to the heterogeneity in the knowledge representation formalisms of the different existing technologies, interoperability is a problem in the Semantic Web, and the limits of the interoperability of current technologies are as yet unknown. A massive improvement of the interoperability of current Semantic Web technologies, or of any other characteristic of these technologies, requires continuous evaluations that should be de...

  4. Achieving mask order processing automation, interoperability and standardization based on P10

    Science.gov (United States)

    Rodriguez, B.; Filies, O.; Sadran, D.; Tissier, Michel; Albin, D.; Stavroulakis, S.; Voyiatzis, E.

    2007-02-01

    Last year the MUSCLE (Masks through User's Supply Chain: Leadership by Excellence) project was presented; this paper reports its progress. A key process in mask supply chain management is the exchange of technical information for ordering masks. This process is large, complex, company specific and error prone, and leads to longer cycle times and higher costs due to missing or wrong inputs. Its automation and standardization could produce significant benefits. We need to agree on a standard for mandatory and optional parameters, as well as a common way to describe parameters when ordering. A system was created to improve performance in terms of Key Performance Indicators (KPIs) such as cycle time and cost of production. This tool allows us to evaluate and measure the effect of individual factors, as well as the effect of implementing the improvements of the complete project. Next, a benchmark study and a gap analysis were performed. These studies show that standardization is feasible, as there is a large overlap in requirements. We see that the SEMI P10 standard needs enhancements: a format supporting the standard is required, and XML offers the ability to describe P10 in a flexible way. Beyond using XML for P10, the semantics of the mask order should also be addressed. A system design and requirements for a reference implementation of a P10-based management system are presented, covering mechanisms for evolution and version management and a design for P10 editing and data validation.
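The mandatory/optional parameter check at the heart of such order validation is straightforward to sketch. The element names below are invented for illustration; the real field list would come from the SEMI P10 standard itself:

```python
# Sketch of mandatory/optional parameter validation for an XML-encoded mask
# order. The element names are hypothetical placeholders, not SEMI P10 fields.
import xml.etree.ElementTree as ET

MANDATORY = {"mask_id", "technology_node", "delivery_date"}
OPTIONAL = {"pellicle_type", "comments"}

def validate_order(order_xml):
    """Return (missing mandatory elements, unrecognized elements) for an order."""
    present = {child.tag for child in ET.fromstring(order_xml)}
    missing = MANDATORY - present
    unknown = present - MANDATORY - OPTIONAL
    return missing, unknown

order = """
<MaskOrder>
  <mask_id>M-1234</mask_id>
  <technology_node>65nm</technology_node>
</MaskOrder>
"""
missing, unknown = validate_order(order)
print(missing)  # this order lacks a delivery date
```

Catching such gaps at submission time, rather than mid-production, is precisely where the cycle-time and cost savings come from.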

  5. An HL7-CDA wrapper for facilitating semantic interoperability to rule-based Clinical Decision Support Systems

    OpenAIRE

    Sáez Silvestre, Carlos; BRESÓ GUARDADO, ADRIÁN; Vicente Robledo, Javier; Robles Viejo, Montserrat; García Gómez, Juan Miguel

    2013-01-01

    The success of Clinical Decision Support Systems (CDSS) greatly depends on their capability of being integrated into Health Information Systems (HIS). Several proposals have been published to date to allow a CDSS to gather patient data from an HIS. Some base the CDSS data input on the HL7 reference model; however, they are tailored to specific CDSS or clinical guideline technologies, or do not focus on standardizing the CDSS resultant knowledge. We propose a solution for facilitating semantic int...

  6. Towards technical interoperability in telemedicine.

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard Layne, II

    2004-05-01

    For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.

  7. Applying the Levels of Conceptual Interoperability Model in Support of Integratability, Interoperability, and Composability for System-of-Systems Engineering

    Directory of Open Access Journals (Sweden)

    Andreas Tolk

    2007-10-01

    The Levels of Conceptual Interoperability Model (LCIM) was developed to cope with the different layers of interoperation of modeling and simulation applications. It introduced technical, syntactic, semantic, pragmatic, dynamic, and conceptual layers of interoperation and showed how they relate to the ideas of integratability, interoperability, and composability. The model has been successfully applied in various domains of systems, cybernetics, and informatics.

  8. The XML and Semantic Web Worlds: Technologies, Interoperability and Integration. A Survey of the State of the Art

    OpenAIRE

    Bikakis, Nikos; Tsinaraki, Chrisa; Gioldasis, Nektarios; Stavrakantonakis, Ioannis; Christodoulakis, Stavros

    2016-01-01

    In the context of the emergent Web of Data, a large number of organizations, institutes and companies (e.g., DBpedia, Geonames, PubMed, ACM, IEEE, NASA, BBC) adopt the Linked Data practices and publish their data utilizing Semantic Web (SW) technologies. On the other hand, the dominant standard for information exchange on the Web today is XML. Many international standards (e.g., Dublin Core, MPEG-7, METS, TEI, IEEE LOM) have been expressed in XML Schema, resulting in a large number of XML datas...

  9. Towards a Common Platform to Support Business Processes, Services and Semantics

    Science.gov (United States)

    Piprani, Baba

    The search for the Holy Grail of interoperability of business processes, services and semantics continues with every new candidate silver bullet. Most approaches toward interoperability focus narrowly on the simplistic notion of using technology to support cowboy-style development, without much regard for metadata or semantics. At the same time, the distortions of semantics created by many current modeling paradigms and approaches, including the disharmony created by a multiplicity of parallel approaches to standardization, are not helping us resolve the real issues facing knowledge and semantics management. This paper addresses some of the issues facing us: What have we achieved? Where did we go wrong? What are we doing right? It provides a candid, encapsulated snapshot of an approach to harmonizing interoperability efforts, and proposes a common platform to support Business Processes, Services and Semantics.

  10. Data interchange standards in healthcare IT--computable semantic interoperability: now possible but still difficult, do we really need a better mousetrap?

    Science.gov (United States)

    Mead, Charles N

    2006-01-01

    The following article on HL7 Version 3 will give readers a glimpse into the significant differences between "what came before"--that is, HL7 Version 2.x--and "what today and the future will bring," which is the HL7 Version 3 family of data interchange specifications. The difference between V2.x and V3 is significant, and it exists because the various stakeholders in the HL7 development process believe that the increased depth, breadth, and, to some degree, complexity that characterize V3 are necessary to solve many of today's and tomorrow's increasingly wide, deep and complex healthcare information data interchange requirements. Like many healthcare or technology discussions, this discussion has its own vocabulary of somewhat obscure, but not difficult, terms. This article will define the minimum set necessary for readers to appreciate the relevance and capabilities of HL7 Version 3, including how it differs from HL7 Version 2. It will then briefly review the primary motivations for HL7 Version 3 in the presence of the unequivocal success of Version 2. In this context, the article will give readers an overview of one of the prime constructs of Version 3, the Reference Information Model (RIM). There are "four pillars that are necessary but not sufficient to obtain computable semantic interoperability." These four pillars--a cross-domain information model; a robust data type specification; a methodology for separating domain-specific terms from, as well as binding them to, the common model; and a top-down interchange specification methodology and tools for using 1, 2, and 3 to define Version 3 specifications--collectively comprise the "HL7 Version 3 Toolkit." Further, this article will present a list of questions and answers to help readers assess the scope and complexity of the problems facing healthcare IT today, which will further enlighten readers on the "reality" of HL7 Version 3. The article will conclude with a "pseudo

  11. Semantic Registration and Discovery System of Subsystems and Services within an Interoperable Coordination Platform in Smart Cities.

    Science.gov (United States)

    Rubio, Gregorio; Martínez, José Fernán; Gómez, David; Li, Xin

    2016-06-24

    Smart subsystems like traffic, Smart Homes, the Smart Grid, outdoor lighting, etc. are built in many urban areas, each with a set of services that are offered to citizens. These subsystems are managed by self-contained embedded systems. However, coordination and cooperation between them are scarce. An integration of these systems which truly represents a "system of systems" could introduce more benefits, such as allowing the development of new applications and collective optimization. The integration should allow maximum reusability of available services provided by entities (e.g., sensors or Wireless Sensor Networks). Thus, it is of major importance to facilitate the discovery and registration of available services and subsystems in an integrated way. Therefore, an ontology-based and automatic system for subsystem and service registration and discovery is presented. Using this proposed system, heterogeneous subsystems and services could be registered and discovered in a dynamic manner with additional semantic annotations. In this way, users are able to build customized applications across different subsystems by using available services. The proposed system has been fully implemented and a case study is presented to show the usefulness of the proposed method.
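The core of such ontology-based discovery, matching a query concept against services annotated with concepts from a shared hierarchy, can be sketched without a full OWL reasoner. The concept names and services below are hypothetical stand-ins for the paper's actual ontology:

```python
# Stand-in sketch for semantic registration and discovery: services carry
# concept annotations, and discovery expands the query concept through a
# tiny, hypothetical subclass hierarchy instead of a real OWL reasoner.
SUBCLASS_OF = {  # child concept -> parent concept
    "TrafficSensor": "Sensor",
    "AirQualitySensor": "Sensor",
    "Sensor": "Device",
}

def descendants(concept):
    """All concepts that are (transitively) subclasses of `concept`, plus itself."""
    result = {concept}
    changed = True
    while changed:
        changed = False
        for child, parent in SUBCLASS_OF.items():
            if parent in result and child not in result:
                result.add(child)
                changed = True
    return result

registry = {}  # service name -> set of annotating concepts

def register(name, concepts):
    registry[name] = set(concepts)

def discover(concept):
    """Find services whose annotations fall under the queried concept."""
    wanted = descendants(concept)
    return sorted(name for name, ann in registry.items() if ann & wanted)

register("city-traffic-feed", {"TrafficSensor"})
register("air-monitor", {"AirQualitySensor"})
register("billing", {"Service"})

print(discover("Sensor"))  # finds both sensor services, not billing
```

A query for "Sensor" matches services annotated only with its subclasses, which is exactly what lets applications discover services across heterogeneous subsystems without knowing their exact types in advance.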

  12. Semantic Registration and Discovery System of Subsystems and Services within an Interoperable Coordination Platform in Smart Cities

    Directory of Open Access Journals (Sweden)

    Gregorio Rubio

    2016-06-01

    Smart subsystems like traffic, Smart Homes, the Smart Grid, outdoor lighting, etc. are built in many urban areas, each with a set of services that are offered to citizens. These subsystems are managed by self-contained embedded systems. However, coordination and cooperation between them are scarce. An integration of these systems which truly represents a “system of systems” could introduce more benefits, such as allowing the development of new applications and collective optimization. The integration should allow maximum reusability of available services provided by entities (e.g., sensors or Wireless Sensor Networks). Thus, it is of major importance to facilitate the discovery and registration of available services and subsystems in an integrated way. Therefore, an ontology-based and automatic system for subsystem and service registration and discovery is presented. Using this proposed system, heterogeneous subsystems and services could be registered and discovered in a dynamic manner with additional semantic annotations. In this way, users are able to build customized applications across different subsystems by using available services. The proposed system has been fully implemented and a case study is presented to show the usefulness of the proposed method.

  13. Semantic Registration and Discovery System of Subsystems and Services within an Interoperable Coordination Platform in Smart Cities

    Science.gov (United States)

    Rubio, Gregorio; Martínez, José Fernán; Gómez, David; Li, Xin

    2016-01-01

    Smart subsystems like traffic, Smart Homes, the Smart Grid, outdoor lighting, etc. are built in many urban areas, each with a set of services that are offered to citizens. These subsystems are managed by self-contained embedded systems. However, coordination and cooperation between them are scarce. An integration of these systems which truly represents a “system of systems” could introduce more benefits, such as allowing the development of new applications and collective optimization. The integration should allow maximum reusability of available services provided by entities (e.g., sensors or Wireless Sensor Networks). Thus, it is of major importance to facilitate the discovery and registration of available services and subsystems in an integrated way. Therefore, an ontology-based and automatic system for subsystem and service registration and discovery is presented. Using this proposed system, heterogeneous subsystems and services could be registered and discovered in a dynamic manner with additional semantic annotations. In this way, users are able to build customized applications across different subsystems by using available services. The proposed system has been fully implemented and a case study is presented to show the usefulness of the proposed method. PMID:27347965

  14. Semantic Registration and Discovery System of Subsystems and Services within an Interoperable Coordination Platform in Smart Cities.

    Science.gov (United States)

    Rubio, Gregorio; Martínez, José Fernán; Gómez, David; Li, Xin

    2016-01-01

    Smart subsystems like traffic, Smart Homes, the Smart Grid, outdoor lighting, etc. are built in many urban areas, each with a set of services that are offered to citizens. These subsystems are managed by self-contained embedded systems. However, coordination and cooperation between them are scarce. An integration of these systems which truly represents a "system of systems" could introduce more benefits, such as allowing the development of new applications and collective optimization. The integration should allow maximum reusability of available services provided by entities (e.g., sensors or Wireless Sensor Networks). Thus, it is of major importance to facilitate the discovery and registration of available services and subsystems in an integrated way. Therefore, an ontology-based and automatic system for subsystem and service registration and discovery is presented. Using this proposed system, heterogeneous subsystems and services could be registered and discovered in a dynamic manner with additional semantic annotations. In this way, users are able to build customized applications across different subsystems by using available services. The proposed system has been fully implemented and a case study is presented to show the usefulness of the proposed method. PMID:27347965

  15. The Fractal Nature of the Semantic Web

    OpenAIRE

    Berners-Lee, Tim; Massachusetts Institute of Technology; Kagal, Lalana; Massachusetts Institute of Technology

    2008-01-01

    In the past, many knowledge representation systems failed because they were too monolithic and didn’t scale well, whereas other systems failed to have an impact because they were small and isolated. Along with this trade-off in size, there is also a constant tension between the cost involved in building a larger community that can interoperate through common terms and the cost of the lack of interoperability. The semantic web offers a good compromise between these approaches as it achieves wi...

  16. Basic semantic architecture of interoperability for the intelligent distribution in the CFE electrical system; Arquitectura base de interoperabilidad semantica para el sistema electrico de distribucion inteligente en la CFE

    Energy Technology Data Exchange (ETDEWEB)

    Espinosa Reza, Alfredo; Garcia Mendoza, Raul; Borja Diaz, Jesus Fidel; Sierra Rodriguez, Benjamin [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2010-07-01

    The physical and logical architecture of the interoperability platform defined for the distribution management systems (DMS) of the Distribution Subdivision of the Comision Federal de Electricidad (CFE) in Mexico is presented. The adopted architecture includes the definition of a technological platform to manage the exchange of information between systems and applications, based on the Common Information Model (CIM) established in the IEC 61968 and IEC 61970 standards. The architecture, built on SSOA (Semantic Services Oriented Architecture), an EIB (Enterprise Integration Bus) and GID (Generic Interface Definition), is presented, along with the sequence for achieving interoperability among systems related to the management of electrical energy distribution in Mexico. The paper likewise describes the process of establishing a semantic model of the electrical distribution system (SED) and the creation of CIM/XML instances, oriented toward the interoperability of information systems in the DMS scope, by means of message exchanges that conform to and are validated against the structure established by the CIM. In this way, the messages and information exchanged among systems are guaranteed to be compatible and correctly interpreted independently of the developer, brand or manufacturer of the source and destination systems. The primary objective is to establish the standards-based semantic infrastructure for interoperability that underpins the strategic definition of an Intelligent Distribution Electrical System (SEDI) in Mexico.

  17. Knowledge Organization Tools and the Method System of Their Semantic Interoperability

    Institute of Scientific and Technical Information of China (English)

    王景侠

    2013-01-01

      Under the Web environment, the tools of knowledge organization are mainly various kinds of knowledge organization systems (KOS), and interoperability among KOS has become a hot problem in both research and application. Based on an analysis of knowledge organization tools, the paper mainly discusses methods of semantic interoperability between traditional knowledge organization tools and modern knowledge organization tools represented by ontologies, in order to provide a reference for libraries and other information agencies in digital resource integration, resource sharing and knowledge services.

  18. Lemnos Interoperable Security Program

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, John [Tennessee Valley Authority, Knoxville, TN (United States); Halbgewachs, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chavez, Adrian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Rhett [Schweitzer Engineering Laboratories, Chattanooga, TN (United States); Teumim, David [Teumim Technical, Allentown, PA (United States)

    2012-01-31

    The manner in which control systems are designed and operated in the energy sector is undergoing some of the most significant changes in its history, due to the evolution of technology and the increasing number of interconnections to other systems. With these changes, however, come two significant challenges that the energy sector must face: 1) cyber security is more important than ever before, and 2) cyber security is more complicated than ever before. A key requirement in helping utilities and vendors alike meet these challenges is interoperability. While interoperability has been present in much of the discussion of technology utilized within the energy sector, and especially the Smart Grid, it has been absent in the context of cyber security. The Lemnos project addresses these challenges by focusing on the interoperability of devices utilized within utility control systems which support critical cyber security functions. In theory, interoperability is possible with many of the cyber security solutions available to utilities today. The reality is that the effort required to achieve cyber security interoperability is often a barrier for utilities. For example, consider IPSec, a widely used Internet protocol suite for defining Virtual Private Networks, or "tunnels", to communicate securely through untrusted public and private networks. The IPSec protocol suite has a significant number of configuration options and encryption parameters to choose from, which must be agreed upon and adopted by both parties establishing the tunnel. The exercise of getting software or devices from different vendors to interoperate is labor intensive and requires a significant amount of security expertise from the end user. Scale this effort to a significant number of devices operating over a large geographical area and the challenge becomes so overwhelming that it often leads utilities to pursue solutions from a single vendor. 
These single vendor solutions may inadvertently lock
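    The configuration-agreement problem described above can be illustrated with a toy sketch: each endpoint advertises the transform sets it supports, and a tunnel is only possible on their intersection. The vendor parameter sets here are hypothetical, and real IKE/IPsec negotiation is performed by the key-exchange daemon, not by application code like this.

```python
# Hypothetical transform sets (cipher, integrity, DH group) per vendor device.
vendor_a = {("aes256", "sha256", "modp2048"), ("aes128", "sha1", "modp1536")}
vendor_b = {("aes256", "sha256", "modp2048"), ("3des", "md5", "modp1024")}

def common_proposals(a: set, b: set) -> list:
    """Transform sets both tunnel endpoints support; empty means no tunnel."""
    return sorted(a & b)
```

    With more devices and vendors, the intersection shrinks quickly, which is exactly the pressure toward single-vendor deployments the project describes.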

  19. A Semantically Automated Protocol Adapter for Mapping SOAP Web Services to RESTful HTTP Format to Enable the Web Infrastructure, Enhance Web Service Interoperability and Ease Web Service Migration

    Directory of Open Access Journals (Sweden)

    Frank Doheny

    2012-04-01

    Full Text Available Semantic Web Services (SWS) are Web Service (WS) descriptions augmented with semantic information. SWS enable intelligent reasoning and automation in areas such as service discovery, composition, mediation, ranking and invocation. This paper applies SWS to a previous protocol adapter which, operating within clearly defined constraints, maps SOAP Web Services to RESTful HTTP format. In the previous adapter, however, the configuration element is manual and the latency tests were run only in a local setting. This paper applies SWS technologies to automate the configuration element, and the latency tests are conducted in a more realistic Internet-based setting.
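    A protocol adapter of this kind, at its core, maps SOAP operation names onto REST verbs and resource paths. The sketch below shows one naive such mapping; the prefix conventions and the fallback rule are assumptions for illustration, not the adapter's actual configuration.

```python
def soap_to_rest(operation: str, resource: str) -> tuple:
    """Map a SOAP operation name to a RESTful (method, path) pair."""
    prefixes = {"get": "GET", "list": "GET", "create": "POST",
                "update": "PUT", "delete": "DELETE"}
    for prefix, method in prefixes.items():
        if operation.lower().startswith(prefix):
            return method, f"/{resource}"
    # Operations with no obvious CRUD prefix fall back to an RPC-style POST.
    return "POST", f"/{resource}/{operation}"
```

    For example, soap_to_rest("GetCustomer", "customers") yields ("GET", "/customers"). The semantic automation discussed in the paper would replace such hard-coded prefix rules with reasoning over the service descriptions.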

  20. Benchmarking semantic web technology

    CERN Document Server

    García-Castro, R

    2009-01-01

    This book addresses the problem of benchmarking Semantic Web Technologies; first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web Technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other. The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:

  1. Semantic Web

    OpenAIRE

    Anna Lamandini

    2011-01-01

    The Semantic Web is a technology in the service of knowledge, aimed at accessibility and the sharing of content, and at facilitating interoperability between different systems; as such, it is one of the nine key technological pillars of ICT (technologies for information and communication) within the third theme, the Specific Programme "Cooperation", of the Seventh Framework Programme for research and development (7°PQRS, 2007-2013). As a system it seeks to overcome overload or excess of irrelevant i...

  2. Semantic based P2P System for local e-Government

    OpenAIRE

    Ortiz-Rodriguez, F.; Palma, R.; Villazón-Terrazas, B.

    2006-01-01

    The Electronic Government is an emerging field of applications for the Semantic Web where ontologies are becoming an important research technology. E-Government faces considerable challenges in achieving interoperability, given the semantic differences of interpretation, complexity and width of scope. This paper addresses the importance of providing an infrastructure capable of dealing with issues such as: communications between public administrations across government and retrieval of offici...

  3. HeartDrive: A Broader Concept of Interoperability to Implement Care Processes for Heart Failure.

    Science.gov (United States)

    Lettere, M; Guerri, D; La Manna, S; Groccia, M C; Lofaro, D; Conforti, D

    2016-01-01

    This paper originates from the HeartDrive project, a platform of services for a more effective, efficient and integrated management of heart failure and comorbidities. HeartDrive establishes a cooperative approach based on the concepts of continuity of care and extreme, patient-oriented customization of diagnostic, therapeutic and follow-up procedures. Definition and development of evidence-based processes, migration from parceled and episode-based healthcare provisioning to a workflow-oriented model, and increased awareness and responsibility of citizens towards their own health and wellness are key objectives of HeartDrive. In two scenarios, for rehabilitation and home monitoring, we show how the results are achieved by providing a solution that highlights a broader concept of cooperation that goes beyond technical interoperability towards semantic interoperability, explicitly sharing process definitions, decision support strategies and information semantics. PMID:27225572

  4. SEMANTIC INTEGRATION FOR AUTOMATIC ONTOLOGY MAPPING

    Directory of Open Access Journals (Sweden)

    Siham AMROUCH

    2013-11-01

    Full Text Available In the last decade, ontologies have played a key role as a technology for information sharing and agent interoperability in different application domains. In the Semantic Web domain, ontologies are used to face the great challenge of representing the semantics of data, in order to bring the Web to its full power and hence achieve its objective. However, using ontologies as common and shared vocabularies requires a certain degree of interoperability between them. To meet this requirement, ontology mapping is a solution that cannot be avoided. Indeed, ontology mapping builds a meta-layer that allows different applications and information systems to access and share their information, after resolving the different forms of syntactic, semantic and lexical mismatch. In the contribution presented in this paper, we have integrated the semantic aspect, based on an external lexical resource, WordNet, to design a new algorithm for fully automatic ontology mapping. This fully automatic character is the main difference between our contribution and most existing semi-automatic ontology mapping algorithms, such as Chimaera, Prompt, Onion and Glue. To further enhance the performance of our algorithm, the mapping discovery stage is based on the combination of two sub-modules: the former analyzes the concepts' names and the latter analyzes their properties. Each of these two sub-modules is itself based on a combination of lexical and semantic similarity measures.
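    The combination of lexical and semantic similarity measures described above can be sketched as follows. A tiny synonym table stands in for WordNet, and the weight, the measures and the concept names are illustrative assumptions, not the algorithm's actual parameters.

```python
from difflib import SequenceMatcher

# Toy synonym sets standing in for WordNet synsets (an assumption for the
# sketch; the paper computes the semantic measure over WordNet itself).
SYNSETS = [{"car", "automobile"}, {"person", "human", "individual"}]

def lexical_sim(a: str, b: str) -> float:
    """String-level similarity of two concept names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def semantic_sim(a: str, b: str) -> float:
    """1.0 when the names are equal or share a synonym set, else 0.0."""
    a, b = a.lower(), b.lower()
    if a == b:
        return 1.0
    return 1.0 if any(a in s and b in s for s in SYNSETS) else 0.0

def concept_sim(a: str, b: str, w: float = 0.5) -> float:
    """Weighted combination of the lexical and semantic measures."""
    return w * lexical_sim(a, b) + (1 - w) * semantic_sim(a, b)
```

    The semantic term is what lets "car" and "automobile" score highly despite sharing almost no characters, which a purely lexical measure would miss.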

  5. Towards Interoperability for Public Health Surveillance: Experiences from Two States

    OpenAIRE

    Dixon, Brian E.; Siegel, Jason A.; Oemig, Tanya V.; Grannis, Shaun J

    2013-01-01

    Objective To characterize the use of standardized vocabularies in real-world electronic laboratory reporting (ELR) messages sent to public health agencies for surveillance. Introduction The use of health information systems to electronically deliver clinical data necessary for notifiable disease surveillance is growing. For health information systems to be effective at improving population surveillance functions, semantic interoperability is necessary. Semantic interoperability is “the abilit...

  6. Turning Interoperability Operational with GST

    Science.gov (United States)

    Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha

    2013-04-01

    GST - Geosciences in space and time is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originates from TUBAF's contribution to the EU project "ProMine", and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides basic components of a geodata infrastructure as required to establish interoperability with respect to geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects, cf. Interoperability Solutions for European Public Administrations (ISA), cf. http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European geological surveys acting in a border region, means in particular the provision of IT technology to exchange spatially, and possibly also temporally, indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels capturing the geometry, topology, and various geoscience contents. Geodata infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE, to name just the most prominent ones. Then there are quite a few markup languages (ML) related to geographical or geological information, like GeoSciML, EarthResourceML, BoreholeML, and ResqML for reservoir characterization, earth and reservoir models, and many others featuring geoscience information. Several web services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more. 
It will be clarified how GST is related to these initiatives, especially

  7. An Interoperability Infrastructure for Developing Multidatabase Systems

    OpenAIRE

    Doğaç, Asuman; Özhan, Gökhan; Kılıç, Ebru; Özcan, Fatma; Nural, Sena; Sema

    1998-01-01

    A multidatabase system (MDBS) allows the users to simultaneously access autonomous, heterogeneous databases using a single data model and a query language. This provides for achieving interoperability among heterogeneous, federated DBMSs. In this paper, we describe the interoperability infrastructure of a multidatabase system, namely METU Interoperable DBMS (MIND). The architecture of MIND is based on OMG distributed object management model. It is implemented on top of a CORBA compl...

  8. SOF and conventional force interoperability through SOF reconfiguration

    OpenAIRE

    McHale, Edward J.

    1996-01-01

    The goal of this thesis was to determine which environmental variables affected past SOF attempts at achieving interoperability with the conventional military, to examine the status of SOF and conventional forces interoperability as it exists today, and to explain why now is the time for SOF to engage in the reconfiguration of its forces to achieve an optimal level of interoperability. Five variables were used in the examination of SOF's organizational evolution toward interoperability with conven...

  9. The GEOSS solution for enabling data interoperability and integrative research.

    Science.gov (United States)

    Nativi, Stefano; Mazzetti, Paolo; Craglia, Max; Pirrone, Nicola

    2014-03-01

    Global sustainability research requires an integrative research effort underpinned by digital infrastructures (systems) able to harness data and heterogeneous information across disciplines. Digital data and information sharing across systems and applications is achieved by implementing interoperability: a property of a product or system to work with other products or systems, present or future. There are at least three main interoperability challenges a digital infrastructure must address: technological, semantic, and organizational. In recent years, important international programs and initiatives have focused on this ambitious objective. This manuscript presents and combines the studies and experiences carried out by three relevant projects focusing on the heavy metal domain: the Global Mercury Observation System, the Global Earth Observation System of Systems (GEOSS), and INSPIRE. This research work identified a valuable interoperability service bus (i.e., a set of standard models, interfaces, and good practices) proposed to characterize the integrative research cyber-infrastructure of the heavy metal research community. In the paper, the GEOSS common infrastructure is discussed as implementing a multidisciplinary and participatory research infrastructure, and a possible roadmap is introduced for the heavy metal pollution research community to join GEOSS as a new Group on Earth Observations community of practice and develop a research infrastructure for carrying out integrative research in its specific domain. PMID:24243262

  10. A Survey on Interoperability in the Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Bahman Rashidi

    2013-07-01

    Full Text Available In recent years, Cloud Computing has been one of the top ten new technologies, providing various services such as software, platform and infrastructure for Internet users. Cloud Computing is a promising IT paradigm which enables the evolution of the Internet into a global market of collaborating services. In order to provide better services for cloud customers, cloud providers need services that cooperate with other services. Therefore, semantic interoperability plays a key role in Cloud Computing services. In this paper, we address interoperability issues in Cloud Computing environments. After a description of Cloud Computing interoperability from different aspects and references, we describe two architectures of cloud service interoperability. We then classify existing interoperability challenges architecturally and describe them. Moreover, we use these aspects to discuss and compare several interoperability approaches.

  11. A Framework of Semantic Information Representation in Distributed Environments

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    An information representation framework is designed in this paper to overcome the problem of semantic heterogeneity in distributed environments. Emphasis is placed on establishing an XML-oriented semantic data model and the mapping between XML data based on a global ontology semantic view. The framework is implemented as a Web Service, which enhances information processing efficiency and accuracy as well as semantic interoperability.
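    The mapping of heterogeneous XML data onto a global ontology view can be sketched as below. The tag names, source identifiers and ontology terms are hypothetical; a real implementation would derive the mappings from the ontology rather than hard-code them.

```python
import xml.etree.ElementTree as ET

# Hypothetical local-tag -> global-ontology-term mappings for two sources.
MAPPINGS = {
    "source_a": {"empName": "hasName", "empId": "identifier"},
    "source_b": {"name": "hasName", "staffNo": "identifier"},
}

def to_global_view(xml_text: str, source: str) -> dict:
    """Lift a source-specific XML fragment onto the shared ontology terms."""
    root = ET.fromstring(xml_text)
    mapping = MAPPINGS[source]
    return {mapping[child.tag]: child.text
            for child in root if child.tag in mapping}
```

    Two structurally different fragments then yield the same ontology-level record, which is what makes the mapped data semantically interoperable.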

  12. Combining Archetypes with Fast Health Interoperability Resources in Future-proof Health Information Systems.

    Science.gov (United States)

    Bosca, Diego; Moner, David; Maldonado, Jose Alberto; Robles, Montserrat

    2015-01-01

    Messaging standards, and specifically HL7 v2, are heavily used for the communication and interoperability of Health Information Systems. HL7 FHIR was created as an evolution of the messaging standards to achieve semantic interoperability. FHIR is somewhat similar to other approaches, like the dual-model methodology, as both are based on the precise modeling of clinical information. In this paper, we demonstrate how the dual-model methodology can be applied to standards like FHIR. We show the usefulness of this approach for data transformation between FHIR and other specifications such as HL7 CDA, EN ISO 13606, and openEHR. We also discuss the advantages and disadvantages of defining archetypes over FHIR, and the consequences and outcomes of this approach. Finally, we exemplify this approach by creating a testing data server that supports both FHIR resources and archetypes. PMID:25991126
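    The kind of transformation discussed above, from an archetype-constrained record to a FHIR resource, can be caricatured as a structural mapping. The flat input field names below are invented for illustration; only the FHIR Patient element names (resourceType, name, birthDate) follow the published FHIR resource.

```python
def archetype_to_fhir_patient(record: dict) -> dict:
    """Map a flat archetype-style demographic record to a FHIR-like Patient.

    The input keys are hypothetical, not real archetype paths.
    """
    return {
        "resourceType": "Patient",
        "name": [{"family": record["family_name"],
                  "given": [record["given_name"]]}],
        "birthDate": record["date_of_birth"],
    }
```

    In the dual-model approach, such mappings are not hand-written per resource but derived from archetype definitions, which is what makes them maintainable across standards.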

  13. Matchmaking Semantic Based for Information System Interoperability

    CERN Document Server

    Wicaksana, I Wayan Simri

    2011-01-01

    Unlike the traditional model of information pull, matchmaking is based on a cooperative partnership between information providers and consumers, assisted by an intelligent facilitator (the matchmaker). According to some experiments, matchmaking proves most useful in two different ways: locating information sources or services that appear dynamically, and notification of information changes. Effective information and service sharing in distributed, e.g. P2P-based, environments raises many challenges, including discovery and localization of resources, exchange over heterogeneous sources, and query processing. One traditional approach for dealing with some of the above challenges is to create unified integrated schemas or services to combine the heterogeneous sources. This approach does not scale well when applied in dynamic distributed environments and has many drawbacks related to the large number of sources. The main issues in matchmaking are how to represent advertisements and requests, and how to calculate poss...
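    A minimal matchmaker in the sense used above keeps a registry of provider advertisements and returns the providers whose advertised capabilities cover a request. The service names and capability terms below are made up for the sketch; the representation question the abstract raises (how to encode advertisements and requests) is exactly what this toy version glosses over.

```python
# Provider advertisements: service name -> set of offered capability terms.
ADS = {
    "weather-svc": {"weather", "forecast"},
    "geo-svc": {"map", "route"},
}

def matchmake(required: set) -> list:
    """Providers whose advertisement covers every required capability."""
    return [name for name, offered in ADS.items() if required <= offered]
```

    A semantic matchmaker would additionally expand the request terms through an ontology, so that a request for "weather" could also match a provider advertising "meteorology".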

  14. Semantic interoperability for collaborative spatial design

    NARCIS (Netherlands)

    Hofman, W.

    2009-01-01

    Mobile devices offer integrated functionality to browse, phone, play music, and watch video. Moreover, these devices have sufficient memory and processing power to run (small) applications based on for instance Google Android and the iPhone/iPod OS. As such, they support for instance Google Earth to

  15. A Semantics-Based Approach for Achieving Self Fault-Tolerance of Protocols

    Institute of Scientific and Technical Information of China (English)

    李腊元; 李春林

    2000-01-01

    The cooperation of different processes may be lost by mistake when a protocol is executed. The protocol cannot operate normally under this condition. In this paper, the self fault-tolerance of protocols is discussed, and a semantics-based approach for achieving self fault-tolerance of protocols is presented. Some main characteristics of the self fault-tolerance of protocols concerning liveness, nontermination and infinity are also presented. Meanwhile, the sufficient and necessary conditions for achieving self fault-tolerance of protocols are given. Finally, a typical protocol that does not satisfy self fault-tolerance is investigated, and a new redesigned version of this existing protocol using the proposed approach is given.

  16. Towards a contract-based interoperation model

    OpenAIRE

    Fernández Peña, Félix Oscar; Willmott, Steven Nicolás

    2007-01-01

    Web Services-based solutions for interoperating processes are considered to be one of the most promising technologies for achieving truly interoperable functioning in open environments. In the last three years, the specification of agreements between resource/service providers and consumers in particular, as well as protocols for their negotiation, have been proposed as a possible solution for managing the resulting computing systems. In this report, the state of the art in the area of contr...

  17. Intelligent interoperable application for employment exchange system using ontology

    Directory of Open Access Journals (Sweden)

    Kavidha Ayechetty

    2013-12-01

    Full Text Available Semantic web technologies have the potential to simplify heterogeneous data integration using explicit semantics. The paper proposes a framework for building an intelligent interoperable application for an employment exchange system through collaboration among distributed heterogeneous data models using semantic web technologies. The objective of developing the application with semantic technologies is to provide better inference for queries against a dynamic collection of information in the collaborating data models. The employment exchange system provides an interface for users to register their details, thereby managing the knowledge base dynamically. The semantic server transforms queries from the employer and the jobseeker semantically for possible integration of the two heterogeneous data models, to drive intelligent inference. The semantic agent reconciles the syntactic and semantic conflicts that exist among the contributing ontologies at different granularity levels, performs automatic integration of the two source ontologies, and gives a better response to the user. The benefits of building an interoperable application using the semantic web are data sharing, knowledge reuse, better query responses, independent maintenance of the models, and extensibility of the application with extra features.
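    The reconciliation step performed by the semantic agent can be sketched as term normalization onto a shared vocabulary before matching. All schema and field names below are hypothetical, and a real system would reason over the ontologies rather than over hard-coded dictionaries.

```python
# Hypothetical term mappings from each local schema to shared ontology terms.
EMPLOYER_TO_SHARED = {"post": "job_title", "qualif": "skill"}
SEEKER_TO_SHARED = {"designation": "job_title", "expertise": "skill"}

def normalize(record: dict, mapping: dict) -> dict:
    """Rename local keys to their shared-ontology equivalents."""
    return {mapping.get(k, k): v for k, v in record.items()}

def matches(vacancy: dict, candidate: dict) -> bool:
    """Match a vacancy and a candidate after both are normalized."""
    v = normalize(vacancy, EMPLOYER_TO_SHARED)
    c = normalize(candidate, SEEKER_TO_SHARED)
    return v["job_title"] == c["job_title"] and v["skill"] == c["skill"]
```

    Because both sides are lifted onto the same terms before comparison, neither the employer model nor the jobseeker model needs to know the other's vocabulary.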

  18. Governance of Interoperability in Intergovernmental Services - Towards an Empirical Taxonomy

    Directory of Open Access Journals (Sweden)

    Herbert Kubicek

    2008-12-01

    Full Text Available High-quality and convenient online delivery of governmental services often requires the seamless exchange of data between two or more government agencies. Smooth data exchange, in turn, requires interoperability of the databases and workflows in the agencies involved. Interoperability (IOP) is a complex issue covering purely technical aspects such as transmission protocols and data exchange formats, but also content-related semantic aspects such as identifiers and the meaning of codes, as well as organizational, contractual or legal issues. Starting from IOP frameworks which provide classifications of what has to be standardized, this paper, based on an ongoing research project, adopts a political and managerial view and tries to clarify the governance of achieving IOP, i.e. where and by whom IOP standards are developed and established and how they are put into operation. By analyzing 32 cases of successful implementation of IOP in e-government services within the European Union, empirical indicators for different aspects of governance are proposed and applied to develop an empirical taxonomy of different types of IOP governance, which can be used for future comparative research regarding success factors, barriers etc.

  19. Rationale and design considerations for a semantic mediator in health information systems.

    Science.gov (United States)

    Degoulet, P; Sauquet, D; Jaulent, M C; Zapletal, E; Lavril, M

    1998-11-01

    Rapid development of community health information networks raises the issue of semantic interoperability between distributed and heterogeneous systems. Indeed, operational health information systems originate from heterogeneous teams of independent developers and have to cooperate in order to exchange data and services. Good cooperation is based on a good understanding of the messages exchanged between the systems. The main issue of semantic interoperability is to ensure that the exchange is not only possible but also meaningful. The main objective of this paper is to analyze semantic interoperability from a software engineering point of view. It describes the principles for the design of a semantic mediator (SM) in the framework of a distributed object manager (DOM). The mediator is itself a component that should allow the exchange of messages independently of languages and platforms. The functional architecture of such an SM is detailed. These principles have been partly applied in the context of the HELIOS object-oriented software engineering environment. The resulting service components are presented with their current state of achievement. PMID:9865050

  20. Study of Tools Interoperability

    NARCIS (Netherlands)

    Krilavičius, T.

    2007-01-01

    Interoperability of tools usually refers to a combination of methods and techniques that address the problem of making a collection of tools work together. In this study we survey different notions that are used in this context: interoperability, interaction and integration. We point out relation

  1. Information Interoperability Domains

    NARCIS (Netherlands)

    Lasschuyt, E.

    2004-01-01

    Coalition-wide interoperability can be improved considerably by better harmonisation of all major information standardisation efforts within NATO. This notion is supported by the concept of dividing the NATO C3 information area into more or less independent “information interoperability domains”, co

  2. Standards-based data interoperability in the climate sciences

    Science.gov (United States)

    Woolf, Andrew; Cramer, Ray; Gutierrez, Marta; Kleese van Dam, Kerstin; Kondapalli, Siva; Latham, Susan; Lawrence, Bryan; Lowry, Roy; O'Neill, Kevin

    2005-03-01

    Emerging developments in geographic information systems and distributed computing offer a roadmap towards an unprecedented spatial data infrastructure in the climate sciences. Key to this are the standards developments for digital geographic information being led by the International Organisation for Standardisation (ISO) technical committee on geographic information/geomatics (TC211) and the Open Geospatial Consortium (OGC). These, coupled with the evolution of standardised web services for applications on the internet by the World Wide Web Consortium (W3C), mean that opportunities for both new applications and increased interoperability exist. These are exemplified by the ability to construct ISO-compliant data models that expose legacy data sources through OGC web services. This paper concentrates on the applicability of these standards to climate data by introducing some examples and outlining the challenges ahead. An abstract data model is developed, based on ISO standards, and applied to a range of climate data both observational and modelled. An OGC Web Map Server interface is constructed for numerical weather prediction (NWP) data stored in legacy data files. A W3C web service for remotely accessing gridded climate data is illustrated. Challenges identified include the following: first, both the ISO and OGC specifications require extensions to support climate data. Secondly, OGC services need to fully comply with W3C web services, and support complex access control. Finally, to achieve real interoperability, broadly accepted community-based semantic data models are required across the range of climate data types. These challenges are being actively pursued, and broad data interoperability for the climate sciences appears within reach.
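    As a small illustration of the OGC service interfaces mentioned above, the sketch below builds a WMS 1.3.0 GetMap request URL. The endpoint and layer name are placeholders; only the query parameters follow the WMS specification.

```python
from urllib.parse import urlencode

def wms_getmap_url(base: str, layer: str, bbox: tuple,
                   size: tuple = (800, 400)) -> str:
    """Build an OGC WMS 1.3.0 GetMap request URL."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "CRS": "CRS:84",
        "BBOX": ",".join(map(str, bbox)),  # lon/lat order under CRS:84
        "WIDTH": size[0], "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return f"{base}?{urlencode(params)}"

url = wms_getmap_url("https://example.org/wms", "air_temperature",
                     (-10, 40, 10, 60))
```

    Because the interface is standardized, the same request shape works against any compliant server, including ones that front legacy NWP data files as described in the paper.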

  3. Focus for 3D city models should be on interoperability

    DEFF Research Database (Denmark)

    Bodum, Lars; Kjems, Erik; Jaegly, Marie Michele Helena;

    2006-01-01

    3D city models have become a very popular commodity for cities in general. The politicians and/or the administrative management have in the last few years been very active when it comes to investments in dimensionality, and the models come in many different forms and for many specific or non...... of interoperability. Verisimilarity would in this case mean a 3D model with close resemblance to reality and based on modelling principles from CAD and scenes from this, build with focus on photorealism. Interoperability would mean a 3D model that included semantics in form of an object model and an ontology...... that would make it useful for other purposes than visualisation. Time has come to try to change this trend and to convince the municipalities that interoperability and semantics are important issues for the future. It is important for them to see that 3D modelling, mapping and geographic information...

  4. Groundwater data network interoperability

    Science.gov (United States)

    Brodaric, Boyan; Booth, Nathaniel; Boisvert, Eric; Lucido, Jessica M.

    2016-01-01

    Water data networks are increasingly being integrated to answer complex scientific questions that often span large geographical areas and cross political borders. Data heterogeneity is a major obstacle that impedes interoperability within and between such networks. It is resolved here for groundwater data at five levels of interoperability, within a Spatial Data Infrastructure architecture. The result is a pair of distinct national groundwater data networks for the United States and Canada, and a combined data network in which they are interoperable. This combined data network enables, for the first time, transparent public access to harmonized groundwater data from both sides of the shared international border.

  5. Interoperability for electronic ID

    OpenAIRE

    Zygadlo, Zuzanna

    2009-01-01

    Electronic Business, including eBanking, eCommerce and eGovernmental services, is today based on a large variety of security solutions, comprising electronic IDs provided by a broad community of Public Key Infrastructure (PKI) vendors. Significant differences in implementations of those solutions introduce a problem of lack of interoperability in electronic business, which have not yet been resolved by standardization and interoperability initiatives based on existing PKI trust models. It i...

  6. ONTOLOGY BASED SEMANTIC KNOWLEDGE REPRESENTATION FOR SOFTWARE RISK MANAGEMENT

    Directory of Open Access Journals (Sweden)

    C.R.Rene Robin

    2010-10-01

    Full Text Available Domain-specific knowledge representation is achieved through the use of ontologies. The ontology model of software risk management is an effective approach for communication among people in the teaching and learning community, for communication and interoperation among various knowledge-oriented applications, and for the sharing and reuse of software. But the lack of formal representation tools for domain modeling results in taking liberties with conceptualization. This paper describes an ontology-based semantic knowledge representation mechanism, and the architecture we propose has been successfully implemented for the domain of software risk management.

  7. Buildings Interoperability Landscape

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Dave [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stephan, Eric G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wang, Weimin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Corbin, Charles D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Widergren, Steven E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-12-31

    Through its Building Technologies Office (BTO), the United States Department of Energy’s Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO’s mission to enhance energy efficiency and save energy for economic and environmental purposes. For ecosystems of connected-buildings products and services from various manufacturers to flourish, the ICT aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.

  8. The caCORE Software Development Kit: Streamlining construction of interoperable biomedical information services

    Directory of Open Access Journals (Sweden)

    Warzel Denise

    2006-01-01

    Full Text Available Abstract Background Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs). The National Cancer Institute (NCI) developed the cancer common ontologic representation environment (caCORE) to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK) was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. Results The caCORE SDK requires a Unified Modeling Language (UML) tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR) using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG) program, to create compatible data services. caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has

  9. Semantic Observation Integration

    Directory of Open Access Journals (Sweden)

    Werner Kuhn

    2012-09-01

    Full Text Available Although the integration of sensor-based information into analysis and decision making has been a research topic for many years, semantic interoperability has not yet been reached. The advent of user-generated content for the geospatial domain, Volunteered Geographic Information (VGI), makes it even more difficult to establish semantic integration. This paper proposes a novel approach to integrating conventional sensor information and VGI, which is exploited in the context of detecting forest fires. In contrast to common logic-based semantic descriptions, we present a formal system using algebraic specifications to unambiguously describe the processing steps from natural phenomena to value-added information. A generic ontology of observations is extended and profiled for forest fire detection in order to illustrate how the sensing process, and transformations between heterogeneous sensing systems, can be represented as mathematical functions and grouped into abstract data types. We discuss the required ontological commitments and a possible generalization.
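    The paper's core idea, that sensing steps can be modelled as mathematical functions grouped into abstract data types, can be roughly sketched as follows (a speculative illustration: the names, units and numbers below are invented and are not taken from the paper's ontology):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class Observation:
    """Abstract observation: a phenomenon, a value, and a unit."""
    phenomenon: str
    value: float
    unit: str

def sensor(phenomenon: str, unit: str,
           transduce: Callable[[float], float]) -> Callable[[float], Observation]:
    """Turn a transduction function into an observation procedure."""
    def observe(stimulus: float) -> Observation:
        return Observation(phenomenon, transduce(stimulus), unit)
    return observe

# A hardware smoke sensor and a coarser VGI report become values of the
# same abstract type, so downstream fusion code treats them uniformly.
smoke_sensor = sensor("smoke_density", "ug/m3", lambda s: 1000.0 * s)
vgi_report = sensor("smoke_density", "ug/m3", lambda level: 500.0 * level)
obs = smoke_sensor(0.042)
```

    The point of the sketch is that once both sensing systems return the same abstract type, the code that fuses observations does not need to know which kind of source produced each one.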

  10. Role of semantics in Autonomic and Adaptive Web Services & Processes

    OpenAIRE

    Sheth, Amit P.

    2007-01-01

    The emergence of Service Oriented Architectures (SOA) has created a new paradigm of loosely coupled distributed systems. In the METEOR-S project, we have studied the comprehensive role of semantics in all stages of the life cycle of service and process-- including annotation, publication, discovery, interoperability/data mediation, and composition. In 2002-2003, we had offered a broad framework of semantics consisting of four types:1) Data semantics, 2) Functional semantics...

  11. Architecture for interoperable software in biology.

    Science.gov (United States)

    Bare, James Christopher; Baliga, Nitin S

    2014-07-01

    Understanding biological complexity demands a combination of high-throughput data and interdisciplinary skills. One way to bring to bear the necessary combination of data types and expertise is by encapsulating domain knowledge in software and composing that software to create a customized data analysis environment. To this end, simple flexible strategies are needed for interconnecting heterogeneous software tools and enabling data exchange between them. Drawing on our own work and that of others, we present several strategies for interoperability and their consequences, in particular, a set of simple data structures--list, matrix, network, table and tuple--that have proven sufficient to achieve a high degree of interoperability. We provide a few guidelines for the development of future software that will function as part of an interoperable community of software tools for biological data analysis and visualization.
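    The role of those simple structures can be illustrated with a toy sketch (the tool names and data are invented): two independently written "tools" interoperate by exchanging a table represented as a list of row dicts, with no shared object model between them.

```python
# Two hypothetical tools that share only a plain "table" structure
# (list of row dicts), one of the simple data structures the paper
# identifies as sufficient for a high degree of interoperability.
def expression_tool():
    """Pretend measurement tool emitting a table of gene-expression rows."""
    return [{"gene": "g1", "expr": 2.5},
            {"gene": "g2", "expr": 0.4},
            {"gene": "g3", "expr": 1.7}]

def filter_tool(table, threshold):
    """A separately written tool that consumes the same plain structure."""
    return [row for row in table if row["expr"] >= threshold]

kept = filter_tool(expression_tool(), 1.0)
```

    Because the exchanged value is a plain structure rather than a tool-specific class, either side can be replaced or reimplemented in another language without breaking the other.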

  12. Standards and Interoperability

    Institute of Scientific and Technical Information of China (English)

    Stephen McGibbon

    2006-01-01

    I am sure that there will be much discussion at the upcoming Baltic IT&T 2005 conference about standards and interoperability, and so I thought I would try to contribute to the debate with this, the first of four articles that I will write for this journal over the coming months.

  13. Semantic Web

    Directory of Open Access Journals (Sweden)

    Anna Lamandini

    2011-06-01

    Full Text Available The semantic Web is a technology at the service of knowledge, aimed at accessibility and the sharing of content and at facilitating interoperability between different systems; as such it is one of the nine key technological pillars of ICT (technologies for information and communication) within the third theme, the specific cooperation programme, of the Seventh Framework Programme for research and development (FP7, 2007-2013). As a system it seeks to overcome the overload or excess of irrelevant information on the Internet, in order to facilitate specific or pertinent searches. It is an extension of the existing Web in which the aim is cooperation between computers and people (the dream of Sir Tim Berners-Lee), where machines can give more support to people in integrating and elaborating data, in order to obtain inferences and a global sharing of data. It is a technology able to favour the development of a “data web”, in other words the creation of a space of interconnected and shared data sets (Linked Data) which allows users to link different types of data coming from different sources. It is a technology that will have a great effect on everyday life, since it will permit the design of “intelligent applications” in various sectors such as education and training, research, the business world, public information, tourism, health, and e-government. It is an innovative technology that activates a social transformation (the socio-semantic Web) on a world level, since it redefines the cognitive universe of users and enables the sharing not only of information but of meaning (collective and connected intelligence).

  14. Model and Interoperability using Meta Data Annotations

    Science.gov (United States)

    David, O.

    2011-12-01

    Software frameworks and architectures need metadata to efficiently support model integration. Modelers have to know the context of a model, often stepping into modeling semantics and auxiliary information usually not provided in a concise structure and universal format consumable by a range of (modeling) tools. XML often seems the obvious solution for capturing metadata, but its wide adoption to facilitate model interoperability is limited by XML schema fragmentation, complexity, and verbosity outside of a data-automation process. Ontologies seem to overcome those shortcomings; however, the practical significance of their use remains to be demonstrated. OMS version 3 took a different approach to metadata representation. The fundamental building block of a modular model in OMS is a software component representing a single physical process, calibration method, or data access approach. Here, programming language features known as annotations or attributes were adopted. Within other (non-modeling) frameworks it has been observed that annotations lead to cleaner and leaner application code. Framework-supported model integration, traditionally accomplished using Application Programming Interface (API) calls, is now achieved using descriptive code annotations. Fully annotated components for various hydrological and Ag-system models now provide information directly for (i) model assembly and building, (ii) data flow analysis for implicit multi-threading or visualization, (iii) automated and comprehensive model documentation of component dependencies and physical data properties, (iv) automated model and component testing, calibration, and optimization, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Such a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework but a strong reference to its originating code. Since models and
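    OMS itself is Java-based and uses Java annotations; as a rough, hypothetical analogue of the idea in Python (all names invented), a decorator can attach the declarative metadata while the component code itself makes no framework API calls:

```python
# Sketch of annotation-style metadata on a model component: the decorator
# records the declared data contract so a framework could discover and
# wire components without invasive API calls in the model code.
def describe(**meta):
    """Attach declarative metadata to a model component class."""
    def wrap(cls):
        cls.__model_meta__ = meta
        return cls
    return wrap

@describe(inputs={"rainfall": "mm/day"},
          outputs={"runoff": "mm/day"},
          doc="Toy runoff component")
class Runoff:
    def execute(self, rainfall):
        return 0.3 * rainfall  # crude fixed runoff coefficient

# A framework can now read the contract without running the component:
meta = Runoff.__model_meta__
```

    The component remains plain application code; only the attached metadata ties it to the framework, which is the non-invasive quality the abstract describes.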

  15. Interoperability for Global Observation Data by Ontological Information

    Institute of Scientific and Technical Information of China (English)

    Masahiko Nagai; Masafumi Ono; Ryosuke Shibasaki

    2008-01-01

    The Ontology registry system is developed to collect, manage, and compare ontological information for integrating global observation data. Data sharing and data services, such as support for metadata design, structuring of data contents, and support for text mining, are applied for better use of data as data interoperability. A semantic network dictionary and gazetteers are constructed as a trans-disciplinary dictionary. Ontological information is added to the system by digitizing text-based dictionaries, developing a "knowledge writing tool" for experts, and extracting semantic relations from authoritative documents with natural language processing techniques. The system is developed to collect lexicographic ontology and geographic ontology.

  16. Product-driven Enterprise Interoperability for Manufacturing Systems Integration

    OpenAIRE

    Dassisti, Michele; Panetto, Hervé; Tursi, Angela

    2006-01-01

    International audience The “Babel tower effect” induced by the heterogeneity of applications available in the operation of enterprises leads to a considerable lack of “exchangeability” and a risk of semantic loss whenever cooperation has to take place within the same enterprise. Generally speaking, this kind of problem falls under the umbrella of interoperability between local reference information models. This position paper discusses some ideas in this field and traces a research roadmap to ...

  17. Interoperability does matter

    Directory of Open Access Journals (Sweden)

    Manfred Goepel

    2006-04-01

    Full Text Available In companies, the historically developed IT systems are mostly application islands. They always produce good results if the system's requirements and surroundings are not changed and as long as a system interface is not needed. With the ever increasing dynamics and globalization of the market, however, these IT islands are certain to collapse. Interoperability (IO) is the bid of the hour, assuming the integration of users, data, applications and processes. In the following, important IO enablers such as ETL, EAI, and SOA will be examined on the basis of practicability. It will be shown that especially SOA produces a surge of interoperability that could rightly be referred to as an IT evolution.

  18. Driving Innovation Through Interoperability

    Directory of Open Access Journals (Sweden)

    John Weigelt

    2008-12-01

    Full Text Available Today's difficult economic environment provides a time of change where information technology matters more than ever. As business and service delivery leaders look to become even more effective and efficient in meeting their client's expectations, they are increasingly looking to electronic channels as an integral element of their business strategies. Regrettably, the ever increasing pace of technological change often disconnects the technology from the business requirements. This disconnection hides technology innovations from the business and has a broader impact of preventing business innovation. This article discusses the role service oriented architecture and interoperability can play in keeping an organization innovative and competitive. We also discuss Microsoft's interoperability principles, its commitment to its open source community, and the benefits of embracing openness as part of an organization's business strategy.

  19. Maturity model for enterprise interoperability

    Science.gov (United States)

    Guédria, Wided; Naudet, Yannick; Chen, David

    2015-01-01

    Historically, progress occurs when entities communicate, share information and together create something that no one individually could do alone. Moving beyond people to machines and systems, interoperability is becoming a key factor of success in all domains. In particular, interoperability has become a challenge for enterprises, to exploit market opportunities, to meet their own objectives of cooperation or simply to survive in a growing competitive world where the networked enterprise is becoming a standard. Within this context, many research works have been conducted over the past few years and enterprise interoperability has become an important area of research, ensuring the competitiveness and growth of European enterprises. Among others, enterprises have to control their interoperability strategy and enhance their ability to interoperate. This is the purpose of the interoperability assessment. Assessing interoperability maturity allows a company to know its strengths and weaknesses in terms of interoperability with its current and potential partners, and to prioritise actions for improvement. The objective of this paper is to define a maturity model for enterprise interoperability that takes into account existing maturity models while extending the coverage of the interoperability domain. The assessment methodology is also presented. Both are demonstrated with a real case study.

  20. Lemnos interoperable security project.

    Energy Technology Data Exchange (ETDEWEB)

    Halbgewachs, Ronald D.

    2010-03-01

    With the Lemnos framework, interoperability of control security equipment is straightforward. To obtain interoperability between proprietary security appliance units, one or both vendors must now write cumbersome 'translation code.' If one party changes something, the translation code 'breaks.' The Lemnos project is developing and testing a framework that uses widely available security functions and protocols like IPsec - to form a secure communications channel - and Syslog, to exchange security log messages. Using this model, security appliances from two or more different vendors can clearly and securely exchange information, helping to better protect the total system. Simplify regulatory compliance in a complicated security environment by leveraging the Lemnos framework. As an electric utility, are you struggling to implement the NERC CIP standards and other regulations? Are you weighing the misery of multiple management interfaces against committing to a ubiquitous single-vendor solution? When vendors build their security appliances to interoperate using the Lemnos framework, it becomes practical to match best-of-breed offerings from an assortment of vendors to your specific control systems needs. The Lemnos project is developing and testing a framework that uses widely available open-source security functions and protocols like IPsec and Syslog to create a secure communications channel between appliances in order to exchange security data.

  1. National Flood Interoperability Experiment

    Science.gov (United States)

    Maidment, D. R.

    2014-12-01

    The National Flood Interoperability Experiment is led by the academic community in collaboration with the National Weather Service through the new National Water Center recently opened on the Tuscaloosa campus of the University of Alabama. The experiment will also involve the partners in IWRSS (Integrated Water Resources Science and Services), which include the USGS, the Corps of Engineers and FEMA. The experiment will address the following questions: (1) How can near-real-time hydrologic forecasting at high spatial resolution, covering the nation, be carried out using the NHDPlus or next generation geofabric (e.g. hillslope, watershed scales)? (2) How can this lead to improved emergency response and community resilience? (3) How can an improved interoperability framework support the first two goals and lead to sustained innovation in the research to operations process? The experiment will run from September 2014 through August 2015, in two phases. The mobilization phase from September 2014 until May 2015 will assemble the components of the interoperability framework. A Summer Institute to integrate the components will be held from June to August 2015 at the National Water Center involving faculty and students from the University of Alabama and other institutions coordinated by CUAHSI. It is intended that the insight that arises from this experiment will help lay the foundation for a new national scale, high spatial resolution, near-real-time hydrologic simulation system for the United States.

  2. K-Means Based Fingerprint Segmentation with Sensor Interoperability

    Directory of Open Access Journals (Sweden)

    Yang Xiukun

    2010-01-01

    Full Text Available A critical step in an automatic fingerprint recognition system is the segmentation of fingerprint images. Existing methods are usually designed to segment fingerprint images originating from a certain sensor. Thus their performances are significantly affected when dealing with fingerprints collected by different sensors. This work studies the sensor interoperability of fingerprint segmentation algorithms, which refers to the algorithm's ability to adapt to the raw fingerprints obtained from different sensors. We empirically analyze the sensor interoperability problem, and effectively address the issue by proposing a k-means based segmentation method called SKI. SKI clusters foreground and background blocks of a fingerprint image based on the k-means algorithm, where a fingerprint block is represented by a 3-dimensional feature vector consisting of block-wise coherence, mean, and variance (abbreviated as CMV). SKI also employs morphological postprocessing to achieve favorable segmentation results. We perform SKI on each fingerprint to ensure sensor interoperability. The interoperability and robustness of our method are validated by experiments performed on a number of fingerprint databases which are obtained from various sensors.
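    The CMV-plus-clustering idea can be sketched in a few lines (a toy example with invented feature values and a deterministic seeding choice; the published SKI method also applies morphological postprocessing, which is omitted here):

```python
# Each fingerprint block is a 3-D CMV vector (coherence, mean, variance);
# two-cluster k-means separates foreground from background blocks.
from itertools import combinations

def dist2(p, q):
    """Squared Euclidean distance between two feature vectors."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans2(points, iters=10):
    """Cluster points into 2 groups, seeding with the farthest pair."""
    centroids = list(max(combinations(points, 2), key=lambda pq: dist2(*pq)))
    labels = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest centroid wins
        labels = [0 if dist2(p, centroids[0]) <= dist2(p, centroids[1]) else 1
                  for p in points]
        # update step: recompute each centroid as the mean of its members
        for k in (0, 1):
            members = [p for p, lab in zip(points, labels) if lab == k]
            if members:
                centroids[k] = tuple(sum(c) / len(members)
                                     for c in zip(*members))
    return labels

# Foreground (ridge) blocks: high coherence and variance; background: low.
blocks = [(0.90, 120.0, 900.0), (0.85, 110.0, 850.0),   # foreground
          (0.10, 240.0, 30.0), (0.15, 235.0, 25.0)]     # background
labels = kmeans2(blocks)
```

    Because the clustering is run on each fingerprint individually, the foreground/background decision adapts to whatever value ranges a given sensor produces, which is the sensor-interoperability property the abstract emphasizes.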

  3. Semantic modelling of learning objects and instruction

    OpenAIRE

    Pahl, Claus; Melia, Mark

    2006-01-01

    We introduce an ontology-based semantic modelling framework that addresses subject domain modelling, instruction modelling, and interoperability aspects in the development of complex reusable learning objects. Ontologies are knowledge representation frameworks, ideally suited to support knowledge-based modelling of these learning objects. We illustrate the benefits of semantic modelling for learning object assemblies within the context of standards such as SCORM Sequencing and Navigation and ...

  4. Semantic tags for generative multiview product breakdown

    OpenAIRE

    Paviot, Thomas; Cheutet, Vincent; Lamouri, Samir

    2010-01-01

    The interoperability of IT systems that drive engineering and production processes (i.e. Product Data Management and Enterprise Resource Planning systems) is still an issue. The semantic meaning of product information has to be explicit in order to be able to exchange information between these systems. However, the product breakdown activity generates many disconnected product views over which the product semantics is disseminated and mostly implicit. This paper introduces a methodology allow...

  5. Connecting Archaeological Data and Grey Literature via Semantic Cross Search

    Directory of Open Access Journals (Sweden)

    Douglas Tudhope

    2011-07-01

    Full Text Available Differing terminology and database structure hinder meaningful cross search of excavation datasets. Matching free text grey literature reports with datasets poses yet more challenges. Conventional search techniques are unable to cross search between archaeological datasets and Web-based grey literature. Results are reported from two AHRC funded research projects that investigated the use of semantic techniques to link digital archive databases, vocabularies and associated grey literature. STAR (Semantic Technologies for Archaeological Resources) was a collaboration between the University of Glamorgan, Hypermedia Research Unit and English Heritage (EH). The main outcome is a research Demonstrator (available online), which cross searches over excavation datasets from different database schemas, including Raunds Roman, Raunds Prehistoric, Museum of London, Silchester Roman and Stanwick sampling. The system additionally cross searches over an extract of excavation reports from the OASIS index of grey literature, operated by the Archaeology Data Service (ADS). A conceptual framework provided by the CIDOC Conceptual Reference Model (CRM) integrates the different database structures and the metadata automatically generated from the OASIS reports by natural language processing techniques. The methods employed for extracting semantic RDF representations from the datasets and the information extraction from grey literature are described. The STELLAR project provides freely available tools to reduce the costs of mapping and extracting data to semantic search systems such as the Demonstrator and to linked data representation generally. Detailed use scenarios (and a screen capture video) provide a basis for a discussion of key issues, including cost-benefits, ontology modelling, mapping, terminology control, semantic implementation and information extraction issues. The scenarios show that semantic interoperability can be achieved by mapping and extracting
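    As a loose illustration of the kind of mapping involved (the URIs, column names and property choices below are invented for the sketch; the STAR/STELLAR tools define the actual CRM mappings), a database row can be turned into RDF-style triples under a CIDOC CRM-like model:

```python
# Map one excavation-database row to subject/predicate/object triples.
# E19 Physical Object, P2 has type and P53 has former or current location
# are CIDOC CRM names; their use here is illustrative, not a real mapping.
def row_to_triples(row):
    find = f"http://example.org/find/{row['id']}"
    return [
        (find, "rdf:type", "crm:E19_Physical_Object"),
        (find, "crm:P2_has_type", row["object_type"]),
        (find, "crm:P53_has_former_or_current_location", row["context"]),
    ]

triples = row_to_triples({"id": 42, "object_type": "coin",
                          "context": "ditch F101"})
```

    Once rows from different schemas are expressed against the same conceptual model, a single query can cross search them, which is what the Demonstrator does at scale.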

  6. Principles of data integration and interoperability in the GEO Biodiversity Observation Network

    Science.gov (United States)

    Saarenmaa, Hannu; Ó Tuama, Éamonn

    2010-05-01

    The goal of the Global Earth Observation System of Systems (GEOSS) is to link existing information systems into a global and flexible network to address nine areas of critical importance to society. One of these "societal benefit areas" is biodiversity, and it will be supported by a GEOSS sub-system known as the GEO Biodiversity Observation Network (GEO BON). In planning the GEO BON, it was soon recognised that there are already a multitude of existing networks and initiatives in place worldwide. What has been lacking is a coordinated framework that allows for information sharing and exchange between the networks. Traversing the various scales of biodiversity, in particular from the individual and species levels to the ecosystem level, has long been a challenge. Furthermore, some of the major regions of the world have already taken steps to coordinate their efforts, but links between the regions have not been a priority until now. Linking biodiversity data to that of the other GEO societal benefit areas, in particular ecosystems, climate, and agriculture, to produce useful information for the UN Conventions and other policy-making bodies is another need that calls for integration of information. Integration and interoperability are therefore a major theme of GEO BON, and a "system of systems" is very much needed. There are several approaches to integration that need to be considered. Data integration requires harmonising concepts, agreeing on vocabularies, and building ontologies. Semantic mediation of data using these building blocks is still not easy to achieve. Agreements on, or mappings between, the metadata standards that will be used across the networks are a major requirement that will need to be addressed early on. With interoperable metadata, service integration will be possible through registry-of-registries systems such as GBIF's forthcoming GBDRS and the GEO Clearinghouse. Chaining various services that build intermediate products using workflow

  7. Semantic Advertising

    OpenAIRE

    Zamanzadeh, Ben; Ashish, Naveen; Ramakrishnan, Cartic; Zimmerman, John

    2013-01-01

    We present the concept of Semantic Advertising which we see as the future of online advertising. Semantic Advertising is online advertising powered by semantic technology which essentially enables us to represent and reason with concepts and the meaning of things. This paper aims to 1) Define semantic advertising, 2) Place it in the context of broader and more widely used concepts such as the Semantic Web and Semantic Search, 3) Provide a survey of work in related areas such as context matchi...

  8. Interoperability between .Net framework and Python in Component way

    OpenAIRE

    M. K. Pawar; Ravindra Patel; Dr. N. S. Chaudhari

    2013-01-01

    The objective of this work is to achieve interoperability of distributed objects based on CORBA middleware technology and standards. The distributed objects for the client-server technology are implemented in the C#.Net framework and the Python language. The interoperability result shows the possibility of applications in which objects can communicate across different environments and different languages. It is also analyzed how to achieve client-server communication in heterogeneous environmen...

  9. Inter-operability

    International Nuclear Information System (INIS)

    Building an internal gas market implies establishing harmonized rules for cross border trading between operators. To that effect, the European association EASEE-gas is carrying out standards and procedures, commonly called 'inter-operability'. Set up in 2002, the Association brings together all segments of the gas industry: producers, transporters, distributors, traders and shippers, suppliers, consumers and service providers. This workshop presents the latest status on issues such as barriers to gas trade in Europe, rules and procedures under preparation by EASEE-gas, and the implementation schedule of these rules by operators. This article gathers 5 presentations about this topic given at the gas conference

  10. An Interoperable Cartographic Database

    Directory of Open Access Journals (Sweden)

    Slobodanka Ključanin

    2007-05-01

    Full Text Available The concept of producing a prototype of an interoperable cartographic database is explored in this paper, including the possibilities of integrating different geospatial data into the database management system and visualizing them on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relational database, spatial analysis, and definition and visualization of the database content in the form of a map on the Internet.

  11. ASP-SSN: An Effective Approach for Linking Semantic Social Networks

    Directory of Open Access Journals (Sweden)

    Sanaa Kaddoura

    2012-11-01

    Full Text Available The dramatic increase of social networking sites forced web users to duplicate their identity on many of them. But the lack of interoperability and linkage between these social networks allowed users' information to be disseminated within walled-garden data islands. Achieving interoperability will contribute to the creation of a rich knowledge base that can be used for querying social networks and discovering some facts about social connections. This paper presents a new approach for linking semantic social networks (SSN). This approach is based on the Answer Set Programming (ASP) paradigm and fuzzy logic. An ASP-SSN reasoner is developed using the DLV answer set solver and tested on data sets exported from seven different semantic social networks. Fuzzy logic is used to assign a degree of truth to every discovered link. The proposed approach is simple, generic and intuitive.
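    A toy sketch of the degree-of-truth idea (the similarity measure, weights and fields here are invented; the paper derives links with ASP rules executed by the DLV solver):

```python
# Score a candidate "same person" link between two profiles with a fuzzy
# degree of truth in [0, 1], combining name similarity and an exact
# email match with invented weights.
from difflib import SequenceMatcher

def degree_of_truth(p1, p2):
    """Fuzzy confidence that two profiles describe the same person."""
    name_sim = SequenceMatcher(None, p1["name"].lower(),
                               p2["name"].lower()).ratio()
    email_match = 1.0 if p1.get("email") == p2.get("email") else 0.0
    return 0.4 * name_sim + 0.6 * email_match

a = {"name": "J. Smith", "email": "js@example.org"}
b = {"name": "John Smith", "email": "js@example.org"}
truth = degree_of_truth(a, b)
```

    Attaching such a score to each discovered link, rather than a hard yes/no, is what lets queries over the merged knowledge base rank uncertain cross-network connections.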

  12. Interoperability of heterogeneous distributed systems

    Science.gov (United States)

    Zaschke, C.; Essendorfer, B.; Kerth, C.

    2016-05-01

    To achieve knowledge superiority in today's operations, interoperability is the key. Budget restrictions, as well as the complexity and multiplicity of threats combined with the fact that not single nations but whole areas are subject to attacks, force nations to collaborate and share information as appropriate. Multiple data and information sources produce different kinds of data, real time and non-real time, in different formats that are disseminated to the respective command and control level for further distribution. The data is most of the time highly sensitive and restricted in terms of sharing. The question is how to make this data available to the right people at the right time with the right granularity. The Coalition Shared Data concept aims to provide a solution to these questions. It has been developed within several multinational projects and evolved over time. A continuous improvement process was established and resulted in the adaptation of the architecture as well as the technical solution and the processes it supports. Starting from the idea of making use of existing standards, basing the concept on sharing data through standardized interfaces and formats, and enabling metadata-based query, the concept merged with a more sophisticated service-based approach. The paper addresses concepts for information sharing to facilitate interoperability between heterogeneous distributed systems. It introduces the methods that were used and the challenges that had to be overcome. Furthermore, the paper gives a perspective on how the concept could be used in the future and what measures have to be taken to successfully bring it into operations.
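    The metadata-based query idea can be sketched abstractly (the field names, labels and policy model below are invented for illustration): each product carries metadata including releasability, and a query returns only items the requesting partner may see.

```python
# A toy shared-data catalog: every product carries metadata, and queries
# filter both on metadata criteria and on who is allowed to receive it.
catalog = [
    {"id": "img-001", "type": "image", "releasable_to": {"NATO"}},
    {"id": "rep-007", "type": "report", "releasable_to": {"NATO", "PfP"}},
]

def query(catalog, requester, **criteria):
    """Return items matching all criteria and releasable to the requester."""
    return [item for item in catalog
            if requester in item["releasable_to"]
            and all(item.get(k) == v for k, v in criteria.items())]

hits = query(catalog, "PfP", type="report")
```

    Keeping the releasability decision inside the query layer, rather than in each consuming system, is one way standardized interfaces can deliver "the right data to the right people" across heterogeneous systems.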

  13. The Semantic Architecture for Chinese Cultural Celebrities’ Manuscript Library

    OpenAIRE

    Liu, Wei

    2004-01-01

    Semantic architecture is crucial for a digital library application, especially in a distributed system environment. It provides various approaches to overcoming semantic interoperability problems and usually consists of a metadata solution with an open system architecture. The design of the digital library system for the China Cultural Celebrities’ Manuscripts Library (CCCML), which is a branch of Shanghai Library, has taken into account many of the main aspects of the semantic requirements, in...

  14. Bringing Semantics to Web Services: The OWL-S Approach

    OpenAIRE

    Martin, David; Paolucci, Massimo; McIlraith, Sheila; Burnstein, Mark; McDermott, Drew; McGuinness, Deborah; Parsia, Bijan; Payne, Terry R.; Sabou, Marta; Solanki, Monika; Srinivasan, Naveen; Sycara, Katia

    2004-01-01

    Service interface description languages such as WSDL, and related standards, are evolving rapidly to provide a foundation for interoperation between Web services. At the same time, Semantic Web service technologies, such as the Ontology Web Language for Services (OWL-S), are developing the means by which services can be given richer semantic specifications. Richer semantics can enable fuller, more flexible automation of service provision and use, and support the construction of more powerful ...

  15. Leveraging the Semantic Web for Adaptive Education

    Science.gov (United States)

    Kravcik, Milos; Gasevic, Dragan

    2007-01-01

    In the area of technology-enhanced learning, reusability and interoperability issues essentially influence the productivity and efficiency of learning and authoring solutions. There are two basic approaches to overcoming these problems: one attempts to do so via standards and the other by means of the Semantic Web. In practice, these approaches…

  16. An Approach towards Enterprise Interoperability Assessment

    Science.gov (United States)

    Razavi, Mahsa; Aliee, Fereidoon Shams

    Enterprise Architecture (EA), as a discipline with numerous and enterprise-wide models, can support decision making on enterprise-wide issues. In order to provide such support, EA models should be amenable to analysis of various utilities and quality attributes. This paper provides a method for EA interoperability analysis. This approach is based on the Analytical Hierarchy Process (AHP) and considers the situation of the enterprise in giving weight to the different criteria and sub-criteria of each utility. It proposes a quantitative method of assessing the interoperability achievement of different scenarios using AHP, based on the knowledge and experience of EA experts and domain experts, and helps in deciding between them. The applicability of the proposed approach is demonstrated using a practical case study.

  17. The EuroGEOSS Brokering Framework for Multidisciplinary Interoperability

    Science.gov (United States)

    Santoro, M.; Nativi, S.; Craglia, M.; Boldrini, E.; Vaccari, L.; Papeschi, F.; Bigagli, L.

    2011-12-01

    The Global Earth Observation System of Systems (GEOSS), envisioned by the group of eight most industrialized countries (G-8) in 2003, provides the indispensable framework to integrate Earth observation efforts at a global level. The European Commission also contributes to the implementation of the GEOSS through research projects funded from its Framework Programme for Research & Development. The EuroGEOSS (A European Approach to GEOSS) project was launched in May 2009 for a three-year period with the aim of supporting the interoperability and use of existing Earth observing systems and applications within the GEOSS and INSPIRE frameworks. EuroGEOSS developed a multidisciplinary interoperability infrastructure for the three strategic areas of Drought, Forestry and Biodiversity; this operating capacity is currently being extended to other scientific domains (e.g. Climate Change, Water, Ocean, Weather, etc.). Central to the multidisciplinary infrastructure is the "EuroGEOSS Brokering Framework", which is based on a brokered SOA (Service Oriented Architecture) approach. This approach extends the typical SOA archetype by introducing "expert" components: the Brokers. The Brokers provide the mediation and distribution functionalities needed to interconnect the distributed and heterogeneous resources characterizing a System of Systems (SoS) environment. Such a solution addresses significant shortcomings of present SOA implementations for global frameworks, such as interoperability across multiple protocols and data models. Currently, the EuroGEOSS multidisciplinary infrastructure is composed of the following brokering components: 1. The Discovery Broker: providing harmonized discovery functionalities by mediating and distributing user queries against tens of heterogeneous services. 2. The Semantic Discovery Augmentation Component: enhancing the capabilities of the discovery broker with semantic query expansion. 3. The Data Access Broker: enabling users to seamlessly
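The mediation-and-distribution role of a discovery broker can be sketched as follows: the broker fans one user query out to heterogeneous catalogue services and maps each native result format onto a single harmonized record. The two "services" and their result shapes below are invented for illustration; they are not EuroGEOSS interfaces.

```python
# Sketch of the brokered-SOA idea: a broker distributes a query to
# heterogeneous services and mediates their native result formats
# into one harmonized model. Services and formats are hypothetical.

def service_a(keyword):               # returns {title, url} dicts
    data = [{"title": "Drought index map", "url": "http://a/1"}]
    return [d for d in data if keyword in d["title"].lower()]

def service_b(keyword):               # returns (name, link) tuples
    data = [("Forest drought report", "http://b/7")]
    return [t for t in data if keyword in t[0].lower()]

ADAPTERS = [
    (service_a, lambda d: {"title": d["title"], "link": d["url"]}),
    (service_b, lambda t: {"title": t[0], "link": t[1]}),
]

def broker_query(keyword):
    """Distribute the query and harmonize all results to one model."""
    results = []
    for service, mediate in ADAPTERS:
        results.extend(mediate(r) for r in service(keyword))
    return results

print(broker_query("drought"))
```

Adding support for a new catalogue type then means writing one more adapter, without touching the consumers.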

  18. Extending the GI Brokering Suite to Support New Interoperability Specifications

    Science.gov (United States)

    Boldrini, E.; Papeschi, F.; Santoro, M.; Nativi, S.

    2014-12-01

    The GI brokering suite provides the discovery, access, and semantic Brokers (i.e. GI-cat, GI-axe, GI-sem) that empower a Brokering framework for multi-disciplinary and multi-organizational interoperability. The GI suite has been successfully deployed in the framework of several programmes and initiatives, such as European Union funded projects, NSF BCube, and the intergovernmental coordinated effort Global Earth Observation System of Systems (GEOSS). Each GI suite Broker facilitates interoperability for a particular functionality (i.e. discovery, access, semantic extension) among a set of brokered resources published by autonomous providers (e.g. data repositories, web services, semantic assets) and a set of heterogeneous consumers (e.g. client applications, portals, apps). A wide set of data models, encoding formats, and service protocols are already supported by the GI suite, such as the ones defined by international standardizing organizations like OGC and ISO (e.g. WxS, CSW, SWE, GML, netCDF) and by Community specifications (e.g. THREDDS, OpenSearch, OPeNDAP, ESRI APIs). Using the GI suite, resources published by a particular Community or organization through their specific technology (e.g. OPeNDAP/netCDF) can be transparently discovered, accessed, and used by different Communities utilizing their preferred tools (e.g. a GIS visualizing WMS layers). Since information technology is a moving target, new standards and technologies continuously emerge and are adopted in the Earth Science context too. Therefore, the GI Brokering suite was conceived to be flexible and to accommodate new interoperability protocols and data models. For example, the GI suite has recently added support for widely used specifications, introduced to implement Linked Data, the Semantic Web and specific community needs. Among others, these include: DCAT: an RDF vocabulary designed to facilitate interoperability between Web data catalogs. 
CKAN: a data management system for data distribution, particularly used by

  19. A semantically rich and standardised approach enhancing discovery of sensor data and metadata

    Science.gov (United States)

    Kokkinaki, Alexandra; Buck, Justin; Darroch, Louise

    2016-04-01

    The marine environment plays an essential role in the earth's climate. To enhance the ability to monitor the health of this important system, innovative sensors are being produced and combined with state-of-the-art sensor technology. As the number of sensors deployed is continually increasing, it is a challenge for data users to find the data that meet their specific needs. Furthermore, users need to integrate diverse ocean datasets originating from the same or even different systems. Standards provide a solution to the above-mentioned challenges. The Open Geospatial Consortium (OGC) has created Sensor Web Enablement (SWE) standards that enable different sensor networks to establish syntactic interoperability. When combined with widely accepted controlled vocabularies, they become semantically rich and semantic interoperability is achievable. In addition, Linked Data is the recommended best practice for exposing, sharing and connecting information on the Semantic Web using Uniform Resource Identifiers (URIs), the Resource Description Framework (RDF) and the RDF Query Language (SPARQL). As part of the EU-funded SenseOCEAN project, the British Oceanographic Data Centre (BODC) is working on the standardisation of sensor metadata enabling 'plug and play' sensor integration. Our approach combines standards, controlled vocabularies and persistent URIs to publish sensor descriptions, their data and associated metadata as 5-star Linked Data and to the OGC SWE (SensorML, Observations & Measurements) standards. Thus sensors become readily discoverable, accessible and usable via the web. Content- and context-based searching is also enabled, since sensor descriptions are understood by machines. Additionally, sensor data can be combined with other sensor or Linked Data datasets to form knowledge. This presentation will describe the work done at BODC to achieve syntactic and semantic interoperability in the sensor domain. It will illustrate the reuse and extension of the Semantic Sensor
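A toy illustration of the Linked Data side of this approach: a sensor description published as RDF triples with persistent URIs. The sensor URI and label below are placeholders, not BODC identifiers; the `sosa:Sensor` class is from the W3C SSN/SOSA vocabulary the abstract alludes to.

```python
# Toy sketch: expose a sensor description as Linked Data by emitting
# N-Triples with persistent URIs. The example URI and label are
# invented; sosa:Sensor is the W3C SOSA/SSN sensor class.

def triple(s, p, o):
    """Serialize one N-Triples statement; URIs are passed in <...>."""
    obj = o if o.startswith("<") else '"%s"' % o
    return "<%s> <%s> %s ." % (s, p, obj)

SENSOR = "http://example.org/id/sensor/123"   # hypothetical persistent URI
triples = [
    triple(SENSOR, "http://www.w3.org/1999/02/22-rdf-syntax-ns#type",
           "<http://www.w3.org/ns/sosa/Sensor>"),
    triple(SENSOR, "http://www.w3.org/2000/01/rdf-schema#label",
           "CTD oxygen optode"),
]
print("\n".join(triples))
```

Once published this way, the description is machine-readable and can be queried alongside other Linked Data datasets with SPARQL.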

  20. Semantic Extraction for Multi-Enterprise Business Collaboration

    Institute of Scientific and Technical Information of China (English)

    SUN Hongjun; FAN Yushun

    2009-01-01

    Semantic extraction is essential for semantic interoperability in multi-enterprise business collaboration environments. Although many studies on semantic extraction have been carried out, few have focused on how to precisely and effectively extract semantics from multiple heterogeneous data schemas. This paper presents a semi-automatic semantic extraction method based on a neutral representation format (NRF) for acquiring semantics from heterogeneous data schemas. As a unified syntax-independent model, NRF removes all the contingencies of heterogeneous data schemas from the original data environment. Conceptual extraction and keyword extraction are used to acquire the semantics from the NRF. Conceptual extraction entails constructing a conceptual model, while keyword extraction seeks to obtain the metadata. An industrial case is given to validate the approach. This method has good extensibility and flexibility. The results show that the method provides simple, accurate, and effective semantic interoperability in multi-enterprise business collaboration environments.
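The keyword-extraction step can be sketched as follows, under stated assumptions: heterogeneous schema field names are first mapped into a neutral, syntax-independent form (plain lower-case words), then counted to surface candidate metadata terms. The splitting rules and example schemas are invented; the paper's NRF is far richer than this.

```python
# Hedged sketch of keyword extraction over a neutral representation:
# normalize camelCase / snake_case / kebab-case field names from
# heterogeneous schemas into plain words, then count candidate terms.

import re
from collections import Counter

def neutralize(field_name):
    """Split a schema field name into lower-case words."""
    spaced = re.sub(r"([a-z])([A-Z])", r"\1 \2", field_name)
    return [w.lower() for w in re.split(r"[\s_\-]+", spaced) if w]

# Hypothetical schemas from two enterprises
schemas = {
    "ERP": ["customerName", "order_date", "OrderTotal"],
    "CRM": ["CustomerName", "contact-email"],
}

keywords = Counter(w for fields in schemas.values()
                     for f in fields for w in neutralize(f))
print(keywords.most_common(3))
```

Terms that recur across schemas (here "customer", "order") are the natural candidates for shared metadata.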

  1. The Interoperability of Distributed DBMSs in a CORBA-Based Multidatabase System

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    One way of achieving interoperability among heterogeneous, distributed DBMSs is through a multidatabase system. Recently, there has been increasing use of CORBA implementations in developing multidatabase systems. Panorama is a multidatabase system that has been implemented on top of a CORBA-compliant ORB, namely VisiBroker. It aims to achieve interoperability among Oracle, Sybase and other different DBMSs through the registration of these DBMSs with Panorama and through the single global query language, PanoSQL, designed for this system. In this paper, we first introduce CORBA for interoperability in multidatabase systems. Then, a general view of our designed multidatabase system, Panorama, is given. In section four, we introduce the global query language, PanoSQL, designed to achieve interoperability among the different DBMSs implemented in Panorama. Then, as an example, we present the registration of Oracle with Panorama in order to achieve interoperability in this system. Finally, conclusions and future work for this system are given.

  2. On Web Services Based Cloud Interoperability

    Directory of Open Access Journals (Sweden)

    Reeta Sony A.L

    2012-09-01

    Full Text Available Cloud computing is a paradigm shift in the field of computing. It is moving at an incredibly fast pace and is one of the fastest evolving domains of computer science today. It consists of a set of technologies and service models that concentrate on the internet-based use and delivery of IT applications, processing capability, storage and memory space. There is a shift from traditional in-house servers and applications to the next generation of cloud computing applications. With many of the computing giants like Google, Microsoft, etc. entering the cloud computing arena, there will be thousands of applications running on the cloud. There are several cloud environments available in the market today which support a huge consumer base. Eventually this will lead to a multitude of standards, technologies and products being provided on the cloud. Consumers will need certain degrees of flexibility to use the cloud applications/services of their choice and at the same time will need these applications/services to communicate with each other. This paper emphasizes cloud computing and provides a solution to achieve interoperability in the form of Web Services. The paper also provides a live case study where interoperability comes into play: connecting Google App Engine and Microsoft Windows Azure Platform, two of the leading cloud platforms available today. GAE and WAP are two cloud frameworks which have very little in common, making interoperability an absolute necessity.

  3. Evaluation of Multistrategy Classifiers for Heterogeneous Ontology Matching On the Semantic Web

    Institute of Scientific and Technical Information of China (English)

    PAN Le-yun; LIU Xiao-qiang; MA Fan-yuan

    2005-01-01

    On the Semantic Web, data interoperability and ontology heterogeneity are becoming ever more important issues. To resolve these problems, multiple classification methods can be used to learn the matching between ontologies. The paper uses a general statistical classification method to discover category features in data instances and uses the first-order learning algorithm FOIL to exploit the semantic relations among data instances. When using a multistrategy learning approach, a central problem is the evaluation of the multistrategy classifiers. The goal and the conditions of using multistrategy classifiers within ontology matching are different from those for general text classification. This paper describes a combination rule for multiple classifiers called the Best Outstanding Champion, which is suitable for heterogeneous ontology mapping. Given the prediction results of the individual methods, the rule can well accumulate the correct matchings of each standalone classifier. The experiments show that the approach achieves high accuracy on real-world domains.
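The exact Best Outstanding Champion rule is specific to the paper; as a hedged stand-in, the sketch below combines classifiers by letting the single most confident prediction win, which captures the general idea of keeping the matching proposed by the strongest individual classifier. Labels and confidences are invented.

```python
# Hypothetical combination rule (not the paper's exact formula):
# each classifier proposes a mapping with a confidence, and the most
# confident proposal is kept as the combined prediction.

def combine(predictions):
    """predictions: list of (label, confidence) pairs, one per classifier."""
    return max(predictions, key=lambda p: p[1])[0]

# e.g. a statistical classifier vs. a FOIL-style relational learner
print(combine([("Ontology_A:Car", 0.62), ("Ontology_A:Vehicle", 0.85)]))
# → Ontology_A:Vehicle
```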

  4. Fusion is possible only with interoperability agreements; the GEOSS experience

    Science.gov (United States)

    Percivall, G.

    2008-12-01

    Data fusion is defined for this session as the merging of disparate data sources for multidisciplinary study. Implicit in this definition is that the data consumer may not be intimately familiar with the data sources. In order to achieve fusion of the data, there must be generalized concepts that apply to both the data sources and the consumer, and those concepts must be implemented in our information systems. The successes of GEOSS depend on data and information providers accepting and implementing a set of interoperability arrangements, including technical specifications for collecting, processing, storing, and disseminating shared data, metadata, and products. GEOSS interoperability is based on non-proprietary standards, with preference given to formal international standards. GEOSS requires a scientific basis for the collection, processing and interpretation of the data. Use of standards is a hallmark of a sound scientific basis. In order to communicate effectively to achieve data fusion, interoperability arrangements must be based upon sound scientific principles that have been implemented in efficient and effective tools. Establishing such interoperability arrangements depends upon social processes and technology. Through the use of interoperability arrangements based upon standards, GEOSS achieves data fusion in order to answer humanity's critical questions. Decision making in support of societal benefit areas depends upon data fusion in multidisciplinary settings.

  5. Political, policy and social barriers to health system interoperability: emerging opportunities of Web 2.0 and 3.0.

    Science.gov (United States)

    Juzwishin, Donald W M

    2009-01-01

    Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendency of Web 2.0 and 3.0, although still unproven, signals the opportunity to accelerate patients' access to health information and their health record. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited.

  6. Lottery semantics

    NARCIS (Netherlands)

    P. Galliani; A.L. Mann

    2010-01-01

    We present a compositional semantics for a logic of imperfect information and prove its equivalence to equilibrium semantics ([10]), thus extending to mixed (rather than just behavioural) strategies part of the work of ([2], [3]).

  7. Heterogeneous software system interoperability through computer-aided resolution of modeling differences

    OpenAIRE

    Young, Paul E.

    2002-01-01

    Approved for public release; distribution is unlimited Meeting future system requirements by integrating existing stand-alone systems is attracting renewed interest. Computer communications advances, functional similarities in related systems, and enhanced information description mechanisms suggest that improved capabilities may be possible; but full realization of this potential can only be achieved if stand-alone systems are fully interoperable. Interoperability among independently devel...

  8. D-ATM, a working example of health care interoperability: From dirt path to gravel road.

    Science.gov (United States)

    DeClaris, John-William

    2009-01-01

    For many years, there have been calls for interoperability within health care systems. The technology currently exists and is being used in business areas like banking and commerce, to name a few. Yet the question remains, why has interoperability not been achieved in health care? This paper examines issues encountered and success achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government funded interoperable patient management system. The goal of this paper is to provide lessons learned and propose one possible road map for health care interoperability within private industry and how government can help. PMID:19963614

  9. Modeling and formal representation of geospatial knowledge for the Geospatial Semantic Web

    Science.gov (United States)

    Huang, Hong; Gong, Jianya

    2008-12-01

    GML can only achieve geospatial interoperation at the syntactic level. However, in most occasions it is necessary to resolve differences of spatial cognition in the first place, so ontology was introduced to describe geospatial information and services. But it is obviously difficult and improper to let users find, match and compose services, especially when there are complicated business logics. Currently, with the gradual introduction of Semantic Web technology (e.g., OWL, SWRL), the focus of the interoperation of geospatial information has shifted from the syntactic level to the semantic and even automatic, intelligent level. In this way, the Geospatial Semantic Web (GSM) can be put forward as an augmentation to the Semantic Web that additionally includes geospatial abstractions as well as related reasoning, representation and query mechanisms. To advance the implementation of the GSM, we first attempt to construct mechanisms for the modeling and formal representation of geospatial knowledge, which are also the two most foundational phases in knowledge engineering (KE). Our attitude in this paper is quite pragmatic: we argue that geospatial context is a formal model of the discriminating environmental characteristics of geospatial knowledge, and that the derivation, understanding and use of geospatial knowledge are located in geospatial context. Therefore, first, we put forward a primitive hierarchy of geospatial knowledge referencing first-order logic, formal ontologies, rules and GML. Second, a metamodel of geospatial context is proposed, and we use the modeling methods and representation languages of formal ontologies to process geospatial context. Thirdly, we extend the Web Processing Service (WPS) to be compatible with local DLLs for geoprocessing and to possess inference capability based on OWL.

  10. CCP interoperability and system stability

    Science.gov (United States)

    Feng, Xiaobing; Hu, Haibo

    2016-09-01

    To control counterparty risk, financial regulations such as the Dodd-Frank Act are increasingly requiring standardized derivatives trades to be cleared by central counterparties (CCPs). It is anticipated that in the near term future, CCPs across the world will be linked through interoperability agreements that facilitate risk sharing but also serve as a conduit for transmitting shocks. This paper theoretically studies a networked network with CCPs that are linked through interoperability arrangements. The major finding is that the different configurations of networked network CCPs contribute to the different properties of the cascading failures.

  11. Federated Spatial Databases and Interoperability

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    It is a period of information explosion. Especially in spatial information science, information can be acquired in many ways, such as from man-made satellites, aeroplanes, lasers, digital photogrammetry and so on. Spatial data sources are usually distributed and heterogeneous. A federated database is the best resolution for the sharing and interoperation of spatial databases. In this paper, the concepts of federated databases and interoperability are introduced. Three heterogeneous kinds of spatial data, vector, image and DEM, are used to create an integrated database. A data model of federated spatial databases is given.

  12. Intercloud Architecture for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.J.; Ngo, C.

    2011-01-01

    This paper presents on-going research to develop the Intercloud Architecture (ICA) Framework that should address problems in multi-provider multi-domain heterogeneous Cloud based infrastructure services and applications integration and interoperability, including integration and interoperability wit

  13. Dynamic Business Networks: A Headache for Sustainable Systems Interoperability

    Science.gov (United States)

    Agostinho, Carlos; Jardim-Goncalves, Ricardo

    Collaborative networked environments emerged with the spread of the internet, contributing to overcoming past communication barriers and identifying interoperability as an essential property. When interoperability is achieved seamlessly, efficiency is increased in the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with the different partners, or in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are only defined once, and the morphisms that represent them are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to answer new customer requirements. This paper draws concepts from the complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.

  14. Interoperability between .Net framework and Python in Component way

    Directory of Open Access Journals (Sweden)

    M. K. Pawar

    2013-01-01

    Full Text Available The objective of this work is to achieve interoperability of distributed objects based on CORBA middleware technology and standards. The distributed objects for the client-server technology are implemented in the C#.Net framework and the Python language. The interoperability result shows the possibility of applications in which objects can communicate in different environments and different languages. It also analyzes how to achieve client-server communication in a heterogeneous environment using the omniORBpy IDL compiler and the IIOP.NET IDLtoCLS mapping. Results were obtained that demonstrate the interoperability between the .Net Framework and the Python language. This paper also summarizes a set of fairly simple examples using some reasonably complex software tools.

  15. A Research on E-learning Resources Construction Based on Semantic Web

    Science.gov (United States)

    Rui, Liu; Maode, Deng

    Traditional e-learning platforms have the flaw that it is usually difficult to query or position resources, and to realize cross-platform sharing and interoperability. In this paper, the Semantic Web and metadata standards are discussed, and a kind of e-learning system framework based on the Semantic Web is put forward to try to resolve the flaws of traditional e-learning platforms.

  16. Potential interoperability problems facing multi-site radiation oncology centers in The Netherlands

    Science.gov (United States)

    Scheurleer, J.; Koken, Ph; Wessel, R.

    2014-03-01

    Aim: To identify potential interoperability problems facing multi-site Radiation Oncology (RO) departments in the Netherlands and solutions for unambiguous multi-system workflows. Specific challenges confronting the RO department of VUmc (RO-VUmc), which is soon to open a satellite department, were characterized. Methods: A nationwide questionnaire survey was conducted to identify possible interoperability problems and solutions. Further detailed information was obtained by in-depth interviews at 3 Dutch RO institutes that already operate in more than one site. Results: The survey had a 100% response rate (n=21). Altogether 95 interoperability problems were described. Most reported problems were on a strategic and semantic level. The majority were DICOM(-RT) and HL7 related (n=65), primarily between treatment planning and verification systems or between departmental and hospital systems. Seven were identified as being relevant for RO-VUmc. Departments have overcome interoperability problems with their own, or with tailor-made vendor solutions. There was little knowledge about or utilization of solutions developed by Integrating the Healthcare Enterprise Radiation Oncology (IHE-RO). Conclusions: Although interoperability problems are still common, solutions have been identified. Awareness of IHE-RO needs to be raised. No major new interoperability problems are predicted as RO-VUmc develops into a multi-site department.

  17. Potential interoperability problems facing multi-site radiation oncology centers in The Netherlands

    International Nuclear Information System (INIS)

    Aim: To identify potential interoperability problems facing multi-site Radiation Oncology (RO) departments in the Netherlands and solutions for unambiguous multi-system workflows. Specific challenges confronting the RO department of VUmc (RO-VUmc), which is soon to open a satellite department, were characterized. Methods: A nationwide questionnaire survey was conducted to identify possible interoperability problems and solutions. Further detailed information was obtained by in-depth interviews at 3 Dutch RO institutes that already operate in more than one site. Results: The survey had a 100% response rate (n=21). Altogether 95 interoperability problems were described. Most reported problems were on a strategic and semantic level. The majority were DICOM(-RT) and HL7 related (n=65), primarily between treatment planning and verification systems or between departmental and hospital systems. Seven were identified as being relevant for RO-VUmc. Departments have overcome interoperability problems with their own, or with tailor-made vendor solutions. There was little knowledge about or utilization of solutions developed by Integrating the Healthcare Enterprise Radiation Oncology (IHE-RO). Conclusions: Although interoperability problems are still common, solutions have been identified. Awareness of IHE-RO needs to be raised. No major new interoperability problems are predicted as RO-VUmc develops into a multi-site department.

  18. The interoperability force in the ERP field

    OpenAIRE

    Boza Garcia, Andres; Cuenca, L.; Poler Escoto, Raúl; Michaelides, Zenon

    2015-01-01

    Enterprise resource planning (ERP) systems participate in interoperability projects, and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To do this, ERP systems have first been identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements have been identified from two perspectives: technological...

  19. Semantic interoperability in sensor applications : Making sense of sensor data

    NARCIS (Netherlands)

    Brandt, P.; Basten, T.; Stuijk, S.; Bui, V.; Clercq, P. de; Ferreira Pires, L.; Sinderen, M. van

    2013-01-01

    Much effort has been spent on the optimization of sensor networks, mainly concerning their performance and power efficiency. Furthermore, open communication protocols for the exchange of sensor data have been developed and widely adopted, making sensor data widely available for software applications

  20. Contexts for concepts: Information modeling for semantic interoperability

    NARCIS (Netherlands)

    Oude Luttighuis, P.H.W.M.; Stap, R.E.; Quartel, D.

    2011-01-01

    Conceptual information modeling is a well-established practice, aimed at preparing the implementation of information systems, the specification of electronic message formats, and the design of information processes. Today's ever more connected world however poses new challenges for conceptual inform

  1. Semantic Enterprise Optimizer and Coexistence of Data Models

    OpenAIRE

    P. A. Sundararajan; Anupama Nithyanand; Subrahmanya, S. V.

    2012-01-01

    The authors propose a semantic ontology–driven enterprise data–model architecture for interoperability, integration, and adaptability for evolution, by autonomic agent-driven intelligent design of logical as well as physical data models in a heterogeneous distributed enterprise through its life cycle. An enterprise-standard ontology (in Web Ontology Language [OWL] and Semantic Web Rule Language [SWRL]) for data is required to enable an automated data platform that adds life-cycle activities t...

  2. Improving Groundwater Data Interoperability: Results of the Second OGC Groundwater Interoperability Experiment

    Science.gov (United States)

    Lucido, J. M.; Booth, N.

    2014-12-01

    Interoperable sharing of groundwater data across international borders is essential for the proper management of global water resources. However, storage and management of groundwater data are often distributed across many agencies or organizations. Furthermore, these data may be represented in disparate proprietary formats, posing a significant challenge for integration. For this reason, standard data models are required to achieve interoperability across geographical and political boundaries. The GroundWater Markup Language 1.0 (GWML1) was developed in 2010 as an extension of the Geography Markup Language (GML) in order to support groundwater data exchange within Spatial Data Infrastructures (SDI). In 2013, development of GWML2 was initiated under the sponsorship of the Open Geospatial Consortium (OGC) for intended adoption by the international community as the authoritative standard for the transfer of groundwater feature data, including data about water wells, aquifers, and related entities. GWML2 harmonizes GWML1 and the EU's INSPIRE models related to geology and hydrogeology. Additionally, an interoperability experiment was initiated to test the model for commercial, technical, scientific, and policy use cases. The scientific use case focuses on the delivery of data required as input to the computational flow modeling software used to determine the flow of groundwater within a particular aquifer system. It involves the delivery of properties associated with hydrogeologic units, observations related to those units, and information about the related aquifers. To test this use case, web services are being implemented using GWML2 and WaterML2, which is the authoritative standard for water time series observations, in order to serve USGS water well and hydrogeologic data via standard OGC protocols. Furthermore, integration of these data into a computational groundwater flow model will be tested. This submission will present the GWML2 information model and results

  3. Semantic Web, Agent and Network-Virtual Society

    Institute of Scientific and Technical Information of China (English)

    戴欣; 申瑞民; 张同珍

    2003-01-01

    This paper discusses one realizable mode of the SW (Semantic Web), called NVS (Network-Virtual Society). The SW is regarded as the next-generation Web. By adding semantics to the Web, the SW provides interoperability between applications and facilities to enable automated processing of Web resources. Agents will be the executors in this automated process. After analyzing related theories and technologies, we put forward the concept and mode of NVS, and give our reasons.

  4. Open Health Tools: Tooling for Interoperable Healthcare

    Directory of Open Access Journals (Sweden)

    Skip McGaughey

    2008-11-01

    Full Text Available The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will "raise the interoperability bar" as a result of having tools that just work. To achieve these lofty goals, careful consideration must be given to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT's impact on these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase "code is king" underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  5. Automated testing of healthcare document transformations in the PICASSO interoperability platform

    OpenAIRE

    Pascale, Massimo; Roselli, Marcello; Rugani, Umberto; Bartolini, Cesare; Bertolino, Antonia; Lonetti, Francesca; Marchetti, Eda; Polini, Andrea

    2009-01-01

    In every application domain, achieving interoperability among heterogeneous information systems is a crucial challenge, and alliances are formed to standardize data-exchange formats. In the healthcare sector, HL7-V3 provides the current international reference models for clinical and administrative documents. Codices, an Italian company, provides the PICASSO platform, which uses HL7-V3 as the pivot format to quickly achieve a high degree of interoperability among health-related applicat...

  6. The interoperability force in the ERP field

    Science.gov (United States)

    Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon

    2015-04-01

    Enterprise resource planning (ERP) systems participate in interoperability projects and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To go about this, ERP systems have been first identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements have been identified from two perspectives: technological and business. The ERP field is evolving from classical ERP as information system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way of running business, and ERP systems are changing to adapt to the current stream of interoperability.

  7. A Review of Ontologies with the Semantic Web in View.

    Science.gov (United States)

    Ding, Ying

    2001-01-01

    Discusses the movement of the World Wide Web from the first generation to the second, called the Semantic Web. Provides an overview of ontology, a philosophical theory about the nature of existence being applied to artificial intelligence that will have a crucial role in enabling content-based access, interoperability, and communication across the…

  8. Extravehicular activity space suit interoperability.

    Science.gov (United States)

    Skoog, A. I.; McBarron, J. W., 2nd; Severin, G. I.

    1995-10-01

    The European Space Agency (ESA) and the Russian Space Agency (RKA) are jointly developing a new space suit system for improved extravehicular activity (EVA) capabilities in support of the MIR Space Station Programme, the EVA Suit 2000. Recent national policy agreements between the U.S. and Russia on planned cooperation in manned space also include joint extravehicular activity (EVA). With an increased number of space suit systems and a higher operational frequency towards the end of this century, improved interoperability for both routine and emergency operations is of eminent importance. It is thus timely to report the current status of ongoing work on international EVA interoperability being conducted by the Committee on EVA Protocols and Operations of the International Academy of Astronautics, initiated in 1991. This paper summarises the current EVA interoperability issues to be harmonised and presents quantified vehicle interface requirements for the current U.S. Shuttle EMU, the Russian MIR Orlan DMA, and the new European/Russian EVA Suit 2000 extravehicular systems. Major critical/incompatible interfaces for suit/mother-craft combinations are discussed, and recommendations for standardisation are given.

  9. Semantic Desktop

    Science.gov (United States)

    Sauermann, Leo; Kiesel, Malte; Schumacher, Kinga; Bernardi, Ansgar

    This contribution shows what the workplace of the future could look like and where the Semantic Web opens up new possibilities. To this end, approaches from the fields of the Semantic Web, knowledge representation, desktop applications and visualization are presented that make it possible to reinterpret and reuse a user's existing data. The combination of the Semantic Web and desktop computing brings particular advantages, a paradigm known as the Semantic Desktop. The described possibilities for application integration are not limited to the desktop, however, but can equally be used in web applications.

  10. PROPOSED CONCEPTUAL DEVELOPMENT LEVELS FOR IDEAL INTEROPERABILITY AND SECURITY IN MODERN DIGITAL GOVERNMENT

    Directory of Open Access Journals (Sweden)

    Md. Headayetullah

    2010-06-01

    protocol and ideal interoperability are considered the imperative issues for achieving a sophisticated phase of modern digital government.

  11. Comparison Latent Semantic and WordNet Approach for Semantic Similarity Calculation

    CERN Document Server

    Wicaksana, I Wayan Simri

    2011-01-01

    Information exchange among the many sources on the Internet is increasingly autonomous, dynamic and free. This situation drives differing views of concepts among sources. For example, the word 'bank' means an economic institution in the economy domain, but in the ecology domain it is defined as the slope of a river or lake. In this paper, we evaluate latent semantic and WordNet approaches to calculating semantic similarity. The evaluation is run on concepts from different domains, with reference judgments by experts. Results of the evaluation can contribute to concept mapping, query rewriting, interoperability, etc.
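    The WordNet-style approach the abstract evaluates can be illustrated with a path-based Wu-Palmer similarity. The sketch below uses an invented toy taxonomy (not WordNet itself, and not the paper's code); note how the two senses of 'bank' receive a low score because their only shared ancestor is the root.

```python
# Illustrative Wu-Palmer-style similarity over a hypothetical is-a taxonomy.
# child -> parent edges; None marks the root
PARENT = {
    "bank(finance)": "institution",
    "institution": "entity",
    "bank(river)": "slope",
    "slope": "landform",
    "landform": "entity",
    "entity": None,
}

def path_to_root(node):
    """Return the list [node, parent, ..., root]."""
    path = []
    while node is not None:
        path.append(node)
        node = PARENT[node]
    return path

def wu_palmer(a, b):
    """Wu-Palmer similarity: 2*depth(LCS) / (depth(a) + depth(b))."""
    pa, pb = path_to_root(a), path_to_root(b)
    ancestors_a = set(pa)
    # lowest common subsumer: first ancestor of b that also subsumes a
    lcs = next(n for n in pb if n in ancestors_a)
    depth = lambda n: len(path_to_root(n))  # root has depth 1
    return 2.0 * depth(lcs) / (depth(a) + depth(b))

print(wu_palmer("bank(finance)", "bank(river)"))  # low: only 'entity' shared
```

Identical concepts score 1.0, while the two 'bank' senses score 2/7, mirroring the cross-domain concept mismatch described in the abstract.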

  12. Understanding semantics

    DEFF Research Database (Denmark)

    Thrane, Torben

    1997-01-01

    Understanding natural language is a cognitive, information-driven process. Discussing some of the consequences of this fact, the paper offers a novel look at the semantic effect of lexical nouns and the identification of reference types.

  13. K-Means Based Fingerprint Segmentation with Sensor Interoperability

    Directory of Open Access Journals (Sweden)

    Xiukun Yang

    2010-01-01

    Full Text Available A critical step in an automatic fingerprint recognition system is the segmentation of fingerprint images. Existing methods are usually designed to segment fingerprint images originated from a certain sensor. Thus their performances are significantly affected when dealing with fingerprints collected by different sensors. This work studies the sensor interoperability of fingerprint segmentation algorithms, which refers to the algorithm's ability to adapt to the raw fingerprints obtained from different sensors. We empirically analyze the sensor interoperability problem, and effectively address the issue by proposing a k-means based segmentation method called SKI. SKI clusters foreground and background blocks of a fingerprint image based on the k-means algorithm, where a fingerprint block is represented by a 3-dimensional feature vector consisting of block-wise coherence, mean, and variance (abbreviated as CMV. SKI also employs morphological postprocessing to achieve favorable segmentation results. We perform SKI on each fingerprint to ensure sensor interoperability. The interoperability and robustness of our method are validated by experiments performed on a number of fingerprint databases which are obtained from various sensors.
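    The CMV clustering idea can be sketched in a few lines: each block is a (coherence, mean, variance) vector, and plain k-means with k=2 splits foreground from background. The feature values below are synthetic and the morphological post-processing step is omitted; this is an illustration of the technique, not the authors' SKI implementation.

```python
# Toy k-means (k=2) over synthetic block-wise CMV feature vectors.
import random

def kmeans2(points, iters=20, seed=0):
    """Plain k-means with k=2 on 3-D feature vectors."""
    rng = random.Random(seed)
    centroids = rng.sample(points, 2)
    for _ in range(iters):
        clusters = [[], []]
        for p in points:
            # assign each block to the nearest centroid (squared distance)
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[d.index(min(d))].append(p)
        # recompute centroids; keep the old one if a cluster went empty
        centroids = [
            tuple(sum(xs) / len(xs) for xs in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# hypothetical CMV features: foreground blocks have high coherence/variance
foreground = [(0.9, 120.0, 900.0), (0.85, 110.0, 850.0), (0.95, 130.0, 950.0)]
background = [(0.1, 200.0, 50.0), (0.15, 210.0, 60.0), (0.05, 190.0, 40.0)]
centroids, clusters = kmeans2(foreground + background)
print([len(c) for c in clusters])
```

With well-separated features the two clusters recover the foreground/background split exactly, which is why the paper's per-fingerprint clustering adapts to whatever value ranges a given sensor produces.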

  14. Vocabulary services to support scientific data interoperability

    Science.gov (United States)

    Cox, Simon; Mills, Katie; Tan, Florence

    2013-04-01

    Shared vocabularies are a core element in interoperable systems. Vocabularies need to be available at run-time, and where the vocabularies are shared by a distributed community this implies the use of web technology to provide vocabulary services. Given the ubiquity of vocabularies or classifiers in systems, vocabulary services are effectively the base of the interoperability stack. In contemporary knowledge organization systems, a vocabulary item is considered a concept, with the "terms" denoting it appearing as labels. The Simple Knowledge Organization System (SKOS) formalizes this as an RDF Schema (RDFS) application, with a bridge to formal logic in Web Ontology Language (OWL). For maximum utility, a vocabulary should be made available through the following interfaces:
    * the vocabulary as a whole, at an ontology URI corresponding to a vocabulary document
    * each item in the vocabulary, at the item URI
    * summaries, subsets, and resources derived by transformation
    * the standard RDF web API, i.e. a SPARQL endpoint
    * a query form for human users.
    However, the vocabulary data model may be leveraged directly in a standard vocabulary API that uses the semantics provided by SKOS. SISSvoc3 [1] accomplishes this as a standard set of URI templates for a vocabulary. Any URI conforming to the template selects a vocabulary subset based on the SKOS properties, including labels (skos:prefLabel, skos:altLabel, rdfs:label) and a subset of the semantic relations (skos:broader, skos:narrower, etc.). SISSvoc3 thus provides a RESTful SKOS API to query a vocabulary while hiding the complexity of SPARQL. It has been implemented using the Linked Data API (LDA) [2], which connects to a SPARQL endpoint. By using LDA, we also get content negotiation, alternative views, paging, metadata and other functionality provided in a standard way. A number of vocabularies have been formalized in SKOS and deployed by CSIRO, the Australian Bureau of Meteorology (BOM) and their
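    The SKOS-based lookup such a vocabulary service exposes can be mimicked with a tiny in-memory triple list standing in for the SPARQL endpoint. The vocabulary content and helper names below are hypothetical and are not part of SISSvoc3 or the LDA; they only sketch how label matching and skos:broader traversal answer the URI-template queries.

```python
# Minimal stand-in for a SKOS vocabulary service: triples are
# (subject, predicate, object); the vocabulary itself is invented.
SKOS = "http://www.w3.org/2004/02/skos/core#"

triples = [
    ("ex:granite", SKOS + "prefLabel", "granite"),
    ("ex:granite", SKOS + "altLabel", "granitic rock"),
    ("ex:granite", SKOS + "broader", "ex:igneous-rock"),
    ("ex:igneous-rock", SKOS + "prefLabel", "igneous rock"),
]

def vocab_lookup(label):
    """Resolve a label to concept URIs (any-label match, as a
    SISSvoc-style .../concept?anylabel=... template would)."""
    label_props = {SKOS + "prefLabel", SKOS + "altLabel"}
    return [s for s, p, o in triples if p in label_props and o == label]

def broader(uri):
    """Follow skos:broader links one step up the concept hierarchy."""
    return [o for s, p, o in triples if s == uri and p == SKOS + "broader"]

concept = vocab_lookup("granite")[0]
print(concept, "->", broader(concept))
```

A real deployment would answer the same questions with SPARQL behind the Linked Data API; the point here is that the SKOS properties alone carry enough semantics to drive the whole query interface.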

  15. HEALTH SYSTEMS INTEROPERABILITY: ANALYSIS AND COMPARISON

    OpenAIRE

    Guedria, Wided; Lamine, Elyes; Pingaud, Hervé

    2014-01-01

    Conference with proceedings and peer review, international. International audience: "Promoting eHealth interoperability is a priority in Europe to enhance the quality and safety of patient care. However, this priority is very difficult to establish. Developing an interoperable system, or controlling systems interoperation, has been approached from multiple points of view, with many dimensions and under various types of approaches. Several studies and initiatives have been proposed in the fi...

  16. Interoperability Issues for VPN IPsec Solutions

    Directory of Open Access Journals (Sweden)

    Iulian Danalachi

    2011-03-01

    Full Text Available An issue of testing that should be taken into consideration is the compatibility and interoperability of the IPsec components when implementing an IPsec solution. This article will guide us through some key introductory notions involved in the interoperability problem; we will then see a short overview of some of these problems, and afterwards we will discuss some of the testing solutions for IPsec interoperability that we should take into consideration.

  17. Information interoperability and information standardisation for NATO C2 - a practical approach

    OpenAIRE

    Lasschuyt, E.; Hekken, M.C. van

    2001-01-01

    Interoperability between information systems is usually 'achieved' by enabling connection at network level. Making systems really interoperable, by letting them understand and manipulate the exchanged information, requires a lot more. Above all, information standards are needed in order to gain common understanding about what will be exchanged. Besides that, information standardisation should be considered from a global point of view, taking into account the whole range of systems that will p...

  18. Interoperability for Space Mission Monitor and Control: Applying Technologies from Manufacturing Automation and Process Control Industries

    Science.gov (United States)

    Jones, Michael K.

    1998-01-01

    Various issues associated with interoperability for space mission monitor and control are presented in viewgraph form. Specific topics include: 1) Space Project Mission Operations Control Architecture (SuperMOCA) goals and methods for achieving them; 2) Specifics on the architecture: open standards and layering, enhancing interoperability, and promoting commercialization; 3) An advertisement; 4) Status of the task - government/industry cooperation and architecture and technology demonstrations; and 5) Key features of messaging services and virtual devices.

  19. On the use of an Interoperability Framework in Coopetition Context

    OpenAIRE

    Guédria, Wided; Golnam, Arash; Naudet, Yannick; Chen, David; Wegmann, Alain

    2011-01-01

    The simultaneous cooperation and competition between companies, referred to as coopetition in the strategy literature, is becoming a recurring theme in business settings. Companies cooperate with their competitors to gain access to supplementary and complementary resources and capabilities, in order to create more value for the customers and achieve sustainable value creation and distribution. To coopete, the companies need to be interoperable. Growing globalization, competitiveness ...

  20. Interoperability Infrastructure and Incremental learning for unreliable heterogeneous communicating Systems

    OpenAIRE

    Haseeb, Abdul

    2009-01-01

    In a broader sense the main research objective of this thesis (and ongoing research work) is distributed knowledge management for mobile dynamic systems. The primary focus of the presented work, however, is the communication/interoperability of heterogeneous entities in an infrastructure-less paradigm, a distributed resource manipulation infrastructure, and distributed learning in the absence of global knowledge. The research objectives achieved discover the design aspects of heterogeneous distribu...

  1. OGC Geographic Information Service Deductive Semantic Reasoning Based on Description Vocabularies Reduction

    OpenAIRE

    Miao, Lizhi; Xu, Jie; Zhou, Ya; Cheng, Wenchao

    2015-01-01

    As geographic information interoperability and sharing develop, more and more interoperable OGC (Open Geospatial Consortium) Web services (OWS) are generated and published through the internet. These services can facilitate the integration of different scientific applications by searching, finding, and utilizing the large number of scientific data and Web services. However, these services are widely dispersed and hard to find and utilize through effective semantic retrieval. This is espe...

  2. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Madureira, António; Hartog, den Frank; Silva, Eduardo; Baken, Nico; Zhao, L.; Macaulay, L.

    2009-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks s

  3. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Madureira, A.; Hartog, F.T.H. den; Silva, E.; Baken, N.

    2010-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks s

  4. Analyzing Interoperability of Protocols Using Model Checking

    Institute of Scientific and Technical Information of China (English)

    WU Peng

    2005-01-01

    In practical terms, protocol interoperability testing is still laborious and error-prone with little effect, even for those products that have passed conformance testing. Deadlock and unsymmetrical data communication are familiar in interoperability testing, and it is always very hard to trace their causes. The previous work has not provided a coherent way to analyze why interoperability was broken among protocol implementations under test. In this paper, an alternative approach is presented to analyzing these problems from the viewpoint of implementation structures. Sequential and concurrent structures are both representative implementation structures, especially in the event-driven development model. Our research mainly discusses the influence of sequential and concurrent structures on interoperability, with two instructive conclusions: (a) a sequential structure may lead to deadlock; (b) a concurrent structure may lead to unsymmetrical data communication. Therefore, implementation structures carry weight on interoperability, which may not have gained much attention before. To some extent, they are decisive on the result of interoperability testing. Moreover, a concurrent structure with a sound task-scheduling strategy may contribute to the interoperability of a protocol implementation. Herein, model checking techniques are introduced into interoperability analysis for the first time. As the paper shows, this is an effective way to validate developers' selections of implementation structures or strategies.
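    Conclusion (a) above, that a sequential structure may lead to deadlock, can be hinted at with a brute-force reachability search: two peers whose sequential program is [receive, then send] are composed, and every reachable state of the composition is explored. This is a toy stand-in for model checking, not the paper's formal model; the state encoding is invented for illustration.

```python
# Tiny reachability analysis exposing a deadlock caused by a
# sequential [recv, send] structure on both peers.
from collections import deque

def step(state):
    """Yield successors of (pc1, pc2, chan12, chan21). Each peer's
    sequential program is: step 0 = recv, step 1 = send, step 2 = done."""
    pc1, pc2, c12, c21 = state
    if pc1 == 0 and c21:                      # peer 1 recv (needs a message)
        yield (1, pc2, c12, c21 - 1)
    if pc1 == 1:                              # peer 1 send
        yield (2, pc2, c12 + 1, c21)
    if pc2 == 0 and c12:                      # peer 2 recv
        yield (pc1, 1, c12 - 1, c21)
    if pc2 == 1:                              # peer 2 send
        yield (pc1, 2, c12, c21 + 1)

def find_deadlocks(init):
    """BFS over reachable states; a deadlock is a stuck, non-final state."""
    seen, queue, deadlocks = {init}, deque([init]), []
    while queue:
        s = queue.popleft()
        succs = list(step(s))
        if not succs and (s[0], s[1]) != (2, 2):  # (2, 2) = both done
            deadlocks.append(s)
        for t in succs:
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return deadlocks

print(find_deadlocks((0, 0, 0, 0)))  # both peers block on recv immediately
```

The search finds that the initial state itself is stuck: each peer waits to receive before sending, so neither channel ever carries a message, which is precisely the sequential-structure deadlock the paper attributes to implementation structure rather than to the protocol.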

  5. Interoperation Modeling for Intelligent Domotic Environments

    Science.gov (United States)

    Bonino, Dario; Corno, Fulvio

    This paper introduces an ontology-based model for domotic device inter-operation. Starting from a previously published ontology (DogOnt) a refactoring and extension is described allowing to explicitly represent device capabilities, states and commands, and supporting abstract modeling of device inter-operation.

  6. Interoperability of Web Archives and Digital Libraries

    DEFF Research Database (Denmark)

    Kalb, Hendrik; Lazaridou, Paraskevi; Pinsent, Edward;

    2013-01-01

    The interoperability of web archives and digital libraries is crucial to avoid silos of preserved data and content. While various research efforts focus on specific facets of the challenge to interoperate, there is a lack of empirical work about the overall situation of actual challenges. We conduct a D...

  7. Knowledge-oriented semantics modelling towards uncertainty reasoning.

    Science.gov (United States)

    Mohammed, Abdul-Wahid; Xu, Yang; Liu, Ming

    2016-01-01

    Distributed reasoning in M2M leverages the expressive power of ontology to enable semantic interoperability between heterogeneous systems of connected devices. Ontology, however, lacks the built-in, principled support to effectively handle the uncertainty inherent in M2M application domains. Thus, efficient reasoning can be achieved by integrating the inferential reasoning power of probabilistic representations with the first-order expressiveness of ontology. But there remains a gap with current probabilistic ontologies since state-of-the-art provides no compatible representation for simultaneous handling of discrete and continuous quantities in ontology. This requirement is paramount, especially in smart homes, where continuous quantities cannot be avoided, and simply mapping continuous information to discrete states through quantization can cause a great deal of information loss. In this paper, we propose a hybrid probabilistic ontology that can simultaneously handle distributions over discrete and continuous quantities in ontology. We call this new framework HyProb-Ontology, and it specifies distributions over properties of classes, which serve as templates for instances of classes to inherit as well as overwrite some aspects. Since there cannot be restriction on the dependency topology of models that HyProb-Ontology can induce across different domains, we can achieve a unified Ground Hybrid Probabilistic Model by conditional Gaussian fuzzification of the distributions of the continuous variables in ontology. From the results of our experiments, this unified model can achieve exact inference with better performance over classical Bayesian networks. PMID:27350935

  8. The Information Systems Interoperability Maturity Model (ISIMM): Towards Standardizing Technical Interoperability and Assessment within Government

    Directory of Open Access Journals (Sweden)

    STEFANUS Van Staden

    2012-10-01

    Full Text Available To establish and implement a workable e-Government, all possible and relevant stakeholders' systems need to be inter-connected in such a way that the hardware, software and data are interoperable. Thus, interoperability is the key to information exchange and sharing among heterogeneous systems. In view of this, the paper introduces the Information Systems Interoperability Maturity Model (ISIMM), which defines the levels and degrees of interoperability sophistication that an organisation's information systems will progress through. ISIMM focuses more on the detailed technical aspects of interoperability that allow data to be exchanged and shared within an information system environment. In this way, it provides a practical means of assessing technical interoperability between information system pairs, groups or clusters, and it facilitates a model to measure the maturity and compliance levels of interoperable information systems.

  9. Semantic Deviation in Oliver Twist

    Institute of Scientific and Technical Information of China (English)

    康艺凡

    2016-01-01

    Dickens, with his adeptness with language, applies semantic deviation skillfully in his realistic novel Oliver Twist. However, most studies and comments at home and abroad mainly focus on aspects such as humanity, society, and characters. This thesis will therefore take a stylistic approach to Oliver Twist from the perspective of semantic deviation, which is achieved by the use of irony, hyperbole, and pun, and analyze how the application of the technique makes the novel attractive.

  10. Innovation in OGC: The Interoperability Program

    Directory of Open Access Journals (Sweden)

    George Percivall

    2015-10-01

    Full Text Available The OGC Interoperability Program is a source of innovation in the development of open standards. The approach to innovation is based on hands-on, collaborative engineering leading to more mature standards and implementations. The process of the Interoperability Program engages a community of sponsors and participants based on an economic model that benefits all involved. Each initiative begins with an innovative approach to identifying interoperability needs, followed by agile software development to advance the state of technology to the benefit of society. Over eighty initiatives have been conducted in the Interoperability Program since the breakthrough Web Mapping Testbed began the program in 1999. OGC standards that were initiated in the Interoperability Program are the basis of two-thirds of the certified compliant products.

  11. HTML5 microdata as a semantic container for medical information exchange.

    Science.gov (United States)

    Kimura, Eizen; Kobayashi, Shinji; Ishihara, Ken

    2014-01-01

    Achieving interoperability between clinical electronic medical records (EMR) systems and cloud computing systems is challenging because of the lack of a universal reference method as a standard for information exchange with a secure connection. Here we describe an information exchange scheme using HTML5 microdata, where the standard semantic container is an HTML document. We embed HL7 messages describing laboratory test results in the microdata. We also annotate items in the clinical research report with the microdata. We mapped the laboratory test result data into the clinical research report using an HL7 selector specified in the microdata. This scheme can provide secure cooperation between the cloud-based service and the EMR system. PMID:25160218
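    The microdata scheme can be sketched as follows: a lab observation is embedded as itemscope/itemprop attributes in an HTML document and recovered with Python's standard html.parser. The itemtype URL and property names here are hypothetical, not the authors' actual HL7 mapping; the point is that a plain HTML document can act as the semantic container.

```python
# Embed a lab result as HTML5 microdata, then recover it by parsing.
from html.parser import HTMLParser

DOC = """
<div itemscope itemtype="http://example.org/hl7/Observation">
  <span itemprop="code">718-7</span>
  <span itemprop="value">13.5</span>
  <span itemprop="unit">g/dL</span>
</div>
"""

class MicrodataParser(HTMLParser):
    """Collect itemprop name -> text content pairs from flat microdata."""
    def __init__(self):
        super().__init__()
        self.props = {}
        self._current = None

    def handle_starttag(self, tag, attrs):
        # remember which property the next text node belongs to
        self._current = dict(attrs).get("itemprop")

    def handle_data(self, data):
        if self._current and data.strip():
            self.props[self._current] = data.strip()
            self._current = None

parser = MicrodataParser()
parser.feed(DOC)
print(parser.props)
```

Because the container is ordinary HTML, the same document remains human-readable in a browser while a cloud service extracts the structured values, which is the interoperability property the abstract emphasizes.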

  12. A SEMANTICALLY DISTRIBUTED APPROACH TO MAP IP TRAFFIC MEASUREMENTS TO A STANDARDIZED ONTOLOGY

    Directory of Open Access Journals (Sweden)

    Alfredo Salvador

    2010-01-01

    Full Text Available Traffic monitoring in IP networks is a key issue for operators to guarantee Service Level Agreements both to their clients and with regard to other connectivity providers. Thus, having efficient solutions for traffic measurement and monitoring supports a good deal of their business and is essential to the fair development of the Internet. However, even if service management is well recognized, QoS strategies must evolve from the circuit-switching technological framework towards next-generation networks and convergent services concepts. Standardizing IP traffic measurement is a requirement for the interoperable service-aware management systems upon which future Internet business would be based. A few projects have recently tackled the task of building rich infrastructures to provide IP traffic measurements. The European project MOMENT combines SOA and semantic search concepts: a mediator between clients and measurement tools has been designed in order to offer integrated access to the infrastructures, regardless of their specific details, with the possibility of achieving complex queries. The pervasiveness of ontologies has been used for various purposes in the project. As such, one ontology deals with traffic measurement data, another describes metadata that is used instead of data for practical reasons, a third focuses on the anonymization required for ethical (and legal) restrictions, and the last one describes general concepts from the field. This paper outlines the role of these ontologies and presents the process of deriving them from a set of traffic measurement databases, as well as the integration of specific modules in the mediator to achieve the semantic queries.

  13. Generative Semantics

    Science.gov (United States)

    Bagha, Karim Nazari

    2011-01-01

    Generative semantics is (or perhaps was) a research program within linguistics, initiated by the work of George Lakoff, John R. Ross, Paul Postal and later McCawley. The approach developed out of transformational generative grammar in the mid 1960s, but stood largely in opposition to work by Noam Chomsky and his students. The nature and genesis of…

  14. Inter-Operating Grids Through Delegated MatchMaking

    Directory of Open Access Journals (Sweden)

    Alexandru Iosup

    2008-01-01

    Full Text Available The grid vision of a single computing utility has yet to materialize: while many grids with thousands of processors each exist, most work in isolation. An important obstacle for the effective and efficient inter-operation of grids is the problem of resource selection. In this paper we propose a solution to this problem that combines the hierarchical and decentralized approaches for interconnecting grids. In our solution, a hierarchy of grid sites is augmented with peer-to-peer connections between sites under the same administrative control. To operate this architecture, we employ the key concept of delegated matchmaking, which temporarily binds resources from remote sites to the local environment. With trace-based simulations we evaluate our solution under various infrastructural and load conditions, and we show that it outperforms other approaches to inter-operating grids. Specifically, we show that delegated matchmaking achieves up to 60% more goodput and completes 26% more jobs than its best alternative.
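    The key mechanism, temporarily binding a remote resource to the local environment instead of forwarding the job, can be illustrated with a toy two-site sketch. The topology, capacities and class names are invented for illustration and greatly simplify the paper's hierarchical/peer-to-peer architecture.

```python
# Toy delegated matchmaking: a site that cannot satisfy a job locally
# borrows capacity from a neighbouring site rather than moving the job.
class Site:
    def __init__(self, name, free_cpus):
        self.name, self.free_cpus = name, free_cpus
        self.neighbours = []

    def run(self, job_cpus):
        """Try locally first, then delegate the match to a neighbour."""
        if self.free_cpus >= job_cpus:
            self.free_cpus -= job_cpus
            return self.name
        for n in self.neighbours:
            if n.free_cpus >= job_cpus:
                n.free_cpus -= job_cpus   # remote resource temporarily bound
                return n.name             # to the local site's job
        return None                       # no site can host the job

a, b = Site("siteA", 2), Site("siteB", 8)
a.neighbours.append(b)
placements = [a.run(4), a.run(2), a.run(4)]
print(placements)
```

Even in this tiny setting, jobs too large for siteA complete on siteB's resources without siteA ever relinquishing ownership of its queue, which is the intuition behind the goodput gains reported in the abstract.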

  15. Improving IEC 61850 interoperability : experiences and recommendations

    Energy Technology Data Exchange (ETDEWEB)

    Niejahr, J. [Siemens Canada Ltd., Mississauga, ON (Canada); Englert, H.; Dawidczak, H. [Siemens AG, Munich (Germany)

    2010-07-01

    The worldwide established communication standard for power utility automation is the International Electrotechnical Commission (IEC) 61850. The key drivers for its use are performance, reduced life-cycle costs and interoperability. The major application of IEC 61850 is in substation automation, where practical experience from thousands of installations has been realized. Most of these installations are primarily single-vendor solutions with some special devices from other vendors, while only a few are full multivendor systems. These multivendor projects showed that the interoperability capabilities of the available products and systems are currently limited, requiring additional engineering efforts. This paper provided a definition of interoperability in the context of IEC 61850 and discussed the experiences collected in multivendor projects and interoperability tests. It identified the technical reasons for limited interoperability. In order to help overcome the interoperability limitations and allow the exchange of devices with a minimum of re-engineering, a new concept for flexible IEC 61850 data modeling was also presented. Recommendations were offered as to how this concept could be applied in practice in order to avoid additional engineering costs. It was concluded that the new concept for flexible adaption of IEC 61850 data models and communication services improved the interoperability of products and systems regarding simplicity and functionality. 10 refs., 4 figs.

  16. Political, policy and social barriers to health system interoperability: emerging opportunities of Web 2.0 and 3.0.

    Science.gov (United States)

    Juzwishin, Donald W M

    2009-01-01

Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendancy of Web 2.0 and 3.0, although still unproven, signals an opportunity to accelerate patients' access to health information and their health record. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited. PMID:20166516

  17. Interoperable Solar Data and Metadata via LISIRD 3

    Science.gov (United States)

    Wilson, A.; Lindholm, D. M.; Pankratz, C. K.; Snow, M. A.; Woods, T. N.

    2015-12-01

LISIRD 3 is a major upgrade of the LASP Interactive Solar Irradiance Data Center (LISIRD), which serves several dozen space-based solar irradiance and related data products to the public. Through interactive plots, LISIRD 3 provides data browsing supported by data subsetting and aggregation. Because LISIRD 3 incorporates a semantically enabled metadata repository, users see current, vetted, consistent information about the datasets offered. Users can now also search for datasets based on metadata fields such as dataset type and/or spectral or temporal range. This semantic database enables metadata browsing, so users can discover the relationships between datasets, instruments, spacecraft, missions and PIs. The database also enables creation and publication of metadata records in a variety of formats, such as SPASE or ISO, making these datasets more discoverable. The database also enables the possibility of a public SPARQL endpoint, making the metadata browsable in an automated fashion. LISIRD 3's data access middleware, LaTiS, provides dynamic, on-demand reformatting of data and timestamps, subsetting and aggregation, and other server-side functionality via a RESTful, OPeNDAP-compliant API, enabling interoperability between LASP datasets and many common tools. LISIRD 3's templated front-end design, coupled with the uniform data interface offered by LaTiS, allows easy integration of new datasets. Consequently the number and variety of datasets offered by LISIRD has grown to encompass several dozen, with many more to come. This poster will discuss the design and implementation of LISIRD 3, including tools used, capabilities enabled, and issues encountered.
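A DAP-style subsetting request of the kind the abstract describes can be sketched by building a constraint expression: project the variables wanted, then select a time range. The host, dataset name, variable names and suffix below are assumptions for illustration, not the actual LISIRD/LaTiS endpoint.

```python
# Sketch of an OPeNDAP-style subsetting URL (hypothetical endpoint and names).

def subset_url(base, dataset, variables, start, stop, suffix="csv"):
    """Build a DAP-style constraint: variable projection plus a time-range selection."""
    projection = ",".join(variables)             # which variables to return
    selection = f"&time>={start}&time<={stop}"   # which records to return
    return f"{base}/{dataset}.{suffix}?{projection}{selection}"

url = subset_url("https://example.org/latis/dap", "tsi_daily",
                 ["time", "irradiance"], "2010-01-01", "2010-12-31")
print(url)
# https://example.org/latis/dap/tsi_daily.csv?time,irradiance&time>=2010-01-01&time<=2010-12-31
```

Changing the suffix (e.g. `json`, `txt`) is how such middleware typically selects the on-demand output format mentioned in the abstract.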

  18. Provenance in Data Interoperability for Multi-Sensor Intercomparison

    Science.gov (United States)

    Lynnes, Chris; Leptoukh, Greg; Berrick, Steve; Shen, Suhung; Prados, Ana; Fox, Peter; Yang, Wenli; Min, Min; Holloway, Dan; Enloe, Yonsook

    2008-01-01

As our inventory of Earth science data sets grows, the ability to compare, merge and fuse multiple datasets grows in importance. This requires a deeper data interoperability than we have now. Efforts such as the Open Geospatial Consortium and OPeNDAP (Open-source Project for a Network Data Access Protocol) have broken down format barriers to interoperability; the next challenge is the semantic aspects of the data. Consider the issues when satellite data are merged, cross-calibrated, validated, inter-compared and fused. We must match up data sets that are related, yet different in significant ways: the phenomenon being measured, measurement technique, location in space-time or quality of the measurements. If subtle distinctions between similar measurements are not clear to the user, results can be meaningless or lead to an incorrect interpretation of the data. Most of these distinctions trace to how the data came to be: sensors, processing and quality assessment. For example, monthly averages of satellite-based aerosol measurements often show significant discrepancies, which might be due to differences in spatio-temporal aggregation, sampling issues, sensor biases, algorithm differences or calibration issues. Provenance information must be captured in a semantic framework that allows data inter-use tools to incorporate it and aid in the interpretation of comparison or merged products. Semantic web technology allows us to encode our knowledge of measurement characteristics, phenomena measured, space-time representation, and data quality attributes in a well-structured, machine-readable ontology and rulesets. An analysis tool can use this knowledge to show users the provenance-related distinctions between two variables, advising on options for further data processing and analysis. An additional problem for workflows distributed across heterogeneous systems is retrieval and transport of provenance. Provenance may be either embedded within the data payload, or transmitted
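The core mechanism described here, recording provenance facts and surfacing the distinctions between two variables before they are compared, can be sketched with plain (subject, predicate, object) triples. The dataset names, property names and values below are invented for the example; a real system would use an RDF store and a published ontology.

```python
# Minimal sketch: provenance as triples, plus a diff that shows a user why
# two "similar" aerosol products should not be naively compared.

TRIPLES = {
    ("MODIS_AOD", "sensor", "MODIS"),
    ("MODIS_AOD", "aggregation", "monthly-mean-of-daily"),
    ("MODIS_AOD", "algorithmVersion", "C6"),
    ("MISR_AOD", "sensor", "MISR"),
    ("MISR_AOD", "aggregation", "monthly-mean-of-orbits"),
    ("MISR_AOD", "algorithmVersion", "V23"),
}

def provenance(var):
    """All provenance properties recorded for one variable."""
    return {p: o for s, p, o in TRIPLES if s == var}

def distinctions(a, b):
    """Properties whose values differ between two variables."""
    pa, pb = provenance(a), provenance(b)
    return {p: (pa.get(p), pb.get(p)) for p in pa.keys() | pb.keys()
            if pa.get(p) != pb.get(p)}

diff = distinctions("MODIS_AOD", "MISR_AOD")
print(sorted(diff))   # ['aggregation', 'algorithmVersion', 'sensor']
```

An analysis tool could render such a diff as a warning panel before producing a merged or inter-compared product.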

  19. Integrated semantics service platform for the Internet of Things: a case study of a smart office.

    Science.gov (United States)

    Ryu, Minwoo; Kim, Jaeho; Yun, Jaeseok

    2015-01-19

    The Internet of Things (IoT) allows machines and devices in the world to connect with each other and generate a huge amount of data, which has a great potential to provide useful knowledge across service domains. Combining the context of IoT with semantic technologies, we can build integrated semantic systems to support semantic interoperability. In this paper, we propose an integrated semantic service platform (ISSP) to support ontological models in various IoT-based service domains of a smart city. In particular, we address three main problems for providing integrated semantic services together with IoT systems: semantic discovery, dynamic semantic representation, and semantic data repository for IoT resources. To show the feasibility of the ISSP, we develop a prototype service for a smart office using the ISSP, which can provide a preset, personalized office environment by interpreting user text input via a smartphone. We also discuss a scenario to show how the ISSP-based method would help build a smart city, where services in each service domain can discover and exploit IoT resources that are wanted across domains. We expect that our method could eventually contribute to providing people in a smart city with more integrated, comprehensive services based on semantic interoperability.

  20. Integrated Semantics Service Platform for the Internet of Things: A Case Study of a Smart Office

    Directory of Open Access Journals (Sweden)

    Minwoo Ryu

    2015-01-01

Full Text Available The Internet of Things (IoT) allows machines and devices in the world to connect with each other and generate a huge amount of data, which has a great potential to provide useful knowledge across service domains. Combining the context of IoT with semantic technologies, we can build integrated semantic systems to support semantic interoperability. In this paper, we propose an integrated semantic service platform (ISSP) to support ontological models in various IoT-based service domains of a smart city. In particular, we address three main problems for providing integrated semantic services together with IoT systems: semantic discovery, dynamic semantic representation, and semantic data repository for IoT resources. To show the feasibility of the ISSP, we develop a prototype service for a smart office using the ISSP, which can provide a preset, personalized office environment by interpreting user text input via a smartphone. We also discuss a scenario to show how the ISSP-based method would help build a smart city, where services in each service domain can discover and exploit IoT resources that are wanted across domains. We expect that our method could eventually contribute to providing people in a smart city with more integrated, comprehensive services based on semantic interoperability.
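The semantic-discovery problem both versions of this abstract describe, where a service in one domain finds IoT resources registered by another domain under a shared vocabulary, can be sketched with a tiny registry. The concept URIs, device URNs and registry shape are illustrative assumptions, not the ISSP's actual design.

```python
# Toy cross-domain discovery: resources are registered under shared ontology
# concepts, so any domain can discover them by concept rather than by name.

REGISTRY = []   # (domain, concept, resource_uri)

def register(domain, concept, uri):
    REGISTRY.append((domain, concept, uri))

def discover(concept, exclude_domain=None):
    """Find resources typed by `concept`, optionally only from other domains."""
    return [uri for d, c, uri in REGISTRY
            if c == concept and d != exclude_domain]

register("office", "saref:Thermometer", "urn:dev:temp-42")
register("transport", "saref:Thermometer", "urn:dev:bus-7-temp")

# the smart-office service discovers a usable sensor from the transport domain
print(discover("saref:Thermometer", exclude_domain="office"))
```

The shared concept (`saref:Thermometer` here) is what makes the lookup work across domains; with per-domain vocabularies the same query would return nothing.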

  1. Interoperability of CAD Standards and Robotics in CIME

    DEFF Research Database (Denmark)

    Sørensen, Torben

The research presented in this dissertation concerns the identification of problems and the provision of solutions for increasing the degree of interoperability between CAD, CACSD (Computer Aided Control Systems Design) and CAR (Computer Aided Robotics) in Computer Integrated Manufacturing and Engineering (CIME):· The development of a STEP based interface for general control system data and functions, especially related to robot motion control, for interoperability of CAD, CACSD, and CAR systems, extending the inter-system communication capabilities beyond the stage achieved up to now. This interface development comprehends the following work:· The definition of the concepts of 'information' and 'information model', and the selection of a proper information modeling methodology within the STEP methodologies.· The elaboration of a general function model of a generic robot motion controller in IDEF0 for interface...

  2. Live Social Semantics

    Science.gov (United States)

    Alani, Harith; Szomszor, Martin; Cattuto, Ciro; van den Broeck, Wouter; Correndo, Gianluca; Barrat, Alain

    Social interactions are one of the key factors to the success of conferences and similar community gatherings. This paper describes a novel application that integrates data from the semantic web, online social networks, and a real-world contact sensing platform. This application was successfully deployed at ESWC09, and actively used by 139 people. Personal profiles of the participants were automatically generated using several Web 2.0 systems and semantic academic data sources, and integrated in real-time with face-to-face contact networks derived from wearable sensors. Integration of all these heterogeneous data layers made it possible to offer various services to conference attendees to enhance their social experience such as visualisation of contact data, and a site to explore and connect with other participants. This paper describes the architecture of the application, the services we provided, and the results we achieved in this deployment.

  3. River Basin Standards Interoperability Pilot

    Science.gov (United States)

    Pesquer, Lluís; Masó, Joan; Stasch, Christoph

    2016-04-01

There is a lot of water information and many tools in Europe to be applied in river basin management, but fragmentation and a lack of coordination between countries still exist. The European Commission and the member states have financed several research and innovation projects in support of the Water Framework Directive. Only a few of them are using the recently emerging hydrological standards, such as OGC WaterML 2.0. WaterInnEU is a Horizon 2020 project focused on creating a marketplace to enhance the exploitation of EU funded ICT models, tools, protocols and policy briefs related to water, and to establish suitable conditions for new market opportunities based on these offerings. One of WaterInnEU's main goals is to assess the level of standardization and interoperability of these outcomes as a mechanism to integrate ICT-based tools, incorporate open data platforms and generate a palette of interchangeable components that are able to use the water data emerging from the recently proposed open data sharing processes and data models stimulated by initiatives such as the INSPIRE directive. As part of the standardization and interoperability activities in the project, the authors are designing an experiment (RIBASE, the present work) to demonstrate how current ICT-based tools and water data can work in combination with geospatial web services in the Scheldt river basin. The main structure of this experiment, which is the core of the present work, is composed of the following steps: - Extraction of information from river gauge data in OGC WaterML 2.0 format using SOS services (preferably compliant to the OGC SOS 2.0 Hydrology Profile Best Practice). - Modelling of floods using WPS 2.0, with WaterML 2.0 data and weather forecast models as input. - Evaluation of the applicability of Sensor Notification Services in water emergencies. - Open distribution of the input and output data as OGC web services (WaterML / WCS / WFS) and with visualization utilities: WMS. The architecture
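The first step of the experiment, extracting gauge readings from a WaterML 2.0 document returned by an SOS service, can be sketched as follows. The XML fragment is heavily trimmed and illustrative; a real WaterML 2.0 response nests time/value pairs in additional elements.

```python
# Parse time-series points from a simplified WaterML 2.0-style fragment.
import xml.etree.ElementTree as ET

WML = """<wml2:MeasurementTimeseries xmlns:wml2="http://www.opengis.net/waterml/2.0">
  <wml2:point><wml2:time>2016-01-01T00:00:00Z</wml2:time><wml2:value>1.2</wml2:value></wml2:point>
  <wml2:point><wml2:time>2016-01-01T01:00:00Z</wml2:time><wml2:value>1.5</wml2:value></wml2:point>
</wml2:MeasurementTimeseries>"""

NS = {"wml2": "http://www.opengis.net/waterml/2.0"}  # namespace map for find/findall
root = ET.fromstring(WML)
series = [(p.find("wml2:time", NS).text, float(p.find("wml2:value", NS).text))
          for p in root.findall("wml2:point", NS)]
print(series)
```

Once parsed into (timestamp, value) pairs, the series can feed a WPS flood model as the experiment's second step envisages.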

  4. Enhancing the Interoperability of Multimedia Learning Objects Based on the Ontology Mapping

    Directory of Open Access Journals (Sweden)

    Jihad Chaker

    2014-09-01

Full Text Available This article addresses the interoperability between semantic learning platforms and educational resource banks, more precisely between the LOM and MPEG-7 standards. LOM is a set of metadata associated with e-learning content, while MPEG-7 is a standard for describing multimedia content. The use of educational resources has become an essential component of meeting learning needs. Given the multimedia nature of these resources, such use causes problems in the interoperability of multimedia learning objects in e-Learning environments and in the indexing and retrieval of digital resources. Faced with these problems, we propose a new approach for multimedia learning objects based on ontology mapping between the LOM and MPEG-7 ontologies.
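The mapping idea can be sketched as an explicit alignment table that re-keys a LOM record into MPEG-7 terms. The field correspondences below are invented examples, not the article's actual alignment, and a real mapping would operate on ontology classes rather than flat keys.

```python
# Illustrative LOM-to-MPEG-7 field alignment (hypothetical correspondences).

LOM_TO_MPEG7 = {
    "general.title": "CreationInformation.Creation.Title",
    "general.language": "Language",
    "technical.format": "MediaFormat.Content",
}

def map_record(lom_record):
    """Re-key a LOM record via the alignment; unmapped fields are kept aside."""
    mapped, unmapped = {}, {}
    for field, value in lom_record.items():
        target = mapped if field in LOM_TO_MPEG7 else unmapped
        target[LOM_TO_MPEG7.get(field, field)] = value
    return mapped, unmapped

mapped, unmapped = map_record({
    "general.title": "Photosynthesis explained",
    "technical.format": "video/mp4",
    "educational.difficulty": "medium",
})
print(mapped["MediaFormat.Content"])   # video/mp4
print(list(unmapped))                  # ['educational.difficulty']
```

Keeping unmapped fields separate, rather than dropping them, is what lets purely pedagogical LOM metadata survive a round trip through a multimedia description.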

  5. Towards Model Driven Tool Interoperability: Bridging Eclipse and Microsoft Modeling Tools

    Science.gov (United States)

    Brunelière, Hugo; Cabot, Jordi; Clasen, Cauê; Jouault, Frédéric; Bézivin, Jean

Successful application of model-driven engineering approaches requires interchanging a lot of relevant data among the tool ecosystem employed by an engineering team (e.g., requirements elicitation tools, several kinds of modeling tools, reverse engineering tools, development platforms and so on). Unfortunately, this is not a trivial task. Poor tool interoperability makes data interchange a challenge even among tools with a similar scope. This paper presents a model-based solution to overcome such interoperability issues. With our approach, the internal schema(s) (i.e., metamodel(s)) of each tool are made explicit and used as the basis for solving syntactic and semantic differences between the tools. Once the corresponding metamodels are aligned, model-to-model transformations are (semi)automatically derived and executed to perform the actual data interchange. We illustrate our approach by bridging the Eclipse and Microsoft (DSL Tools and SQL Server Modeling) modeling tools.

  6. Epimenides: Interoperability Reasoning for Digital Preservation

    NARCIS (Netherlands)

    Kargakis, Yannis; Tzitzikas, Yannis; van Horik, M.P.M.

    2014-01-01

This paper presents Epimenides, a system that implements a novel interoperability dependency reasoning approach for assisting digital preservation activities. A distinctive feature is that it can also model converters and emulators, and the adopted modelling approach enables the automatic reasoning

  7. Requirements for Interoperability in Healthcare Information Systems

    Directory of Open Access Journals (Sweden)

    Rita Noumeir

    2012-01-01

Full Text Available Interoperability is a requirement for the successful deployment of Electronic Health Records (EHR). EHR improves the quality of healthcare by enabling access to all relevant information at the diagnostic decision moment, regardless of location. It is a system that results from the cooperation of several heterogeneous distributed subsystems that need to successfully exchange information relative to a specific healthcare process. This paper analyzes interoperability impediments in healthcare by first defining them and providing concrete healthcare examples, followed by discussion of how specifications can be defined and how verification can be conducted to eliminate those impediments and ensure interoperability in healthcare. This paper also analyzes how Integrating the Healthcare Enterprise (IHE) has been successful in enabling interoperability, and identifies some neglected aspects that need attention.

  8. Interoperability for Entreprise Systems and Applications '12

    CERN Document Server

    Doumeingts, Guy; Katzy, Bernhard; Chalmeta, Ricardo

    2012-01-01

Within a scenario of globalised markets, where the capacity to efficiently cooperate with other firms starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, it can be seen how the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected organisations or extended enterprises, as well as in mergers and acquisitions. Composed of over 40 papers, Enterprise Interoperability V ranges from academic research through case studies to industrial and administrative experience of interoperability. The international nature of the authorship continues to broaden. Many of the papers have examples and illustrations calculated to deepen understanding and generate new ideas. The I-ESA'12 Co...

  9. Interoperability for Enterprise Systems and Applications

    CERN Document Server

    Jardim-Gonçalves, Ricardo; Popplewell, Keith; Mendonça, João

    2016-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VII will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Furthermore, it shows how knowledge of the meaning within information and the use to which it will be put have to be held in common between enterprises for consistent and efficient inter-enterprise networks. Over 30 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability show how, in a scenario of globalised markets, where the capacity to cooperate with other organizations efficiently is essential in order to remain economically, socially and environmentally cost-effective, the most innovative digitized and networked enterprises ensure that their systems and applications are able to interoperate across heterogeneous collabo...

  10. Intercloud Architecture Framework for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.J.

    2013-01-01

    This report presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses interoperability and integration issues in multi-provider multi-domain heterogeneous Cloud based infrastructure services and applications provisioning, including integration and interoperabi

  11. INFRAWEBS Semantic Web Service Development on the Base of Knowledge Management Layer

    OpenAIRE

    Nern, Joachim; Agre, Gennady; Atanasova, Tatiana; Marinova, Zlatina; Micsik, András; Kovács, László; Saarela, Janne; Westkaemper, Timo

    2006-01-01

    The paper gives an overview about the ongoing FP6-IST INFRAWEBS project and describes the main layers and software components embedded in an application oriented realisation framework. An important part of INFRAWEBS is a Semantic Web Unit (SWU) – a collaboration platform and interoperable middleware for ontology-based handling and maintaining of SWS. The framework provides knowledge about a specific domain and relies on ontologies to structure and exchange this knowledge to semant...

  12. GEOSS interoperability for Weather, Ocean and Water

    Science.gov (United States)

    Richardson, David; Nyenhuis, Michael; Zsoter, Ervin; Pappenberger, Florian

    2013-04-01

    "Understanding the Earth system — its weather, climate, oceans, atmosphere, water, land, geodynamics, natural resources, ecosystems, and natural and human-induced hazards — is crucial to enhancing human health, safety and welfare, alleviating human suffering including poverty, protecting the global environment, reducing disaster losses, and achieving sustainable development. Observations of the Earth system constitute critical input for advancing this understanding." With this in mind, the Group on Earth Observations (GEO) started implementing the Global Earth Observation System of Systems (GEOSS). GEOWOW, short for "GEOSS interoperability for Weather, Ocean and Water", is supporting this objective. GEOWOW's main challenge is to improve Earth observation data discovery, accessibility and exploitability, and to evolve GEOSS in terms of interoperability, standardization and functionality. One of the main goals behind the GEOWOW project is to demonstrate the value of the TIGGE archive in interdisciplinary applications, providing a vast amount of useful and easily accessible information to the users through the GEO Common Infrastructure (GCI). GEOWOW aims at developing funcionalities that will allow easy discovery, access and use of TIGGE archive data and of in-situ observations, e.g. from the Global Runoff Data Centre (GRDC), to support applications such as river discharge forecasting.TIGGE (THORPEX Interactive Grand Global Ensemble) is a key component of THORPEX: a World Weather Research Programme to accelerate the improvements in the accuracy of 1-day to 2 week high-impact weather forecasts for the benefit of humanity. The TIGGE archive consists of ensemble weather forecast data from ten global NWP centres, starting from October 2006, which has been made available for scientific research. The TIGGE archive has been used to analyse hydro-meteorological forecasts of flooding in Europe as well as in China. In general the analysis has been favourable in terms of

  13. From BPMN 2.0 to the Setting-Up on an ESB - Application to an Interoperability Problem

    OpenAIRE

    Lemrabet, Y.; Clin, D.; Bigand, M.; Bourey, J. -P.

    2010-01-01

To solve the interoperability problem at the semantic level, we propose to contribute to the orchestration of business processes by implementing mediation based on an Enterprise Service Bus (ESB). We show how to take advantage of the forthcoming version of Business Process Modeling Notation 2.0 (BPMN 2.0) of the Object Management Group (OMG) within the framework of a Service Oriented Architecture (SOA) development. This new version of BPMN is characterized by the addition of the notion of private/publ...

  14. Diabetes Device Interoperability for Improved Diabetes Management

    OpenAIRE

    Silk, Alain D.

    2015-01-01

    Scientific and technological advancements have led to the increasing availability and use of sophisticated devices for diabetes management, with corresponding improvements in public health. These devices are often capable of sharing data with a few other specific devices but are generally not broadly interoperable; they cannot work together with a wide variety of other devices. As a result of limited interoperability, benefits of modern diabetes devices and potential for development of innova...

  15. Interoperability and Standardization of Intercloud Cloud Computing

    OpenAIRE

    Wang, Jingxin K.; Ding, Jianrui; Niu, Tian

    2012-01-01

Cloud computing is maturing, but the interoperability and standardization of clouds remain unsolved. This paper discussed interoperability among clouds in terms of message transmission, data transmission and virtual machine transfer. Starting from the IEEE Pioneering Cloud Computing Initiative, this paper discussed the standardization of cloud computing, especially intercloud cloud computing. This paper also discussed standardization from a market-oriented view.

  16. Towards an Excellence Framework for Business Interoperability

    OpenAIRE

    Legner, Christine; Wende, Kristin

    2006-01-01

Organisations that wish to establish IT-supported business relationships with business partners face major challenges, among them the need for creating a win-win situation and the effort to align business processes and link up information systems across company borders. Whereas interoperability has been widely discussed in a technical context, it has not (yet) been explored how interoperability relates to the business strategy and organisational design of the business relationship. This pap...

  17. Semantic-Driven e-Government: Application of Uschold and King Ontology Building Methodology for Semantic Ontology Models Development

    CERN Document Server

    Fonou-Dombeu, Jean Vincent; 10.5121/ijwest.2011.2401

    2011-01-01

    Electronic government (e-government) has been one of the most active areas of ontology development during the past six years. In e-government, ontologies are being used to describe and specify e-government services (e-services) because they enable easy composition, matching, mapping and merging of various e-government services. More importantly, they also facilitate the semantic integration and interoperability of e-government services. However, it is still unclear in the current literature how an existing ontology building methodology can be applied to develop semantic ontology models in a government service domain. In this paper the Uschold and King ontology building methodology is applied to develop semantic ontology models in a government service domain. Firstly, the Uschold and King methodology is presented, discussed and applied to build a government domain ontology. Secondly, the domain ontology is evaluated for semantic consistency using its semi-formal representation in Description Logic. Thirdly, an...

  18. Semantic Clustering of Search Engine Results.

    Science.gov (United States)

    Soliman, Sara Saad; El-Sayed, Maged F; Hassan, Yasser F

    2015-01-01

This paper presents a novel approach for search engine results clustering that relies on the semantics of the retrieved documents rather than the terms in those documents. The proposed approach takes into consideration both lexical and semantic similarities among documents and applies an activation spreading technique in order to generate semantically meaningful clusters. This approach allows documents that are semantically similar to be clustered together rather than clustering documents based on similar terms. A prototype is implemented and several experiments are conducted to test the proposed solution. The results of the experiments confirmed that the proposed solution achieves remarkable results in terms of precision.

  19. Semantic Clustering of Search Engine Results

    Directory of Open Access Journals (Sweden)

    Sara Saad Soliman

    2015-01-01

Full Text Available This paper presents a novel approach for search engine results clustering that relies on the semantics of the retrieved documents rather than the terms in those documents. The proposed approach takes into consideration both lexical and semantic similarities among documents and applies an activation spreading technique in order to generate semantically meaningful clusters. This approach allows documents that are semantically similar to be clustered together rather than clustering documents based on similar terms. A prototype is implemented and several experiments are conducted to test the proposed solution. The results of the experiments confirmed that the proposed solution achieves remarkable results in terms of precision.
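The central idea in both versions of this abstract, grouping results by shared concepts rather than shared terms, can be sketched as follows. The concept sets are hand-assigned here (a real system would derive them from a lexical resource), and simple Jaccard similarity stands in for the more elaborate activation-spreading step.

```python
# Toy semantic clustering: documents with different terms but overlapping
# concepts end up in the same cluster.

DOCS = {
    "d1": {"feline", "pet", "animal"},
    "d2": {"cat", "pet", "animal"},        # different surface term, same concepts
    "d3": {"stock", "market", "finance"},
}

def jaccard(a, b):
    """Set-overlap similarity in [0, 1]."""
    return len(a & b) / len(a | b)

def cluster(docs, threshold=0.4):
    """Greedy single-link clustering on concept similarity."""
    clusters = []
    for name, concepts in docs.items():
        for c in clusters:
            if any(jaccard(concepts, docs[m]) >= threshold for m in c):
                c.append(name)
                break
        else:
            clusters.append([name])
    return clusters

print(cluster(DOCS))   # [['d1', 'd2'], ['d3']]
```

A purely term-based clusterer would separate d1 and d2, since "feline" and "cat" never co-occur; clustering on concepts is what merges them.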

  20. On MDA - SOA based Intercloud Interoperability framework

    Directory of Open Access Journals (Sweden)

    Tahereh Nodehi

    2013-01-01

Full Text Available Cloud computing has been one of the latest technologies which assure reliable delivery of on-demand computing services over the Internet. Cloud service providers have established geographically distributed data centers and computing resources, which are available online as services. The clouds operated by different service providers working together in collaboration can open up many more spaces for innovative scenarios, with huge amounts of resources provisioned on demand. However, current cloud systems do not support intercloud interoperability. This paper is thus motivated to address intercloud interoperability by analyzing different methodologies that have been applied to resolve various scenarios of interoperability. Model Driven Architecture (MDA) and Service Oriented Architecture (SOA) methods have been used to address interoperability in various scenarios, which also opens up space to address intercloud interoperability by making use of these well-accepted methodologies. The focus of this document is to show that intercloud interoperability can be supported through a model-driven approach and service-oriented systems. Moreover, the current state of the art in intercloud, and the concept and benefits of MDA and SOA, are discussed in the paper. At the same time this paper also proposes a generic architecture for an MDA-SOA based framework, which can be useful for developing applications which will require intercloud interoperability. The paper justifies the usability of the framework by a use-case scenario for dynamic workload migration among heterogeneous clouds.

  1. XML databases and the semantic web

    CERN Document Server

    Thuraisingham, Bhavani

    2002-01-01

Efficient access to data, sharing data, extracting information from data, and making use of the information have become urgent needs for today's corporations. With so much data on the Web, managing it with conventional tools is becoming almost impossible. New tools and techniques are necessary to provide interoperability as well as warehousing between multiple data sources and systems, and to extract information from the databases. XML Databases and the Semantic Web focuses on critical and new Web technologies needed for organizations to carry out transactions on the Web, to understand how to use the Web effectively, and to exchange complex documents on the Web. This reference for database administrators, database designers, and Web designers working in tandem with database technologists covers three emerging technologies of significant impact for electronic business: Extensible Markup Language (XML), semi-structured databases, and the semantic Web. The first two parts of the book explore these emerging techn...

  2. Towards Interoperable Preservation Repositories: TIPR

    Directory of Open Access Journals (Sweden)

    Priscilla Caplan

    2010-07-01

Full Text Available Towards Interoperable Preservation Repositories (TIPR) is a project funded by the Institute of Museum and Library Services to create and test a Repository eXchange Package (RXP). The package will make it possible to transfer complex digital objects between dissimilar preservation repositories.  For reasons of redundancy, succession planning and software migration, repositories must be able to exchange copies of archival information packages with each other. Every different repository application, however, describes and structures its archival packages differently. Therefore each system produces dissemination packages that are rarely understandable or usable as submission packages by other repositories. The RXP is an answer to that mismatch. Other solutions for transferring packages between repositories focus either on transfers between repositories of the same type, such as DSpace-to-DSpace transfers, or on processes that rely on central translation services.  Rather than build translators between many dissimilar repository types, the TIPR project has defined a standards-based package of metadata files that can act as an intermediary information package, the RXP, a lingua franca all repositories can read and write.

  3. Data interoperability software solution for emergency reaction in the Europe Union

    Science.gov (United States)

    Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.

    2015-07-01

    Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision-making slower and more difficult. However, the spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite improvements made in recent years, EMSs still have not solved the problems related to cultural, semantic and linguistic differences, which are the real cause of slower decision-making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved in any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction) providing a flexible, extensible solution to solve the mediation issues. Web services have been adopted as the specific technology to implement this paradigm, as they have the most significant academic and industrial visibility and attraction.
Contributions of this work have been validated through the design and development of a cross-border realistic prototype scenario, actively involving both emergency managers and emergency
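The pivot-ontology mediation described in this record can be reduced to a minimal sketch: every local term maps into a shared concept, and out again into the target vocabulary. The terms and concept IDs below are invented for illustration; EMERGEL itself is a full OWL ontology, not a lookup table.

```python
# Illustrative sketch of pivot-based mediation between emergency vocabularies.
# Hypothetical local terms mapped to shared EMERGEL-style concept IDs.
TO_PIVOT = {
    ("es", "incendio forestal"): "emergel:ForestFire",
    ("de", "Waldbrand"): "emergel:ForestFire",
}
# Pivot concepts rendered into a target language.
FROM_PIVOT = {
    ("emergel:ForestFire", "en"): "forest fire",
}

def translate(term: str, source_lang: str, target_lang: str) -> str:
    """Map a local term to the pivot concept, then to the target language."""
    pivot = TO_PIVOT[(source_lang, term)]
    return FROM_PIVOT[(pivot, target_lang)]

print(translate("Waldbrand", "de", "en"))  # forest fire
```

The point of the pivot is that adding an n-th vocabulary needs one mapping to the shared ontology, not n-1 pairwise translators.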

  4. Jigsaw Semantics

    Directory of Open Access Journals (Sweden)

    Paul J. E. Dekker

    2010-12-01

    Full Text Available In the last decade the enterprise of formal semantics has been under attack from several philosophical and linguistic perspectives, and it has certainly suffered from its own scattered state, which hosts quite a variety of paradigms which may seem to be incompatible. It will not do to try and answer the arguments of the critics, because the arguments are often well-taken. The negative conclusions, however, I believe are not. The only adequate reply seems to be a constructive one, which puts several pieces of formal semantics, in particular dynamic semantics, together again. In this paper I will try and sketch an overview of tasks, techniques, and results, which serves to at least suggest that it is possible to develop a coherent overall picture of undeniably important and structural phenomena in the interpretation of natural language. The idea is that the concept of meanings as truth conditions after all provides an excellent start for an integrated study of the meaning and use of natural language, and that an extended notion of goal directed pragmatics naturally complements this picture. None of the results reported here are really new, but we think it is important to re-collect them. References: Asher, Nicholas & Lascarides, Alex. 1998. ‘Questions in Dialogue’. Linguistics and Philosophy 23: 237–309. http://dx.doi.org/10.1023/A:1005364332007. Borg, Emma. 2007. ‘Minimalism versus contextualism in semantics’. In Gerhard Preyer & Georg Peter (eds.), ‘Context-Sensitivity and Semantic Minimalism’, pp. 339–359. Oxford: Oxford University Press. Cappelen, Herman & Lepore, Ernest. 1997. ‘On an Alleged Connection between Indirect Quotation and Semantic Theory’. Mind and Language 12: pp. 278–296. Cappelen, Herman & Lepore, Ernie. 2005. Insensitive Semantics. Oxford: Blackwell. http://dx.doi.org/10.1002/9780470755792. Dekker, Paul. 2002. ‘Meaning and Use of Indefinite Expressions’. Journal of Logic, Language and Information 11: pp. 141–194

  5. Semantic-Driven e-Government: Application of Uschold and King Ontology Building Methodology for Semantic Ontology Models Development

    OpenAIRE

    Jean Vincent Fonou-Dombeu; Magda Huisman

    2011-01-01

    Electronic government (e-government) has been one of the most active areas of ontology development during the past six years. In e-government, ontologies are being used to describe and specify e-government services (e-services) because they enable easy composition, matching, mapping and merging of various e-government services. More importantly, they also facilitate the semantic integration and interoperability of e-government services. However, it is still unclear in the current literature how an...

  6. Event-Driven Interoperability Framework For Interoperation In E-Learning Information Systems - Monitored Repository

    NARCIS (Netherlands)

    Petrov, Milen

    2006-01-01

    M.Petrov "Event-Driven Interoperability Framework For Interoperation In E-Learning Information Systems - Monitored Repository", IADAT-e2006, 3rd International Conference on Education, Barcelona (Spain), July 12-14, 2006, ISBN: 84-933971-9-9, pp.198 - pp.202

  7. European Interoperability Assets Register and Quality Framework Implementation.

    Science.gov (United States)

    Moreno-Conde, Alberto; Thienpont, Geert; Lamote, Inge; Coorevits, Pascal; Parra, Carlos; Kalra, Dipak

    2016-01-01

    Interoperability assets is the term applied to refer to any resource that can support the design, implementation and successful adoption of eHealth services that can exchange data meaningfully. Some examples may include functional requirements, specifications, standards, clinical models and term lists, guidance on how standards may be used concurrently, implementation guides, educational resources, and other resources. Unfortunately, these are largely accessible in ad hoc ways and result in scattered fragments of a solution space that urgently need to be brought together. At present, it is well known that new initiatives and projects will reinvent assets of which they were unaware, while those assets which were potentially of great value are forgotten, not maintained and eventually fall into disuse. This research has defined a quality in use model and assessed the suitability of this quality framework based on the feedback and opinion of a representative sample of potential end users. This quality framework covers the following domains of asset development and adoption: (i) Development process, (ii) Maturity level, (iii) Trustworthiness, (iv) Support & skills, (v) Sustainability, (vi) Semantic interoperability, (vii) Cost & effort of adoption, (viii) Maintenance. When participants were requested to evaluate the overall quality in use framework, 70% said they would recommend using the register to their colleagues, 70% felt that it could provide relevant benefits for discovering new assets, and 50% responded that it would support their decision making about the recommended asset to adopt or implement in their organisation. Several European projects have expressed interest in using the register, which will now be sustained and promoted by the European Institute for Innovation through Health Data. PMID:27577473

  8. Convergence of Health Level Seven Version 2 Messages to Semantic Web Technologies for Software-Intensive Systems in Telemedicine Trauma Care

    OpenAIRE

    Menezes, Pedro Monteiro; Cook, Timothy Wayne; Cavalini, Luciana Tricai

    2016-01-01

    Objectives To present the technical background and the development of a procedure that enriches the semantics of Health Level Seven version 2 (HL7v2) messages for software-intensive systems in telemedicine trauma care. Methods This study followed a multilevel model-driven approach for the development of semantically interoperable health information systems. The Pre-Hospital Trauma Life Support (PHTLS) ABCDE protocol was adopted as the use case. A prototype application embedded the semantics i...

  9. Enterprise interoperability VI : Interoperability for Agility, Resilience and Plasticity of Collaboration

    CERN Document Server

    Bénaben, Frédérick; Poler, Raúl; Bourrières, Jean-Paul

    2014-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VI will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Over 40 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability, show how, in a scenario of globalised markets, where the capacity to cooperate with other firms efficiently starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected...

  10. Provenance-Based Approaches to Semantic Web Service Discovery and Usage

    Science.gov (United States)

    Narock, Thomas William

    2012-01-01

    The World Wide Web Consortium defines a Web Service as "a software system designed to support interoperable machine-to-machine interaction over a network." Web Services have become increasingly important both within and across organizational boundaries. With the recent advent of the Semantic Web, web services have evolved into semantic…

  11. Code lists for interoperability - Principles and best practices in INSPIRE

    Science.gov (United States)

    Lutz, M.; Portele, C.; Cox, S.; Murray, K.

    2012-04-01

    external vocabulary. In the former case, for each value, an external identifier, one or more labels (possibly in different languages), a definition and other metadata should be specified. In the latter case, the external vocabulary should be characterised, e.g. by specifying the version to be used, the format(s) in which the vocabulary is available, possible constraints (e.g. if only a specific part of the external list is to be used), rules for using values in the encoding of instance data, and the maintenance rules applied to the external vocabulary. This information is crucial for enabling implementation and interoperability in distributed systems (such as SDIs) and should be made available through a code list registry. While thus the information on allowed code list values is usually managed outside the UML application schema, we recommend inclusion of «codeList»-stereotyped classes in the model for semantic clarity. Information on the obligation, extensibility and a reference to the specified values should be provided through tagged values. Acknowledgements: The authors would like to thank the INSPIRE Thematic Working Groups, the Data Specifications Drafting Team and the JRC Contact Points for their contributions to the discussions on code lists in INSPIRE and to this abstract.
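The per-value and per-vocabulary metadata listed in this record might be held in a registry roughly as sketched below. The field names, identifiers and values are invented for illustration and are not the INSPIRE registry schema.

```python
# Hedged sketch of the metadata an INSPIRE-style code list register might hold
# for one value and for one externally governed vocabulary (names invented).
codelist_value = {
    "id": "http://example.org/codelist/SoilType/podzol",   # external identifier
    "labels": {"en": "Podzol", "de": "Podsol"},            # multilingual labels
    "definition": "Soil with a bleached subsurface layer leached of iron.",
}

external_vocabulary = {
    "uri": "http://example.org/vocab/soil-types",
    "version": "2.1",
    "formats": ["SKOS/RDF", "CSV"],
    "constraints": "only the 'mineral soils' branch may be used",
    "maintenance": "values may be added but never deleted",
}

def is_allowed(value_id: str, register: set) -> bool:
    # Validation of instance data reduces to a registry lookup:
    # is this identifier among the governed values?
    return value_id in register

register = {codelist_value["id"]}
print(is_allowed(codelist_value["id"], register))  # True
```

A registry of this shape is what lets distributed systems agree on whether an encoded value is valid without sharing the UML application schema itself.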

  12. A web services choreography scenario for interoperating bioinformatics applications

    Directory of Open Access Journals (Sweden)

    Cheung David W

    2004-03-01

    Full Text Available Abstract Background Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: (1) the platforms on which the applications run are heterogeneous, (2) their web interface is not machine-friendly, (3) they use a non-standard format for data input and output, (4) they do not exploit standards to define application interface and message exchange, and (5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. Results To demonstrate the benefit of using web services over traditional web interfaces, we compare the two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH Keywords that correlates to the input and is grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate the capability of machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard-coded Java application, Collaxa BPEL Server and Taverna Workbench.
The Java program functions as a web services engine and interoperates
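The advantage of document-style XML messages over HTML output, as this record describes, can be illustrated with a toy example: the MeSH hierarchy becomes addressable by structure rather than by screen-scraping. The message shape below is invented, not the actual HAPI format.

```python
# Parse a hypothetical document-style service response: every keyword and its
# category are reachable through the element tree, no HTML scraping needed.
import xml.etree.ElementTree as ET

response = """
<hapiResult>
  <category name="Diseases">
    <keyword score="0.92">Neoplasms</keyword>
    <keyword score="0.81">Carcinoma</keyword>
  </category>
</hapiResult>
"""

root = ET.fromstring(response)
keywords = [
    (cat.get("name"), kw.text, float(kw.get("score")))
    for cat in root.findall("category")
    for kw in cat.findall("keyword")
]
print(keywords)  # [('Diseases', 'Neoplasms', 0.92), ('Diseases', 'Carcinoma', 0.81)]
```

A downstream workflow step can consume `keywords` directly, which is exactly what a BPEL engine or Taverna does with the typed message parts.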

  13. A Novel Approach for Periodic Assessment of Business Process Interoperability

    CERN Document Server

    Badr, Elmir

    2011-01-01

    Business collaboration networks provide collaborative organizations a favorable context for automated business process interoperability. This paper aims to present a novel approach for assessing interoperability of process-driven services by considering the three main aspects of interoperation: potentiality, compatibility and operational performance. It also presents a software tool that supports the proposed assessment method. In addition to its capacity to track and control the evolution of the interoperation degree over time, the proposed tool measures the required effort to reach a planned degree of interoperability. Public accounting of a financial authority is given as an illustrative case study of interoperability monitoring in a public collaboration network.

  14. A Novel Approach for Periodic Assessment of Business Process Interoperability

    Directory of Open Access Journals (Sweden)

    Badr Elmir

    2011-07-01

    Full Text Available Business collaboration networks provide collaborative organizations a favorable context for automated business process interoperability. This paper aims to present a novel approach for assessing interoperability of process-driven services by considering the three main aspects of interoperation: potentiality, compatibility and operational performance. It also presents a software tool that supports the proposed assessment method. In addition to its capacity to track and control the evolution of the interoperation degree over time, the proposed tool measures the required effort to reach a planned degree of interoperability. Public accounting of a financial authority is given as an illustrative case study of interoperability monitoring in a public collaboration network.

  15. A flexible integration framework for a Semantic Geospatial Web application

    Science.gov (United States)

    Yuan, Ying; Mei, Kun; Bian, Fuling

    2008-10-01

    With the growth of World Wide Web technologies, the access to and use of geospatial information has changed radically in the past decade. Previously, the data processed by a GIS, as well as its methods, resided locally and contained information that was sufficiently unambiguous in the respective information community. Now, both data and methods may be retrieved and combined from anywhere in the world, escaping their local contexts. The last few years have seen a growing interest in the field of the semantic geospatial web. With the development of semantic web technologies, we have seen the possibility of solving the heterogeneity/interoperation problem in the GIS community. A semantic geospatial web application can support a wide variety of tasks including data integration, interoperability, knowledge reuse, spatial reasoning and many others. This paper proposes a flexible framework called GeoSWF (short for Geospatial Semantic Web Framework), which supports the semantic integration of distributed and heterogeneous geospatial information resources and also supports semantic query and spatial-relationship reasoning. We design the architecture of GeoSWF by extending the MVC pattern. GeoSWF uses the geo-2007.owl proposed by the W3C as the reference ontology for geospatial information and designs different application ontologies according to the situation of the heterogeneous geospatial information resources. A Geospatial Ontology Creating Algorithm (GOCA) is designed to convert geospatial information into ontology instances represented in RDF/OWL. On top of these ontology instances, GeoSWF carries out semantic reasoning using the rule set stored in the knowledge base to generate new system queries. The query results are ranked by the Euclidean distance of each ontology instance. Finally, the paper gives conclusions and future work.
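The final ranking step this record describes, ordering retrieved ontology instances by Euclidean distance from the query location, can be sketched in a few lines. The instance IDs and coordinates are invented; GeoSWF itself operates over RDF/OWL instances rather than a plain dictionary.

```python
# Minimal sketch of a GeoSWF-style result ranking step: order retrieved
# feature instances by Euclidean distance from the query point.
from math import dist  # Python 3.8+

query_point = (114.30, 30.59)   # illustrative lon/lat of the query
instances = {                   # ontology-instance ID -> coordinates (invented)
    "ex:FeatureA": (114.35, 30.60),
    "ex:FeatureB": (113.90, 30.20),
    "ex:FeatureC": (114.31, 30.58),
}

ranked = sorted(instances, key=lambda i: dist(query_point, instances[i]))
print(ranked)  # nearest instance first
```

Note that treating lon/lat as a flat plane is only acceptable over small extents; a production system would use a geodesic distance.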

  16. SOLE: Applying Semantics and Social Web to Support Technology Enhanced Learning in Software Engineering

    Science.gov (United States)

    Colomo-Palacios, Ricardo; Jiménez-López, Diego; García-Crespo, Ángel; Blanco-Iglesias, Borja

    eLearning educative processes are a challenge for educative institutions and education professionals. In an environment in which learning resources are being produced, catalogued and stored using innovative ways, SOLE provides a platform in which exam questions can be produced supported by Web 2.0 tools, catalogued and labeled via the semantic web, and stored and distributed using eLearning standards. This paper presents SOLE, a social network for exam-question sharing particularized for the Software Engineering domain, based on semantics and built using semantic web and eLearning standards, such as the IMS Question and Test Interoperability specification 2.1.

  17. Open Standards And Open Source: Enabling Interoperability

    Directory of Open Access Journals (Sweden)

    Fernando Almeida

    2011-02-01

    Full Text Available Interoperability is a major requirement for industries and governments in a society that increasingly moves towards global collaboration and integration. Open standards built on the principles of openness, transparency and consensus lay the grounds for innovation, growth and fair competition. Open standards are not synonymous with open source. The former is a set of specifications, the latter is an implementation. However, they share their commitment to openness and defend the equal opportunities of everyone to participate. This paper looks to open source as the best way to enable interoperability between different technologies and applications. The role of open standards in interoperability is analyzed and some of the policies introduced by the European Union for their use and dissemination inside Member States are examined. Additionally, the use of open source software combined with open standards is presented and its major social benefits and economic impacts are highlighted.

  18. Certifying the interoperability of RDF database systems

    OpenAIRE

    Rafes, Karima; Nauroy, Julien; Germain, Cécile

    2015-01-01

    International audience In March 2013, the W3C recommended SPARQL 1.1 to retrieve and manipulate decentralized RDF data. Real-world usage requires advanced features of the SPARQL 1.1 recommendations. As these are not consistently implemented, we propose a test framework named TFT (Tests for Triple stores) to test the interoperability of the SPARQL end-points of RDF database systems. This framework can execute the W3C's SPARQL 1.1 test suite and also its own interoperability tests. To help the...
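The interoperability-testing idea behind a framework like TFT can be reduced to a toy sketch: run the same SPARQL 1.1 test case against each store and compare the returned bindings with the expected result set. The stores below are stubbed as plain lists, not real SPARQL endpoints, and the data is invented.

```python
# Sketch of a conformance check in the spirit of a triple-store test suite:
# a test passes iff the endpoint returns exactly the expected bindings.
EXPECTED = {("alice", 30), ("bob", 25)}  # expected (?name, ?age) bindings

def run_test(endpoint_results) -> bool:
    """Compare an endpoint's result rows against the expected set.
    Order is ignored, as SPARQL SELECT results are unordered by default."""
    return set(endpoint_results) == EXPECTED

store_a = [("bob", 25), ("alice", 30)]   # conformant: same rows, any order
store_b = [("alice", 30)]                # drops a row: non-conformant

print(run_test(store_a), run_test(store_b))  # True False
```

Real suites must additionally canonicalize literals and blank nodes before comparing, which is where most cross-implementation disagreements surface.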

  19. A Semantic Web Blackboard System

    Science.gov (United States)

    McKenzie, Craig; Preece, Alun; Gray, Peter

    In this paper, we propose a Blackboard Architecture as a means for coordinating hybrid reasoning over the Semantic Web. We describe the components of traditional blackboard systems (Knowledge Sources, Blackboard, Controller) and then explain how we have enhanced these by incorporating some of the principles of the Semantic Web to produce our Semantic Web Blackboard. Much of the framework is already in place to facilitate our research: the communication protocol (HTTP); the data representation medium (RDF); a rich expressive description language (OWL); and a method of writing rules (SWRL). We further enhance this by adding our own constraint-based formalism (CIF/SWRL) into the mix. We provide an example walk-through of our test-bed system, the AKTive Workgroup Builder and Blackboard (AWB+B), illustrating the interaction and cooperation of the Knowledge Sources and providing some context as to how the solution is achieved. We conclude with the strengths and weaknesses of the architecture.

  20. Interoperability, Trust Based Information Sharing Protocol and Security: Digital Government Key Issues

    CERN Document Server

    Headayetullah, Md

    2010-01-01

    Improved interoperability between public and private organizations is of key significance to make digital government successful. Digital government interoperability, information sharing protocols and security are considered the key issues for achieving a refined stage of digital government. Flawless interoperability is essential to share information between diverse and widely dispersed organisations in several network environments by using computer-based tools. Digital government must ensure security for its information systems, including computers and networks, in order to provide better service to citizens. Governments around the world are increasingly turning to information sharing and integration to solve problems in programs and policy areas. Problems of global concern such as disease discovery and control, terrorism, immigration and border control, illegal drug trafficking, and more demand information sharing, harmonization and cooperation among government agencies within a country and acros...

  1. A Dynamic Defense Modeling and Simulation Methodology using Semantic Web Services

    Directory of Open Access Journals (Sweden)

    Kangsun Lee

    2010-04-01

    Full Text Available Defense Modeling and Simulations require interoperable and autonomous federates in order to fully simulate complex behavior of war-fighters and to dynamically adapt themselves to various war-game events, commands and controls. In this paper, we propose a semantic web service based methodology to develop war-game simulations. Our methodology encapsulates war-game logic into a set of web services with additional semantic information in WSDL (Web Service Description Language) and OWL (Web Ontology Language). By utilizing dynamic discovery and binding power of semantic web services, we are able to dynamically reconfigure federates according to various simulation events. An ASuW (Anti-Surface Warfare) simulator is constructed to demonstrate the methodology and successfully shows that the level of interoperability and autonomy can be greatly improved.
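The dynamic discovery and binding this record relies on can be caricatured as a registry lookup over semantic annotations: instead of hard-wiring a federate to a service endpoint, the simulator asks for a capability and binds whatever service advertises it. The service names and capability IRIs below are invented.

```python
# Toy sketch of semantic service discovery: select a federate's service at
# runtime by its advertised capability annotation (all identifiers invented).
REGISTRY = [
    {"service": "ex:RadarTrackerV2", "capability": "ex:SurfaceTracking"},
    {"service": "ex:MissileFlyoutModel", "capability": "ex:WeaponFlyout"},
]

def discover(capability: str) -> str:
    """Return the first registered service annotated with the capability."""
    for entry in REGISTRY:
        if entry["capability"] == capability:
            return entry["service"]
    raise LookupError(f"no service advertises {capability}")

print(discover("ex:SurfaceTracking"))  # ex:RadarTrackerV2
```

In the semantic web service setting the match is done by a reasoner over OWL descriptions rather than string equality, which is what allows substituting an upgraded federate without touching its consumers.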

  2. The improvement of the semantic classification tool to a SOA to ensure a better tutoring

    Directory of Open Access Journals (Sweden)

    Saadia Lgarch

    2011-07-01

    Full Text Available To improve collaboration between tutor and learner in discussion forums, we have proposed a semantic classification tool for messages that helps the tutor manage the mass of messages accumulating over time. The tool provides a semantic classification mechanism based on a chosen theme. For a semantically more intelligent classification that focuses more closely on the chosen theme, our tool incorporates a formal OWL ontology. The reuse and interoperability offered by the ontology, however, remain restricted to the tool's knowledge base. To overcome this limitation, an improvement of the SOA architecture already proposed will be presented in this paper. An implementation of our classifier using the composite application concept will also be explained. Adherence to the XML, SOAP, WSDL and BPEL standards in our implementation will guarantee the tool's interoperability with platforms that solicit its classification service, while allowing its reuse with a high degree of granularity.

  3. The MED-SUV Multidisciplinary Interoperability Infrastructure

    Science.gov (United States)

    Mazzetti, Paolo; D'Auria, Luca; Reitano, Danilo; Papeschi, Fabrizio; Roncella, Roberto; Puglisi, Giuseppe; Nativi, Stefano

    2016-04-01

    In accordance with the international Supersite initiative concept, the MED-SUV (MEDiterranean SUpersite Volcanoes) European project (http://med-suv.eu/) aims to enable long-term monitoring experiments in two relevant geologically active regions of Europe prone to natural hazards: Mt. Vesuvio/Campi Flegrei and Mt. Etna. This objective requires the integration of existing components, such as monitoring systems and databases, and novel sensors for the measurement of volcanic parameters. Moreover, MED-SUV is also a direct contribution to the Global Earth Observation System of Systems (GEOSS) as one of the volcano Supersites recognized by the Group on Earth Observation (GEO). To achieve its goal, MED-SUV set up an advanced e-infrastructure allowing the discovery of and access to heterogeneous data for multidisciplinary applications, and the integration with external systems like GEOSS. The MED-SUV overall infrastructure is conceived as a three-layer architecture, with the lower layer (Data level) including the identified relevant data sources, the mid-tier (Supersite level) including components for mediation and harmonization, and the upper tier (Global level) composed of the systems that MED-SUV must serve, such as GEOSS and possibly other global/community systems. The Data level is mostly composed of existing data sources, such as space agencies' satellite data archives, the UNAVCO system, and the INGV-Rome data service. They share data according to different specifications for metadata, data and service interfaces, and cannot be changed. Thus, the only relevant MED-SUV activity at this level was the creation of a MED-SUV local repository based on Web Accessible Folder (WAF) technology, deployed in the INGV site in Catania, and hosting in-situ data and products collected and generated during the project.
The Supersite level is at the core of the MED-SUV architecture, since it must mediate between the disparate data sources in the layer below, and provide a harmonized view to

  4. An Interoperability Platform Enabling Reuse of Electronic Health Records for Signal Verification Studies

    Science.gov (United States)

    Yuksel, Mustafa; Gonul, Suat; Laleci Erturkmen, Gokce Banu; Sinaci, Ali Anil; Invernizzi, Paolo; Facchinetti, Sara; Migliavacca, Andrea; Bergvall, Tomas; Depraetere, Kristof; De Roo, Jos

    2016-01-01

    Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to original EHRs. We have developed an ontological framework where EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural interoperability and Semantic Interoperability are handled through rule-based reasoning on formal representations of different models and terminology systems maintained in the SALUS Semantic Resource Set. SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely, the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region containing about 1 billion records from 16 million patients and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods with the missing background information. PMID:27123451

  5. An Interoperability Platform Enabling Reuse of Electronic Health Records for Signal Verification Studies

    Directory of Open Access Journals (Sweden)

    Mustafa Yuksel

    2016-01-01

    Full Text Available Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to original EHRs. We have developed an ontological framework where EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural interoperability and Semantic Interoperability are handled through rule-based reasoning on formal representations of different models and terminology systems maintained in the SALUS Semantic Resource Set. SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely, the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region containing about 1 billion records from 16 million patients and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods with the missing background information.

  6. WS/PIDS: standard interoperable PIDS in web services environments.

    Science.gov (United States)

    Vasilescu, E; Dorobanţu, M; Govoni, S; Padh, S; Mun, S K

    2008-01-01

    An electronic health record depends on the consistent handling of people's identities within and outside healthcare organizations. Currently, the Person Identification Service (PIDS), a CORBA specification, is the only well-researched standard that meets these needs. In this paper, we introduce WS/PIDS, a PIDS specification for Web Services (WS) that closely matches the original PIDS and improves on it by providing explicit support for medical multimedia attributes. WS/PIDS is currently supported by a test implementation, layered on top of a PIDS back-end, with Java- and .NET-based clients as well as Web clients. WS/PIDS is interoperable among platforms; it preserves PIDS semantics to a large extent, and it is intended to be fully compliant with established and emerging WS standards. The specification is open source and immediately usable in dynamic clinical systems participating in grid environments. WS/PIDS has been tested successfully with a comprehensive set of use cases, and it is being used in a clinical research setting.

  7. Semantic Enterprise Optimizer and Coexistence of Data Models

    Directory of Open Access Journals (Sweden)

    P. A. Sundararajan

    2012-09-01

    Full Text Available The authors propose a semantic ontology–driven enterprise data–model architecture for interoperability, integration, and adaptability for evolution, by autonomic agent-driven intelligent design of logical as well as physical data models in a heterogeneous distributed enterprise through its life cycle. An enterprise-standard ontology for data (in Web Ontology Language [OWL] and Semantic Web Rule Language [SWRL]) is required to enable an automated data platform that adds life-cycle activities to the current Microsoft Enterprise Search and extends Microsoft SQL Server through various engines for unstructured data types, as well as many domain types that are configurable by users through a semantic query optimizer, using Microsoft Office SharePoint Server (MOSS) as a content and metadata repository to tie all these components together.

  8. Gazetteer Brokering through Semantic Mediation

    Science.gov (United States)

    Hobona, G.; Bermudez, L. E.; Brackin, R.

    2013-12-01

    A gazetteer is a geographical directory containing information about places. It provides names, locations and other attributes for places, which may include points of interest (e.g. buildings, oilfields and boreholes) and other features. These features can be published via web services conforming to the Gazetteer Application Profile of the Web Feature Service (WFS) standard of the Open Geospatial Consortium (OGC). Against the backdrop of advances in geophysical surveys, there has been a significant increase in the amount of data referenced to locations. Gazetteer services have played a significant role in facilitating access to such data, including through the provision of specialized queries such as text, spatial and fuzzy search. Recent developments in the OGC have led to advances in gazetteers such as support for multilingualism, diacritics, and querying via advanced spatial constraints (e.g. radial search and nearest-neighbor search). A remaining challenge, however, is that gazetteers produced by different organizations have typically been modeled differently. Inconsistencies between gazetteers produced by different organizations may include naming the same feature in different ways, naming the attributes differently, locating the feature in a different location, and providing fewer or more attributes than the other services. The Gazetteer Application Profile of the WFS is a starting point for addressing such inconsistencies by providing a standardized interface based on rules specified in ISO 19112, the international standard for spatial referencing by geographic identifiers. The profile, however, does not provide rules to deal with semantic inconsistencies. The USGS and NGA commissioned research into the potential for a Single Point of Entry Global Gazetteer (SPEGG). The research was conducted by the Cross Community Interoperability thread of the OGC testbed, referenced OWS-9. The testbed prototyped approaches for brokering gazetteers through use of semantic mediation.

  9. Programming the semantic web

    CERN Document Server

    Segaran, Toby; Taylor, Jamie

    2009-01-01

    With this book, the promise of the Semantic Web -- in which machines can find, share, and combine data on the Web -- is not just a technical possibility, but a practical reality. Programming the Semantic Web demonstrates several ways to implement semantic web applications, using current and emerging standards and technologies. You'll learn how to incorporate existing data sources into semantically aware applications and publish rich semantic data. Each chapter walks you through a single piece of semantic technology and explains how you can use it to solve real problems. Whether you're writing ...
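    In the spirit of the book's theme of finding, sharing, and combining data, here is a dependency-free sketch of merging triples from two sources and querying the combined graph. A real application would use a library such as rdflib; the resources (ex:alice and friends) are made-up examples:

    ```python
    # Minimal illustration of semantic data merging: triples from two sources
    # are combined by set union, then queried. No library needed for the idea.

    source_a = [
        ("ex:alice", "foaf:knows", "ex:bob"),
        ("ex:alice", "foaf:name", '"Alice"'),
    ]
    source_b = [
        ("ex:bob", "foaf:name", '"Bob"'),
        ("ex:bob", "foaf:knows", "ex:carol"),
    ]

    graph = set(source_a) | set(source_b)  # merging graphs is just set union

    def objects(graph, subject, predicate):
        """All objects o such that (subject, predicate, o) is in the graph."""
        return {o for s, p, o in graph if s == subject and p == predicate}

    # Who does Alice know, directly or via one hop?
    direct = objects(graph, "ex:alice", "foaf:knows")
    one_hop = set().union(*(objects(graph, s, "foaf:knows") for s in direct))
    print(sorted(direct | one_hop))  # ['ex:bob', 'ex:carol']
    ```

    The design point: because RDF-style triples are self-describing, data from independent sources can be combined without any schema negotiation.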

  10. Intercloud Architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.J.; Laat, C. de

    2012-01-01

    This paper presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses problems in multi-provider multi-domain heterogeneous cloud based infrastructure services and applications integration and interoperability. The paper refers to existing standards in Cloud Co

  11. GIS interoperability: current activities and military implications

    Science.gov (United States)

    Lam, Sylvia

    1997-07-01

    Geographic information systems (GIS) are gaining importance in military operations because of their capability to spatially and visually integrate various kinds of information. In an era of limited resources, geospatial data must be shared efficiently whenever possible. The military-initiated Global Geospatial Information and Services (GGI&S) Project aims at developing the infrastructure for GIS interoperability for the military. Current activities in standardization and new technology have strong implications on the design and development of GGI&S. To facilitate data interoperability at both the national and international levels, standards and specifications in geospatial data sharing are being studied, developed and promoted. Of particular interest to the military community are the activities related to the NATO DIGEST, ISO TC/211 Geomatics standardization and the industry-led Open Geodata Interoperability Specifications (OGIS). Together with new information technology, standardization provides the infrastructure for interoperable GIS for both civilian and military environments. The first part of this paper describes the major activities in standardization. The second part presents the technologies developed at DREV in support of the GGI&S. These include the Open Geospatial Datastore Interface (OGDI) and the geospatial data warehouse. DREV has been working closely with Defence Geomatics and private industry in the research and development of new technology for the GGI&S project.

  12. Benefit quantification of interoperability in coordinate metrology

    DEFF Research Database (Denmark)

    Savio, E.; Carmignato, S.; De Chiffre, Leonardo

    2014-01-01

    One of the factors contributing to limited reproducibility of coordinate measurements is the use of different inspection software. Time-consuming efforts for translation of part programmes are sometimes needed, and interoperability of inspection equipment has the potential to reduce these inefficiencies...

  13. Smart Grid Interoperability Maturity Model Beta Version

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Drummond, R.; Giroti, Tony; Houseman, Doug; Knight, Mark; Levinson, Alex; Longcore, Wayne; Lowe, Randy; Mater, J.; Oliver, Terry V.; Slack, Phil; Tolk, Andreas; Montgomery, Austin

    2011-12-02

    The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.

  15. Interoperability Outlook in the Big Data Future

    Science.gov (United States)

    Kuo, K. S.; Ramachandran, R.

    2015-12-01

    The establishment of distributed active archive centers (DAACs) as data warehouses and the standardization of file formats by NASA's Earth Observing System Data Information System (EOSDIS) doubtlessly propelled the interoperability of NASA Earth science data to unprecedented heights in the 1990s. Two decades later, however, interoperability still leaves much to be desired. We believe the inadequate interoperability we experience is a result of the current practice in which data are first packaged into files before distribution, and only the metadata of these files are cataloged into databases and become searchable. Data therefore cannot be efficiently filtered. Any extensive study thus requires downloading large volumes of data files to a local system for processing and analysis. The need to download data not only creates duplication and inefficiency but also further impedes interoperability, because the analysis has to be performed locally by individual researchers in individual institutions. Each institution or researcher often has its/his/her own preference in the choice of data management practice as well as programming languages. Analysis results (derived data) so produced are thus subject to the differences of these practices, which later form formidable barriers to interoperability. A number of Big Data technologies are currently being examined and tested to address Big Earth Data issues. These technologies share one common characteristic: exploiting compute and storage affinity to more efficiently analyze large volumes and great varieties of data. Distributed active "archive" centers are likely to evolve into distributed active "analysis" centers, which not only archive data but also provide analysis services right where the data reside. "Analysis" will become the more visible function of these centers. It is thus reasonable to expect interoperability to improve because analysis, in addition to data, becomes more centralized. Within a "distributed active analysis center

  16. Social Semantics for an Effective Enterprise

    Science.gov (United States)

    Berndt, Sarah; Doane, Mike

    2012-01-01

    An evolution of the Semantic Web, the Social Semantic Web (s2w), facilitates knowledge sharing with "useful information based on human contributions, which gets better as more people participate." The s2w reaches beyond the search box to move us from a collection of hyperlinked facts to meaningful, real-time context. When focused through the lens of Enterprise Search, the Social Semantic Web facilitates the fluid transition of meaningful business information from the source to the user. It is the confluence of human thought and computer processing structured with the iterative application of taxonomies, folksonomies, ontologies, and metadata schemas. The importance and nuances of human interaction are often deemphasized when focusing on automatic generation of semantic markup, which results in dissatisfied users and unrealized return on investment. Users consistently qualify the value of information sets through the act of selection, making them the de facto stakeholders of the Social Semantic Web. Employers are the ultimate beneficiaries of s2w utilization with a better informed, more decisive workforce; one not achieved with an IT miracle technology, but by improved human-computer interactions. Johnson Space Center Taxonomist Sarah Berndt and Mike Doane, principal owner of Term Management, LLC, discuss the planning, development, and maintenance stages for components of a semantic system while emphasizing the necessity of a Social Semantic Web for the Enterprise. Identification of risks and variables associated with layering the successful implementation of a semantic system is also modeled.

  17. Enabling Semantic Technology Empowered Smart Spaces

    Directory of Open Access Journals (Sweden)

    Jussi Kiljander

    2012-01-01

    Full Text Available It has been proposed that Semantic Web technologies would be key enablers in achieving context-aware computing in our everyday environments. In our vision of semantic technology empowered smart spaces, the whole interaction model is based on the sharing of semantic data via common blackboards. This approach allows smart space applications to take full advantage of semantic technologies. Because of its novelty, there is, however, a lack of solutions and methods for developing semantic smart space applications according to this vision. In this paper, we present solutions to the most relevant challenges we have faced when developing context-aware computing in smart spaces. In particular the paper describes (1) methods for utilizing semantic technologies with resource-restricted devices, (2) a solution for identifying real-world objects in semantic technology empowered smart spaces, (3) a method for users to modify the behavior of context-aware smart space applications, and (4) an approach for content sharing between autonomous smart space agents. The proposed solutions include ontologies, system models, and guidelines for building smart spaces with the M3 semantic information sharing platform. To validate and demonstrate the approaches in practice, we have implemented various prototype smart space applications and tools.

  18. THE Interoperability Challenge for the Geosciences: Stepping up from Interoperability between Disciplinary Siloes to Creating Transdisciplinary Data Platforms.

    Science.gov (United States)

    Wyborn, L. A.; Evans, B. J. K.; Trenham, C.; Druken, K. A.; Wang, J.

    2015-12-01

    The National Computational Infrastructure (NCI) at the Australian National University (ANU) has collocated over 10 PB of national and international data assets within a HPC facility to create the National Environmental Research Data Interoperability Platform (NERDIP). The data span a wide range of fields, from earth systems and the environment (climate, coasts, oceans, and geophysics) through to astronomy, bioinformatics, and the social sciences. These diverse data collections are collocated on a major data storage node that is linked to a petascale HPC and cloud facility. Users can search across all of the collections and either log in and access the data directly, or access the data via standards-based web services. These collocated petascale data collections are theoretically a massive resource for interdisciplinary science at scales and resolutions never hitherto possible. But once collocated, multiple barriers became apparent that make cross-domain data integration very difficult and often so time-consuming that either less ambitious research goals are attempted or the project is abandoned. Incompatible content is only one half of the problem: other showstoppers are differing access models, licences and issues of ownership of derived products. Brokers can enable interdisciplinary research, but in reality are we just delaying the inevitable? A call to action is required to adopt a transdisciplinary approach from the conception of new multi-disciplinary systems, whereby those across all the scientific domains, the humanities, the social sciences and beyond work together to create a unity of informatics platforms that interoperate horizontally across the multiple discipline boundaries and also operate vertically to enable a diversity of people to access data, from high-end researchers to undergraduates, school students and the general public. Once we master such a transdisciplinary approach to our vast global information assets, we will then achieve

  19. Combining Ontology Development Methodologies and Semantic Web Platforms for E-government Domain Ontology Development

    CERN Document Server

    Dombeu, Jean Vincent Fonou; 10.5121/ijwest.2011.2202

    2011-01-01

    One of the key challenges in electronic government (e-government) is the development of systems that can be easily integrated and interoperated to provide seamless service delivery to citizens. In recent years, Semantic Web technologies based on ontologies have emerged as promising solutions to the above engineering problems. However, current research practicing semantic development in e-government does not focus on the application of available methodologies and platforms for developing government domain ontologies. Furthermore, only a few of these studies provide detailed guidelines for developing semantic ontology models from a government service domain. This research presents a case study combining an ontology building methodology and two state-of-the-art Semantic Web platforms, namely Protege and the Java Jena ontology API, for semantic ontology development in e-government. Firstly, a framework adopted from the Uschold and King ontology building methodology is employed to build a domain ontology describing th...

  20. Processing biological literature with customizable Web services supporting interoperable formats.

    Science.gov (United States)

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. PMID:25006225
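    As a concrete illustration of serving one annotation in multiple interchange shapes, the sketch below emits a single in-memory text-mining annotation both as a BioC-like nested structure and as RDF-style triples. The field and predicate names are illustrative, not the exact BioC or RDF schemas Argo uses:

    ```python
    # One annotation, two interchange shapes. A service configured for
    # multiple output formats can serialize the same internal object
    # differently per request; these schemas are simplified stand-ins.
    from dataclasses import dataclass

    @dataclass
    class Annotation:
        doc_id: str
        start: int
        length: int
        text: str
        concept: str

    def to_bioc_like(a: Annotation) -> dict:
        """Nested document/annotation structure, loosely BioC-shaped."""
        return {"document": a.doc_id,
                "annotation": {"offset": a.start, "length": a.length,
                               "text": a.text,
                               "infon": {"type": a.concept}}}

    def to_triples(a: Annotation) -> list:
        """Flat RDF-style statements about the same annotation."""
        ann = f"{a.doc_id}#ann-{a.start}"
        return [(ann, "denotes", a.concept),
                (ann, "hasText", a.text),
                (ann, "begin", a.start)]

    a = Annotation("PMC1234", 42, 7, "glucose", "Chemical")
    # Both serializations carry the same underlying content.
    assert to_bioc_like(a)["annotation"]["text"] == to_triples(a)[1][2]
    ```

    Keeping a single internal model and pushing format choice to the serialization layer is what lets one pipeline accept and produce several interchange formats.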

  1. Pragmatics for formal semantics

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2011-01-01

    This tech talk describes how to write and how to inter-derive formal semantics for sequential programming languages. The progress reported here is (1) concrete guidelines to write each formal semantics to alleviate their proof obligations, and (2) simple calculational tools to obtain a formal semantics from another.

  2. The semantic structure of gratitude

    Directory of Open Access Journals (Sweden)

    Smirnov, Alexander V.

    2016-06-01

    Full Text Available In the modern social and economic environment of Russia, gratitude might be considered an ambiguous phenomenon. It can have different meanings for a person in different contexts and can manifest itself differently as well (that is, as an expression of sincere feelings or as an element of corruption). In this respect it is topical to investigate the system of meanings and relationships that define the semantic space of gratitude. The goal of the study was the investigation and description of the content and structure of the semantic space of the gratitude phenomenon as well as the determination of male, female, age, and ethnic peculiarities of the expression of gratitude. The objective was achieved by using the semantic differential designed by the authors to investigate attitudes toward gratitude. This investigation was carried out with the participation of 184 respondents (Russians, Tatars, Ukrainians, and Jews living in the Russian Federation, Belarus, Kazakhstan, Tajikistan, Israel, Australia, Canada, and the United Kingdom and identifying themselves as representatives of one of these nationalities). The structural components of gratitude were singled out by means of exploratory factor analysis of the empirical data from the designed semantic differential. Gender, age, and ethnic differences were differentiated by means of Student’s t-test. Gratitude can be represented by material and nonmaterial forms as well as by actions in response to help given. The empirical data allowed us to design the ethnically nonspecified semantic structure of gratitude. During the elaboration of the differential, semantic universals of gratitude, which constitute its psychosemantic content, were distinguished. Peculiarities of attitudes toward gratitude by those in different age and gender groups were revealed. Differences in the degree of manifestation of components of the psychosemantic structure of gratitude related to ethnic characteristics were not discovered.

  3. Network effects, cascades and CCP interoperability

    Science.gov (United States)

    Feng, Xiaobing; Hu, Haibo; Pritsker, Matthew

    2014-03-01

    To control counterparty risk, financial regulations such as the Dodd-Frank Act are increasingly requiring standardized derivatives trades to be cleared by central counterparties (CCPs). It is anticipated that in the near-term future, CCPs across the world will be linked through interoperability agreements that facilitate risk-sharing but also serve as a conduit for transmitting shocks. This paper theoretically studies a network with CCPs that are linked through interoperability arrangements and examines the properties of the network that contribute to cascading failures. The magnitude of the cascading is theoretically related to the strength of network linkages, the size of the network, the logistic mapping coefficient, a stochastic effect and the CCPs' defense lines. Simulations indicate that larger network effects increase systemic risk from cascading failures. The size of the network N raises the threshold value of shock sizes that are required to generate cascades. Hence, the larger the network, the more robust it will be.
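    The cascade mechanism described above can be sketched as a toy threshold model: each CCP fails when losses passed on by failed neighbours exceed its defence line, and stronger linkages spread a given shock further. The ring topology, parameter names and values below are illustrative assumptions, not the paper's actual model:

    ```python
    # Toy cascade on a ring of n interlinked CCPs. A node fails when its
    # accumulated losses exceed a defence threshold; a failing node passes
    # a fraction (`coupling`) of its losses to both ring neighbours.

    def cascade_size(n, coupling, shock, threshold):
        """Number of failed nodes after an initial shock hits CCP 0."""
        losses = [0.0] * n
        failed = [False] * n
        losses[0] = shock
        frontier = [0]
        while frontier:
            nxt = []
            for i in frontier:
                if not failed[i] and losses[i] > threshold:
                    failed[i] = True
                    for j in ((i - 1) % n, (i + 1) % n):  # ring neighbours
                        losses[j] += coupling * losses[i]
                        nxt.append(j)
            frontier = nxt
        return sum(failed)

    # Stronger linkages turn the same shock into a much larger cascade.
    print(cascade_size(20, coupling=0.4, shock=2.0, threshold=1.0))
    print(cascade_size(20, coupling=0.9, shock=2.0, threshold=1.0))
    ```

    Even this crude sketch reproduces the qualitative finding: the cascade dies out quickly under weak coupling but propagates several hops in each direction under strong coupling.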

  4. Semantic networks of English.

    Science.gov (United States)

    Miller, G A; Fellbaum, C

    1991-12-01

    Principles of lexical semantics developed in the course of building an on-line lexical database are discussed. The approach is relational rather than componential. The fundamental semantic relation is synonymy, which is required in order to define the lexicalized concepts that words can be used to express. Other semantic relations between these concepts are then described. No single set of semantic relations or organizational structure is adequate for the entire lexicon: nouns, adjectives, and verbs each have their own semantic relations and their own organization determined by the role they must play in the construction of linguistic messages.
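    The relational approach described above can be illustrated with a miniature, hand-built lexical network in which synonym sets (synsets) are the unit of organization and hypernymy links noun synsets into an is-a hierarchy. The entries are toy data, not actual WordNet content:

    ```python
    # A three-synset toy lexicon: synonymy is membership in the same synset;
    # hypernymy organizes noun synsets into an is-a chain.

    SYNSETS = {
        "car.n.01": {"lemmas": {"car", "auto", "automobile"},
                     "hypernym": "vehicle.n.01"},
        "vehicle.n.01": {"lemmas": {"vehicle"}, "hypernym": "artifact.n.01"},
        "artifact.n.01": {"lemmas": {"artifact"}, "hypernym": None},
    }

    def synonyms(word):
        """Words sharing a synset with `word` (the synonymy relation)."""
        out = set()
        for s in SYNSETS.values():
            if word in s["lemmas"]:
                out |= s["lemmas"] - {word}
        return out

    def hypernym_chain(synset_id):
        """Walk the is-a hierarchy upward from a synset to its root."""
        chain = []
        while synset_id is not None:
            chain.append(synset_id)
            synset_id = SYNSETS[synset_id]["hypernym"]
        return chain

    print(sorted(synonyms("car")))     # ['auto', 'automobile']
    print(hypernym_chain("car.n.01"))  # ['car.n.01', 'vehicle.n.01', 'artifact.n.01']
    ```

    Note how the two relations live at different levels, as the abstract describes: synonymy defines the lexicalized concepts, while hypernymy relates those concepts to each other.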

  5. A PLCS framework for PDM / ERP interoperability

    OpenAIRE

    Paviot, Thomas; Cheutet, Vincent; Lamouri, Samir

    2011-01-01

    International audience. Wide diffusion of methodologies and software relevant to Product Lifecycle Management (PLM) in industrial companies faces the heterogeneity of information technology (IT) systems. In particular, the lack of interoperability between Product Data Management (PDM) systems, which drive virtual product development, and Enterprise Resource Planning (ERP), which manages the real product, prevents a global description of the product development process. We demonstrate that a mediat...

  6. FINANCIAL AND ACCOUNTING INFORMATION SYSTEMS INTEROPERABILITY

    OpenAIRE

    Iuliana Ionescu; Bogdan Ionescu; Florin Mihai; Silviu Cojocaru

    2009-01-01

    The current economic developments have led to substantial changes in terms of how financial and accounting activities are carried out. The business environment can be characterized by a widespread consolidation of companies and their grouping into holding companies. Thus, the focus is on consolidating financial data, and the integration and interoperability of financial and accounting applications, as well as integrated information systems, have a major significance. With the development of ...

  7. Digital Identity Interoperability and eInnovation

    OpenAIRE

    Palfrey, John; Gasser, Urs

    2009-01-01

    This paper, one of three case studies in a transatlantic research project exploring the connection between Information and Communication Technology interoperability and eInnovation, considers the current state and possible evolution of Digital Identity. While consumers would undoubtedly reap convenience benefits from a ubiquitous single sign-on (SSO) technology, the potential for privacy and security issues makes Digital ID a complex issue. The user-centric, federated, and centralized models...

  8. Designing Interoperable Data Products with Community Conventions

    Science.gov (United States)

    Habermann, T.; Jelenak, A.; Lee, H.

    2015-12-01

    The HDF Product Designer (HPD) is a cloud-based client-server collaboration tool that can bring existing netCDF-3/4/CF, HDF4/5, and HDF-EOS2/5 products together to create new interoperable data products that serve the needs of the Earth Science community. The tool is designed to reduce the burden of creating and storing data in standards-compliant, interoperable HDF5 files and lower the technical and programming skill threshold needed to design such products by providing a user interface that combines the netCDF-4/HDF5 interoperable feature set with applicable metadata conventions. Users can collaborate quickly to devise new HDF5 products while at the same time seamlessly incorporating the latest best practices and conventions in their community by importing existing data products. The tool also incorporates some expert system features through CLIPS, allowing custom approaches in the file design, as well as easy transfer of preferred conventions as they are being developed. The current state of the tool and the plans for future development will be presented. Constructive input from any interested parties is always welcome.

  9. Interoperability of satellite-based augmentation systems for aircraft navigation

    Science.gov (United States)

    Dai, Donghai

    The Federal Aviation Administration (FAA) is pioneering a transformation of the national airspace system from its present ground based navigation and landing systems to a satellite based system using the Global Positioning System (GPS). To meet the critical safety-of-life aviation positioning requirements, a Satellite-Based Augmentation System (SBAS), the Wide Area Augmentation System (WAAS), is being implemented to support navigation for all phases of flight, including Category I precision approach. The system is designed to be used as a primary means of navigation, capable of meeting the Required Navigation Performance (RNP), and therefore must satisfy the accuracy, integrity, continuity and availability requirements. In recent years there has been international acceptance of Global Navigation Satellite Systems (GNSS), spurring widespread growth in the independent development of SBASs. Besides the FAA's WAAS, the European Geostationary Navigation Overlay Service System (EGNOS) and the Japan Civil Aviation Bureau's MTSAT-Satellite Augmentation System (MSAS) are also being actively developed. Although all of these SBASs can operate as stand-alone, regional systems, there is increasing interest in linking these SBASs together to reduce costs while improving service coverage. This research investigated the coverage and availability improvements due to cooperative efforts among regional SBAS networks. The primary goal was to identify the optimal interoperation strategies in terms of performance, complexity and practicality. The core algorithms associated with the most promising concepts were developed and demonstrated. Experimental verification of the most promising concepts was conducted using data collected from a joint international test between the National Satellite Test Bed (NSTB) and the EGNOS System Test Bed (ESTB). This research clearly shows that a simple switch between SBASs made by the airborne equipment is the most effective choice for achieving the

  10. Semantics-driven event clustering in Twitter feeds

    OpenAIRE

    De Boom, Cedric; Van Canneyt, Steven; Dhoedt, Bart

    2015-01-01

    Detecting events using social media such as Twitter has many useful applications in real-life situations. Many algorithms which all use different information sources - either textual, temporal, geographic or community features - have been developed to achieve this task. Semantic information is often added at the end of the event detection to classify events into semantic topics. But semantic information can also be used to drive the actual event detection, which is less covered by academic re...

  11. The quest for information retrieval on the semantic web

    OpenAIRE

    Vallet-Weadon, David; Fernández-Sánchez, Miriam; Castells-Azpilicueta, Pablo

    2005-01-01

    Semantic search has been one of the motivations of the Semantic Web since it was envisioned. We propose a model for the exploitation of ontology-based KBs to improve search over large document repositories. The retrieval model is based on an adaptation of the classic vector-space model, including an annotation weighting algorithm, and a ranking algorithm. Semantic search is combined with keyword-based search to achieve tolerance to KB incompleteness. Our proposal has been tested on corpora of...
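    The key design point in the abstract above, combining semantic search with keyword-based search for tolerance to knowledge-base incompleteness, can be sketched as a simple blended ranking. The weighting scheme, scores, and document IDs below are illustrative assumptions, not the paper's actual algorithm:

    ```python
    # Blend a keyword (tf-idf style) score with an ontology-annotation score,
    # so documents missing from the knowledge base still rank on keyword
    # evidence alone. All weights and scores are made-up illustrations.

    def combined_rank(docs, alpha=0.6):
        """Score = alpha * semantic + (1 - alpha) * keyword, sorted descending.

        `docs` is a list of (doc_id, keyword_score, semantic_score) tuples,
        where semantic_score is None when the document has no KB annotation.
        """
        scored = []
        for doc_id, keyword_score, semantic_score in docs:
            s = semantic_score if semantic_score is not None else 0.0
            scored.append((alpha * s + (1 - alpha) * keyword_score, doc_id))
        return [d for _, d in sorted(scored, reverse=True)]

    docs = [("d1", 0.9, None),   # strong keyword match, absent from the KB
            ("d2", 0.3, 0.8),    # weak keywords, strong semantic match
            ("d3", 0.1, 0.1)]
    print(combined_rank(docs))
    ```

    With these illustrative numbers, the semantically annotated d2 ranks first, yet the unannotated d1 still beats d3 on keyword evidence, which is exactly the incompleteness tolerance the model aims for.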

  12. IHE cross-enterprise document sharing for imaging: interoperability testing software

    Directory of Open Access Journals (Sweden)

    Renaud Bérubé

    2010-09-01

    Full Text Available Abstract Background With the deployment of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners and provides test data and test plans. Results In this paper we describe software that is used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross-Enterprise Document Sharing for Imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the elected design solutions. Conclusions EHR is being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, by developers to understand specification ambiguities, or to resolve implementation difficulties.

  13. Developing enterprise collaboration: a methodology to implement and improve interoperability

    Science.gov (United States)

    Daclin, Nicolas; Chen, David; Vallespir, Bruno

    2016-06-01

    The aim of this article is to present a methodology for guiding enterprises in implementing and improving interoperability. The methodology is based on three components: a framework of interoperability, which structures specific interoperability solutions and is composed of abstraction levels, views and approaches dimensions; a method to measure interoperability, both before a partnership (maturity) and during it (operational performance); and a structured approach defining the steps of the methodology, from the expression of an enterprise's needs to the implementation of solutions. The relationships that consistently link these components form the methodology and enable developing interoperability in a step-by-step manner. Each component of the methodology, and the way it operates, is presented. The overall approach is illustrated in a case study on a process between a given company and its dealers. Conclusions and future perspectives are given at the end of the article.

  14. Interoperability of Knowledge Units in Plant Protection: Case Studies

    Directory of Open Access Journals (Sweden)

    M. Beránková

    2010-12-01

    Full Text Available In this work, we provide two case studies on interoperability and transfer of knowledge in the environment of a company dealing with plant protection. We find that the area of plant protection is highly oriented towards working with knowledge. In this context, interoperability of knowledge can play an important role in acquiring knowledge from different environments to solve specific problems in companies dealing with plant protection. Nevertheless, the concept of interoperability is well developed only at the level of data and information. We build on our previous works, where we defined a logical concept for the interoperability of knowledge at the level of knowledge units. The objective of this work is to show how to apply our process model of knowledge interoperability in a particular plant protection company. Two case studies are provided in order to demonstrate the distinction between simple knowledge transfer and knowledge interoperability.

  15. The Semantics of Web Services: An Examination in GIScience Applications

    Directory of Open Access Journals (Sweden)

    Xuan Shi

    2013-09-01

    Full Text Available Web services are a technological solution for software interoperability that supports the seamless integration of diverse applications. In the vision of the web service architecture, web services are described by the Web Service Description Language (WSDL), discovered through Universal Description, Discovery and Integration (UDDI), and communicate via the Simple Object Access Protocol (SOAP). This vision has never been fully accomplished. Although WSDL was criticized for providing only a syntactic, not semantic, definition of web services, prior initiatives in semantic web services did not establish a correct methodology to resolve the problem. This paper examines the distinction and relationship between the syntactic and semantic definitions of web services, which serve different purposes in service computation. Further, this paper proposes that the semantics of a web service are neutral and independent of the service interface definition, data types and platform. Such a conclusion can be regarded as a universal law in software engineering and service computing. Several use cases from GIScience applications are examined in this paper, while the formalization of geospatial services still needs to be constructed by the GIScience community towards a comprehensive ontology of the conceptual definitions and relationships for geospatial computation. Advances in semantic web services research will happen in domain science applications.
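
    The separation the paper argues for, service semantics independent of the interface definition, can be sketched as a registry keyed by semantic identifiers rather than by syntactic signatures. All names below, including the `geo:Buffer` identifier and both bindings, are invented for illustration:

```python
# A registry keyed by *semantic* operation identifiers: two syntactically
# different bindings of the same semantic operation. Names are invented.
SEMANTIC_REGISTRY = {}

def register(semantic_id, func):
    SEMANTIC_REGISTRY.setdefault(semantic_id, []).append(func)

def buffer_v1(geometry, distance):
    # positional-argument interface
    return f"buffer({geometry}, d={distance})"

def buffer_v2(params):
    # dict-style interface with different parameter names
    return f"buffer({params['geom']}, d={params['dist']})"

register("geo:Buffer", buffer_v1)
register("geo:Buffer", buffer_v2)

# A client discovers by semantics first, then adapts to whichever syntax it got.
impl = SEMANTIC_REGISTRY["geo:Buffer"][0]
print(impl("POINT(1 2)", 10))
```

    The design point is that both bindings resolve under one semantic identifier, so a change of interface does not change what the client discovers.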

  16. Common Patterns with End-to-end Interoperability for Data Access

    Science.gov (United States)

    Gallagher, J.; Potter, N.; Jones, M. B.

    2010-12-01

    file transfers. These options affect seamlessness in that they represent tradeoffs between new development (required by the first option) and cumbersome extra user actions (required by the last option). The middle option, adding new functionality to an existing library (netCDF), is very appealing because practice has shown that it can be very effective over a wide range of clients; however, such libraries are very hard to build, because correctly writing a new implementation of an existing API that preserves the original's exact semantics can be a daunting task. In the example discussed here, we developed a new module for Kepler using OPeNDAP's Java API. This provided a way to leverage internal optimizations for data organization in Kepler, which we felt outweighed the additional cost of new development and the need for users to learn how to use a new Kepler module. While common storage formats and open standards play an important role in data access, our work with the Kepler workflow system reinforces the experience that matching the data models of the data server (source) and user client (sink), and choosing the most appropriate integration strategy, are critical to achieving interoperability.

  17. The Semantic eScience Framework

    Science.gov (United States)

    McGuinness, Deborah; Fox, Peter; Hendler, James

    2010-05-01

    The goal of this effort is to design and implement a configurable and extensible Semantic eScience Framework (SESF). Configuration requires research into accommodating different levels of semantic expressivity and user requirements from use cases. Extensibility is being achieved through a modular approach to the semantic encodings (i.e. ontologies) performed in community settings, i.e. an ontology framework into which specific applications, all the way up to communities, can extend the semantics for their needs. We report on how we are accommodating the rapid advances in semantic technologies and tools and on a sustainable software path for future technical advances. In addition to a generalization of the current data science interface, we will present plans for an upper-level interface suitable for use by clearinghouses, educational portals, digital libraries, and other disciplines. SESF builds upon previous work in the Virtual Solar-Terrestrial Observatory (VSTO). The VSTO utilizes leading-edge knowledge representation, query and reasoning techniques to support knowledge-enhanced search, data access, integration, and manipulation. It encodes term meanings and their inter-relationships in ontologies and uses these ontologies and associated inference engines to semantically enable the data services. The Semantically-Enabled Science Data Integration (SESDI) project implemented data integration capabilities among three sub-disciplines (solar radiation, volcanic outgassing and atmospheric structure) using extensions to existing modular ontologies and the VSTO data framework, while adding smart faceted search and semantic data registration tools. The Semantic Provenance Capture in Data Ingest Systems (SPCDIS) project has added explanation provenance capabilities to an observational data ingest pipeline for images of the Sun, providing a set of tools to answer diverse end-user questions such as "Why does this image look bad?". http://tw.rpi.edu/portal/SESF

  18. Telemedicine system interoperability architecture: concept description and architecture overview.

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard Layne, II

    2004-05-01

    In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.

  19. Geospatial Semantics and the Semantic Web

    CERN Document Server

    Ashish, Naveen

    2011-01-01

    The availability of geographic and geospatial information and services, especially on the open Web, has become abundant in the last several years with the proliferation of online maps, geo-coding services, geospatial Web services and geospatially enabled applications. The need for geospatial reasoning has significantly increased in many everyday applications, including personal digital assistants, Web search applications, location-aware mobile services, specialized systems for emergency response, medical triaging, intelligence analysis and more. Geospatial Semantics and the Semantic Web: Foundation

  20. Characterizing semantic web services

    OpenAIRE

    Moyano, Marcelo; Buccella, Agustina; Cechich, Alejandra; Estévez, Elsa Clara

    2004-01-01

    Semantic Web is an extension of the current web in which data contained in the web documents are machine-understandable. On the other hand, Web Services provide a new model of the web in which sites exchange dynamic information on demand. Combination of both introduces a new concept named Semantic Web Services in which semantic information is added to the different activities involved in Web Services, such as discovering, publication, composition, etc. In this paper, we analyze several propos...

  1. 77 FR 19575 - Promoting Interoperability in the 700 MHz Commercial Spectrum; Interoperability of Mobile User...

    Science.gov (United States)

    2012-04-02

    ....'' Unfortunately, no industry-led solution to the lack of interoperability has yet emerged. 5. Therefore, the... adjacent to Channel 51 (692-698 MHz), which has been allocated for TV broadcast operations at power levels... Telecommunications Bureau sought comment on the Petition in 2010. See 75 FR 9210. All future filings concerning...

  2. Applied Semantic Web Technologies

    CERN Document Server

    Sugumaran, Vijayan

    2011-01-01

    The rapid advancement of semantic web technologies, along with the fact that they are at various levels of maturity, has left many practitioners confused about the current state of these technologies. Focusing on the most mature technologies, Applied Semantic Web Technologies integrates theory with case studies to illustrate the history, current state, and future direction of the semantic web. It maintains an emphasis on real-world applications and examines the technical and practical issues related to the use of semantic technologies in intelligent information management. The book starts with

  3. Semantic Role Labeling

    CERN Document Server

    Palmer, Martha; Xue, Nianwen

    2011-01-01

    This book is aimed at providing an overview of several aspects of semantic role labeling. Chapter 1 begins with linguistic background on the definition of semantic roles and the controversies surrounding them. Chapter 2 describes how the theories have led to structured lexicons such as FrameNet, VerbNet and the PropBank Frame Files that in turn provide the basis for large scale semantic annotation of corpora. This data has facilitated the development of automatic semantic role labeling systems based on supervised machine learning techniques. Chapter 3 presents the general principles of applyin

  4. Semantic prosody and judgment.

    Science.gov (United States)

    Hauser, David J; Schwarz, Norbert

    2016-07-01

    Some words tend to co-occur exclusively with a positive or negative context in natural language use, even though such valence patterns are not dictated by definitions or are part of the words' core meaning. These words contain semantic prosody, a subtle valenced meaning derived from co-occurrence in language. As language and thought are heavily intertwined, we hypothesized that semantic prosody can affect evaluative inferences about related ambiguous concepts. Participants inferred that an ambiguous medical outcome was more negative when it was "caused," a verb with negative semantic prosody, than when it was "produced," a synonymous verb with no semantic prosody (Studies 1a, 1b). Participants completed sentence fragments in a manner consistent with semantic prosody (Study 2), and semantic prosody affected various other judgments in line with evaluative inferences (estimates of an event's likelihood in Study 3). Finally, semantic prosody elicited both positive and negative evaluations of outcomes across a large set of semantically prosodic verbs (Study 4). Thus, semantic prosody can exert a strong influence on evaluative judgment. (PsycINFO Database Record) PMID: 27243765
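
    The notion of semantic prosody above, valence inferred from co-occurrence, can be sketched as a simple corpus statistic. The corpus, word lists, and scoring below are toy assumptions, not the study's actual method:

```python
# Toy measure of semantic prosody: the balance of negative vs. positive
# context words co-occurring with a verb across a tiny invented corpus.
POSITIVE = {"improvement", "recovery", "success"}
NEGATIVE = {"outbreak", "damage", "failure"}

corpus = [
    "the storm caused damage",
    "the virus caused an outbreak",
    "the factory produced an improvement",
    "the lab produced a failure",
]

def prosody(verb):
    """+1.0 = fully negative prosody, -1.0 = fully positive, 0.0 = neutral."""
    pos = neg = 0
    for sentence in corpus:
        words = sentence.split()
        if verb in words:
            pos += sum(w in POSITIVE for w in words)
            neg += sum(w in NEGATIVE for w in words)
    total = pos + neg
    return (neg - pos) / total if total else 0.0

print(prosody("caused"), prosody("produced"))
```

    In this toy corpus "caused" scores fully negative while "produced" balances out, mirroring the study's contrast between the two verbs.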

  5. Semantic web for dummies

    CERN Document Server

    Pollock, Jeffrey T

    2009-01-01

    Semantic Web technology is already changing how we interact with data on the Web. By connecting random information on the Internet in new ways, Web 3.0, as it is sometimes called, represents an exciting online evolution. Whether you're a consumer doing research online, a business owner who wants to offer your customers the most useful Web site, or an IT manager eager to understand Semantic Web solutions, Semantic Web For Dummies is the place to start! It will help you: know how the typical Internet user will recognize the effects of the Semantic Web; explore all the benefits the data Web offers t

  6. GENESIG Platform: taking steps towards a geo-semantic environment Plataforma GENESIG: Dando pasos hacia un entorno geosemántico

    OpenAIRE

    Manuel Enrique Puebla Martínez; Adrian Gracia Águila

    2012-01-01

    The addition of semantics to spatial information management is a necessity for better exploitation and use of spatial information, and for overcoming the existing problems of heterogeneity and limited interoperability. This research report presents the current state of the "Geo-semantics" (or "Geospatial Semantics") line of research, reflecting its results and current and future work. It presents the most important platform for developing sovereign Geographic Information Systems: GENESIG; deve...

  7. SMART-fm:Setting Interoperability in SME-based Industrial Environments

    OpenAIRE

    Goncalves, Ricardo; Panetto, Hervé; Nunez, Maria-José; Steiger-Garcao, Adolfo

    2007-01-01

    PLM (Product Lifecycle Management) is a set of capabilities that enable an enterprise to manage its products and services throughout the business lifecycle. A major trend in the present global market is the increasing need for cooperation among enterprises, whereby organizations can increase flexibility and reduce operational costs by focusing on their core competencies. However, enterprise applications need to be interoperable in order to achieve seamless interaction across organizations, leadin...

  8. RFID in libraries a step toward interoperability

    CERN Document Server

    Ayre, Lori Bowen

    2012-01-01

    The approval by the National Information Standards Organization (NISO) of a new standard for RFID in libraries is a big step toward interoperability among libraries and vendors. By following this set of practices and procedures, libraries can ensure that an RFID tag in one library can be used seamlessly by another, assuming both comply, even if they have different suppliers for tags, hardware, and software. In this issue of Library Technology Reports, Lori Bowen Ayre, an experienced implementer of automated materials handling systems, provides background on the evolution of the standard

  9. AliEn - EDG Interoperability in ALICE

    OpenAIRE

    Bagnasco, S.; Barbera, R.; Buncic, P.; Carminati, F.; P. Cerello; Saiz, P.

    2003-01-01

    AliEn (ALICE Environment) is a GRID-like system for large-scale job submission and distributed data management developed and used in the context of ALICE, the CERN LHC heavy-ion experiment. With the aim of exploiting upcoming Grid resources to run AliEn-managed jobs and store the produced data, the problem of AliEn-EDG interoperability was addressed and an interface was designed. One or more EDG (European Data Grid) User Interface machines run the AliEn software suite (Cluster Monitor, Stora...

  10. Open Source Interoperability: It's More than Technology

    Directory of Open Access Journals (Sweden)

    Dominic Sartorio

    2008-01-01

    Full Text Available The Open Solutions Alliance is a consortium of leading commercial open source vendors, integrators and end users dedicated to the growth of open source based solutions in the enterprise. We believe Linux and other infrastructure software, such as Apache, have become mainstream, and packaged solutions represent the next great growth opportunity. However, some unique challenges can temper that opportunity. These challenges include getting the word out about the maturity and enterprise-readiness of those solutions, ensuring interoperability both with each other and with proprietary and legacy solutions, and ensuring healthy collaboration between vendors and their respective customer and developer communities.

  11. Interoperable PKI Data Distribution in Computational Grids

    Energy Technology Data Exchange (ETDEWEB)

    Pala, Massimiliano; Cholia, Shreyas; Rea, Scott A.; Smith, Sean W.

    2008-07-25

    One of the most successful working examples of virtual organizations, computational grids need authentication mechanisms that interoperate across domain boundaries. Public Key Infrastructures (PKIs) provide sufficient flexibility to allow resource managers to securely grant access to their systems in such distributed environments. However, as PKIs grow and services are added to enhance both security and usability, users and applications must struggle to discover available resources, particularly when the Certification Authority (CA) is alien to the relying party. This article presents how to overcome these limitations of the current grid authentication model by integrating the PKI Resource Query Protocol (PRQP) into the Grid Security Infrastructure (GSI).

  12. OTF CCSDS SM and C Interoperability Prototype

    Science.gov (United States)

    Reynolds, Walter F.; Lucord, Steven A.; Stevens, John E.

    2008-01-01

    A presentation is provided to demonstrate the interoperability between two space flight Mission Operation Centers (MOCs) and to emulate telemetry, action, and alert flows between the two centers. One framework uses a COTS C3I system that relies on CORBA to interface with the local OTF data network. The second framework relies on current Houston MCC frameworks and ad hoc clients. Messaging relies on SM and C MAL, Core and Common Service formats, while the transport layer uses AMS. A centralized SM and C Registry uses HTTP/XML for transport and encoding. The project's status and progress are reviewed.

  13. 78 FR 50075 - Statewide Communication Interoperability Plan Template and Annual Progress Report

    Science.gov (United States)

    2013-08-16

    ... SECURITY Statewide Communication Interoperability Plan Template and Annual Progress Report AGENCY: National... Statewide Communication Interoperability Plan (SCIP) Implementation Report was cleared in accordance with... Communications. Title: Statewide Communication Interoperability Plan Template and Annual Progress Report....

  14. Semantic technologies and linked data for the Italian PA: the case of data.cnr.it

    Directory of Open Access Journals (Sweden)

    Aldo Gangemi

    2013-01-01

    Full Text Available Governmental data are being published in many countries, providing an unprecedented opportunity to create innovative services and to increase societal awareness about administration dynamics. In particular, semantic technologies for linked data production and exploitation prove to be ideal for managing identity and interoperability of administrative entities and data. This paper presents the current state of art, and evolution scenarios of these technologies, with reference to several case studies, including two of them from the Italian context: CNR's Semantic Scout, and DigitPA's Linked Open IPA.

  15. OGC Geographic Information Service Deductive Semantic Reasoning Based on Description Vocabularies Reduction

    Directory of Open Access Journals (Sweden)

    MIAO Lizhi

    2015-09-01

    Full Text Available As geographic information interoperability and sharing develop, more and more interoperable OGC (Open Geospatial Consortium) Web services (OWS) are generated and published through the internet. These services can facilitate the integration of different scientific applications by searching, finding, and utilizing the large number of scientific data sets and Web services. However, these services are widely dispersed and hard to find and utilize through effective semantic retrieval, especially given the weak semantic description of geographic information service data. Focusing on semantic retrieval and reasoning over distributed OWS resources, a deductive semantic reasoning method is proposed to describe and search relevant OWS resources. Specifically, ① description words are extracted from OWS metadata files to generate a GISe ontology database and instance database based on geographic ontology, according to basic geographic element categories; ② a description-word reduction model is put forward that applies knowledge reduction, based on rough set theory, to the GISe instance database and generates an optimized instance database; ③ the GISe ontology database and the optimized instance database are used to implement semantic inference and reasoning. Searching for geographic objects is used as an example to demonstrate the efficiency, feasibility and recall ratio of the proposed description-word-based reduction model.
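
    The reduction step described in the record, dropping description words while preserving the ability to tell services apart, can be sketched with a rough-set-style discernibility check. The services, attributes, and values below are invented; this is not the paper's exact model:

```python
# Rough-set-style reduction of service description words: a word is
# redundant if dropping it still lets every pair of services be told apart.
def discerns(services, words):
    """True if the chosen words still distinguish every pair of services."""
    signatures = [tuple(s[w] for w in words) for s in services]
    return len(set(signatures)) == len(signatures)

def reduce_words(services, words):
    """Greedily drop each word whose removal preserves discernibility."""
    kept = list(words)
    for w in list(words):
        trial = [x for x in kept if x != w]
        if trial and discerns(services, trial):
            kept = trial
    return kept

services = [
    {"theme": "hydrology", "type": "WMS", "scale": "regional"},
    {"theme": "hydrology", "type": "WFS", "scale": "regional"},
    {"theme": "landuse",   "type": "WMS", "scale": "national"},
]
print(reduce_words(services, ["theme", "type", "scale"]))
```

    Here "theme" can be dropped because "type" and "scale" together still separate all three services; dropping either remaining word would collapse two of them.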

  16. Semantic Alignment between ICD-11 and SNOMED CT.

    Science.gov (United States)

    Rodrigues, Jean-Marie; Robinson, David; Della Mea, Vincenzo; Campbell, James; Rector, Alan; Schulz, Stefan; Brear, Hazel; Üstün, Bedirhan; Spackman, Kent; Chute, Christopher G; Millar, Jane; Solbrig, Harold; Brand Persson, Kristina

    2015-01-01

    Due to fundamental differences in design and editorial policies, semantic interoperability between the two de facto standard terminologies in the healthcare domain, the International Classification of Diseases (ICD) and SNOMED CT (SCT), requires combining two different approaches: (i) axiom-based, which states logically what is universally true, using an ontology language such as OWL; (ii) rule-based, expressed as queries on the axiom-based knowledge. We present the ICD-SCT harmonization process, including: a) a new architecture for ICD-11, b) a protocol for the semantic alignment of ICD and SCT, and c) preliminary results of the alignment applied to more than half the domain currently covered by the draft ICD-11. PMID: 26262160
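
    The two approaches the record combines can be illustrated in miniature: is-a axioms closed under transitivity, then queried by a rule. The tiny hierarchy below is invented for illustration and is not drawn from ICD-11 or SNOMED CT:

```python
# Axiom-like subclass assertions, closed under transitivity, then queried
# by a rule. The hierarchy is a toy example, not real ICD/SNOMED content.
def closure(subclass_of):
    """Transitive closure of is-a axioms: {class: set of all ancestors}."""
    anc = {c: set(ps) for c, ps in subclass_of.items()}
    changed = True
    while changed:
        changed = False
        for c in anc:
            extra = set().union(*(anc.get(p, set()) for p in anc[c])) - anc[c]
            if extra:
                anc[c] |= extra
                changed = True
    return anc

axioms = {
    "viral pneumonia": {"pneumonia"},
    "pneumonia": {"lung disease"},
    "lung disease": {"disease"},
}
anc = closure(axioms)

# Rule-style query over the axioms: all subclasses of "lung disease".
lung = sorted(c for c, a in anc.items() if "lung disease" in a)
print(lung)
```

    The axioms state only direct parent links; the rule query then reads any consequence that follows from them, which is the division of labour the record describes.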

  17. Semantic-Driven e-Government: Application of Uschold and King Ontology Building Methodology for Semantic Ontology Models Development

    Directory of Open Access Journals (Sweden)

    Jean Vincent Fonou-Dombeu

    2011-11-01

    Full Text Available Electronic government (e-government) has been one of the most active areas of ontology development during the past six years. In e-government, ontologies are being used to describe and specify e-government services (e-services) because they enable easy composition, matching, mapping and merging of various e-government services. More importantly, they also facilitate the semantic integration and interoperability of e-government services. However, it is still unclear in the current literature how an existing ontology building methodology can be applied to develop semantic ontology models in a government service domain. In this paper the Uschold and King ontology building methodology is applied to develop semantic ontology models in a government service domain. Firstly, the Uschold and King methodology is presented, discussed and applied to build a government domain ontology. Secondly, the domain ontology is evaluated for semantic consistency using its semi-formal representation in Description Logic. Thirdly, an alignment of the domain ontology with the Descriptive Ontology for Linguistic and Cognitive Engineering (DOLCE) upper-level ontology is drawn to allow its wider visibility and facilitate its integration with existing metadata standards. Finally, the domain ontology is formally written in the Web Ontology Language (OWL) to enable its automatic processing by computers. The study aims to provide direction for the application of existing ontology building methodologies in the Semantic Web development processes of e-government domain-specific ontology models, which would enable their repeatability in other e-government projects and strengthen the adoption of semantic technologies in e-government. The research would be of interest to e-government system developers as well as the Semantic Web community, as the framework and techniques employed to develop the semantic ontology models might be repeated in other domains of knowledge to build ontologies.

  18. Electronic Healthcare Record and clinical research in cardiovascular radiology. HL7 CDA and CDISC ODM interoperability.

    Science.gov (United States)

    El Fadly, A; Daniel, C; Bousquet, C; Dart, T; Lastic, P-Y; Degoulet, P

    2007-01-01

    Integrating clinical research data entry with patient care data entry is a challenging issue. At the G. Pompidou European Hospital (HEGP), cardiovascular radiology reports are captured twice: first in the Electronic Health Record (EHR) and then in a national clinical research server. The informatics standards differ between the EHR (HL7 CDA) and clinical research (CDISC ODM). The objective of this work is to feed both the EHR and a Clinical Research Data Management System (CDMS) from a single multipurpose form. We adopted and compared two approaches. The first consists of implementing the single "care-research" form within the EHR and aligning the XML structures of the HL7 CDA document and the CDISC ODM message to export relevant data from the EHR to the CDMS. The second consists of displaying a single "care-research" XForms form within the EHR and generating both the HL7 CDA document and the CDISC ODM message to feed the EHR and the CDMS. The solution based on XForms avoids overloading both the EHR and the CDMS with irrelevant information. Beyond syntactic interoperability, a perspective is to address the issue of semantic interoperability between the two domains. PMID: 18693829
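
    The single-form, dual-output idea in the record can be sketched as one data structure serialized into two XML shapes. The element and attribute names below are simplified stand-ins invented for illustration, not the real HL7 CDA or CDISC ODM schemas:

```python
# One "care-research" form serialized to an EHR-style document and a
# CDMS-style message. Element names are toy stand-ins, not real schemas.
import xml.etree.ElementTree as ET

form = {"patient_id": "P001", "finding": "stenosis", "grade": "moderate"}

def to_cda_like(data):
    """Document-style output, loosely echoing a clinical document layout."""
    doc = ET.Element("ClinicalDocument")
    for key, value in data.items():
        ET.SubElement(doc, "observation", code=key).text = value
    return ET.tostring(doc, encoding="unicode")

def to_odm_like(data):
    """Item-data-style output, loosely echoing a research data message."""
    odm = ET.Element("ODM")
    form_el = ET.SubElement(odm, "FormData")
    for key, value in data.items():
        ET.SubElement(form_el, "ItemData", ItemOID=key, Value=value)
    return ET.tostring(odm, encoding="unicode")

print(to_cda_like(form))
print(to_odm_like(form))
```

    Entering the data once and generating both serializations is what removes the double capture the record describes.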

  19. Integrated coastal management, marine spatial data infrastructures, and semantic web services

    OpenAIRE

    Cömert, Çetin; Ulutaş, Deniztan; Akıncı, Halil; Kara, Gülten

    2008-01-01

    The aim of this work was to become acquainted with semantic web services (SWS) and assess their potential for implementing the technical interoperability infrastructure of Spatial Data Infrastructures (SDIs). SDIs are a widely accepted way of enabling collaboration among various parties, allowing the sharing of each other's data and services. Collaboration is indispensable given either the business model or other requirements, such as that of sustainable development. SDIs can be ...

  20. THE SEMANTIC INFORMATION MODEL FOR CLUSTER "BIOLOGICAL ACTIVE SUBSTANCES IN FEEDING AND COSMETICS"

    OpenAIRE

    Stefan Velikov

    2015-01-01

    The article presents a unified data model for the cluster "Biologically active substances in feeding and cosmetics". The basic information components and their relationships are indicated. The information system provides the data in a structured format, thereby realizing the concept of interoperability and allowing the integration of different systems and the storage, processing and re-use of information. The best practices for combining and adapting information resources to support semantic interoperabili...

  1. Toward an E-Government Semantic Platform

    Science.gov (United States)

    Sbodio, Marco Luca; Moulin, Claude; Benamou, Norbert; Barthès, Jean-Paul

    This chapter describes the major aspects of an e-government platform in which semantics underpins more traditional technologies in order to enable new capabilities and to overcome technical and cultural challenges. The design and development of this e-government Semantic Platform has been conducted with the financial support of the European Commission through the Terregov research project: "Impact of e-government on Territorial Government Services" (Terregov 2008). The goal of the platform is to let local governments and government agencies offer online access to their services in an interoperable way, and to allow them to participate in orchestrated processes involving services provided by multiple agencies. Implementing a business process through an electronic procedure is indeed a core goal in any networked organization. However, the field of e-government brings specific constraints to the operations allowed in procedures, especially concerning the flow of private citizens' data: for legal reasons, in most countries such data are allowed to circulate only from agency to agency directly. In order to promote transparency and responsibility in e-government while respecting these constraints on data flows, Terregov supports the creation of centrally controlled orchestrated processes; while cross-agency data flows are centrally managed, data flow directly between agencies.

  2. Towards E-Society Policy Interoperability

    Science.gov (United States)

    Iannella, Renato

    The move towards the Policy-Oriented Web is destined to provide support for policy expression and management in the core web layers. One of the most promising areas that can drive this new technology adoption is e-Society communities. With so much user-generated content being shared by these social networks, there is the real danger that the implicit sharing rules that communities have developed over time will be lost in translation in the new digital communities. This will lead to a corresponding loss in confidence in e-Society sites. The Policy-Oriented Web attempts to turn the implicit into the explicit with a common framework for policy language interoperability and awareness. This paper reports on the policy driving factors from the Social Networks experiences using real-world use cases and scenarios. In particular, the key functions of policy-awareness - for privacy, rights, and identity - will be the driving force that enables the e-Society to appreciate new interoperable policy regimes.

  3. An Interoperable GridWorkflow Management System

    Science.gov (United States)

    Mirto, Maria; Passante, Marco; Epicoco, Italo; Aloisio, Giovanni

    A Workflow Management System (WFMS) is a fundamental component enabling the integration of data, applications and a wide set of project resources. Although a number of scientific WFMSs support this task, many analysis pipelines require large-scale Grid computing infrastructures to cope with their high compute and storage requirements. Such scientific workflows complicate the management of resources, especially when resources are offered by several providers and managed by different Grid middleware, since resource access must be synchronised in advance to allow reliable workflow execution. Different types of Grid middleware, such as gLite, Unicore and Globus, are used around the world and may cause interoperability issues if applications involve two or more of them. In this paper we describe the ProGenGrid Workflow Management System, whose main goal is to provide interoperability among these different Grid middleware when executing workflows. It allows the composition of batch, parameter-sweep and MPI-based jobs. The ProGenGrid engine implements the logic to execute such jobs by using an OGF-compliant standard language, JSDL, which has been extended for this purpose. Currently, we are testing our system on some bioinformatics case studies in the International Laboratory of Bioinformatics (LIBI) Project (www.libi.it).

  4. Food product tracing technology capabilities and interoperability.

    Science.gov (United States)

    Bhatt, Tejas; Zhang, Jianrong Janet

    2013-12-01

    Despite the best efforts of food safety and food defense professionals, contaminated food continues to enter the food supply. It is imperative that contaminated food be removed from the supply chain as quickly as possible to protect public health and stabilize markets. To solve this problem, scores of technology companies purport to have the most effective, economical product tracing system. This study sought to compare and contrast the effectiveness of these systems at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. It also determined if these systems can work together to better secure the food supply (their interoperability). Institute of Food Technologists (IFT) hypothesized that when technology providers are given a full set of supply-chain data, even for a multi-ingredient product, their systems will generally be able to trace a contaminated product forward and backward through the supply chain. However, when provided with only a portion of supply-chain data, even for a product with a straightforward supply chain, it was expected that interoperability of the systems will be lacking and that there will be difficulty collaborating to identify sources and/or recipients of potentially contaminated product. IFT provided supply-chain data for one complex product to 9 product tracing technology providers, and then compared and contrasted their effectiveness at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. A vertically integrated foodservice restaurant agreed to work with IFT to secure data from its supply chain for both a multi-ingredient and a simpler product. Potential multi-ingredient products considered included canned tuna, supreme pizza, and beef tacos. IFT ensured that all supply-chain data collected did not include any proprietary information or information that would otherwise

  5. Semantics-informed cartography: the case of Piemonte Geological Map

    Science.gov (United States)

    Piana, Fabrizio; Lombardo, Vincenzo; Mimmo, Dario; Giardino, Marco; Fubelli, Giandomenico

    2016-04-01

    In modern digital geological maps, namely those supported by a large geo-database and devoted to dynamic, interactive representation on WMS-WebGIS services, there is the need to provide, in an explicit form, the geological assumptions used for the design and compilation of the database of the Map, and to get a definition and/or adoption of semantic representations and taxonomies, in order to achieve a formal and interoperable representation of the geologic knowledge. These approaches are fundamental for the integration and harmonisation of geological information and services across cultural (e.g. different scientific disciplines) and/or physical barriers (e.g. administrative boundaries). Initiatives such as the GeoScience Markup Language (latest version GeoSciML 4.0, 2015, http://www.geosciml.org) and the INSPIRE "Data Specification on Geology" http://inspire.jrc.ec.europa.eu/documents/Data_Specifications/INSPIRE_DataSpecification_GE_v3.0rc3.pdf (an operative simplification of GeoSciML, latest version 3.0 rc3, 2013), as well as the recent terminological shepherding of the Geoscience Terminology Working Group (GTWG), have been promoting information exchange of geologic knowledge. Grounded on these standard vocabularies, schemas and data models, we provide a shared semantic classification of geological data referring to the study case of the synthetic digital geological map of the Piemonte region (NW Italy), named "GEOPiemonteMap", developed by the CNR Institute of Geosciences and Earth Resources, Torino (CNR IGG TO) and hosted as a dynamic interactive map on the geoportal of the ARPA Piemonte Environmental Agency. The Piemonte Geological Map is grounded on a regional-scale geo-database consisting of some hundreds of GeologicUnits, whose thousands of instances (Mapped Features, polygon geometry) widely occur in the Piemonte region, each one bounded by GeologicStructures (Mapped Features, line geometry). GeologicUnits and GeologicStructures have been spatially

  6. Defining inter-cloud architecture for interoperability and integration

    NARCIS (Netherlands)

    Y. Demchenko; C. Ngo; M.X. Makkes; R. Strijkers; C. de Laat

    2012-01-01

    This paper presents on-going research to develop the Inter-Cloud Architecture, which addresses the architectural problems in multi-provider multi-domain heterogeneous cloud-based application integration and interoperability, including integration and interoperability with legacy infrastructure services

  7. Defining Inter-Cloud Architecture for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.J.; Laat, C. de

    2012-01-01

    This paper presents on-going research to develop the Inter-Cloud Architecture that should address problems in multi-provider multi-domain heterogeneous Cloud based applications integration and interoperability, including integration and interoperability with legacy infrastructure services. Cloud tec

  8. Interoperability of Demand Response Resources Demonstration in NY

    Energy Technology Data Exchange (ETDEWEB)

    Wellington, Andre

    2014-03-31

    The Interoperability of Demand Response Resources Demonstration in NY (Interoperability Project) was awarded to Con Edison in 2009. The objective of the project was to develop and demonstrate methodologies to enhance the ability of customer sited Demand Response resources to integrate more effectively with electric delivery companies and regional transmission organizations.

  9. A formal method to real-time protocol interoperability testing

    Institute of Scientific and Technical Information of China (English)

    WANG ZhiLiang; YIN Xia; JING ChuanMing

    2008-01-01

    Interoperability testing is an important technique to ensure the quality of implementations of network communication protocols. In the next-generation Internet protocol, real-time applications should be supported effectively. However, time constraints were not considered in previous studies of protocol interoperability testing, so existing interoperability testing methods are difficult to apply to real-time protocols. In this paper, a formal method for real-time protocol interoperability testing is proposed. First, a formal model CMpTIOA (communicating multi-port timed input output automata) is defined to specify the system under test (SUT) in real-time protocol interoperability testing; based on this model, a timed interoperability relation is then defined. In order to check this relation, a test generation method is presented to generate a parameterized test behavior tree from the SUT model; a mechanism of executability pre-determination is also integrated in the test generation method to alleviate the state space explosion problem to some extent. The proposed theory and method are then applied to interoperability testing of the IPv6 neighbor discovery protocol to show the feasibility of this method.
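
The timed-conformance idea can be illustrated by checking a trace of timestamped actions against transitions guarded by clock windows. This is a drastic simplification, not the paper's CMpTIOA formalism; the state names and neighbor-discovery-like actions are hypothetical.

```python
# Sketch: check a timed trace against an automaton with clock guards.
# Each transition names an action and a (lo, hi) window on the time
# elapsed since the previous transition.
def conforms(transitions, trace):
    """transitions: state -> {action: (next_state, lo, hi)}; trace: [(t, action)]."""
    state, last_t = "init", 0.0
    for t, action in trace:
        table = transitions.get(state, {})
        if action not in table:
            return False              # unexpected action in this state
        next_state, lo, hi = table[action]
        if not (lo <= t - last_t <= hi):
            return False              # time constraint violated
        state, last_t = next_state, t
    return True

# Hypothetical exchange: a solicit must be answered within 1 second.
spec = {
    "init":    {"solicit":   ("waiting", 0.0, float("inf"))},
    "waiting": {"advertise": ("done",    0.0, 1.0)},
}
print(conforms(spec, [(0.0, "solicit"), (0.5, "advertise")]))  # True
print(conforms(spec, [(0.0, "solicit"), (2.0, "advertise")]))  # False
```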

  10. Interactive test tool for interoperable C-ITS development

    NARCIS (Netherlands)

    Voronov, A.; Englund, C.; Bengtsson, H.H.; Chen, L.; Ploeg, J.; Jongh, J.F.C.M. de; Sluis, H.J.D. van de

    2015-01-01

    This paper presents the architecture of an Interactive Test Tool (ITT) for interoperability testing of Cooperative Intelligent Transport Systems (C-ITS). Cooperative systems are developed by different manufacturers at different locations, which makes interoperability testing a tedious task. Up until

  11. Communication: General Semantics Perspectives.

    Science.gov (United States)

    Thayer, Lee, Ed.

    This book contains the edited papers from the eleventh International Conference on General Semantics, titled "A Search for Relevance." The conference questioned, as a central theme, the relevance of general semantics in a world of wars and human misery. Reacting to a fundamental Korzybskian principle that man's view of reality is distorted by…

  12. The Semantic Learning Organization

    Science.gov (United States)

    Sicilia, Miguel-Angel; Lytras, Miltiadis D.

    2005-01-01

    Purpose: The aim of this paper is to introduce the concept of a "semantic learning organization" (SLO) as an extension of the concept of "learning organization" in the technological domain. Design/methodology/approach: The paper takes existing definitions and conceptualizations of both learning organizations and Semantic Web technology to develop…

  13. Semantics of Statebuilding

    DEFF Research Database (Denmark)

    Grasten, Maj Lervad

    2016-01-01

    Book review of: Semantics of Statebuilding: Language, Meanings & Sovereignty / (eds) Nicolas Lemay-Hébert, Nicholas Onuf, Vojin Rakić, Petar Bojanić. Abingdon: Routledge, 2014. 200 pp.

  14. PERSPECTIVES ON INTEROPERABILITY INTEGRATION WITHIN NATO DEFENSE PLANNING PROCESS

    Directory of Open Access Journals (Sweden)

    Florian CIOCAN

    2011-01-01

    Full Text Available Interoperability is not a new area of effort at NATO level. In fact, interoperability, and more specifically standardization, has been a key element of the Alliance's approach to fielding forces for decades. But as the security and operational environment has been in continuous change, the need to face new threats and the current involvement in challenging operations in Afghanistan and elsewhere, alongside the necessity to interoperate at lower and lower levels of command with an increasing number of nations, including non-NATO ISAF partners, NGOs, and other organizations, have made the task even more challenging. In this respect, Interoperability Integration within the NATO Defense Planning Process will facilitate the timely identification, development and delivery of required forces and capabilities that are interoperable and adequately prepared, equipped, trained and supported to undertake the Alliance's full spectrum of missions.

  15. Interactive Digital Terrestrial Television: The Interoperability Challenge in Brazil

    Directory of Open Access Journals (Sweden)

    Jordi Amatller Clarasó

    2009-01-01

    Full Text Available This paper introduces different standards implemented in existing Digital Terrestrial Television Broadcasting systems to allow the fruition of interactive services and applications through digital Set Top Boxes. It focuses on the interoperability issue between the Brazilian and the European architectures. Although in Brazil the GEM specification was designed to foster wide content compatibility across a range of interactive platforms, it has never reached a final implementation and deployment. As a result, the interoperability issue has been explored in depth in the BEACON project, and an innovative system architecture has been developed to deploy t-learning services across Europe and Brazil, providing integration of systems that until now were not able to interoperate. This work is an important step in the direction of standards' interoperability. As a result, the MHP and Ginga NCL-Lua implementations appeared to be the best choice to deliver interactive services in an interoperable mode between European and Brazilian digital television.

  16. S3QL: A distributed domain specific language for controlled semantic integration of life sciences data

    Directory of Open Access Journals (Sweden)

    de Lencastre Hermínia

    2011-07-01

    Full Text Available Abstract Background The value and usefulness of data increases when it is explicitly interlinked with related data. This is the core principle of Linked Data. For life sciences researchers, harnessing the power of Linked Data to improve biological discovery is still challenged by a need to keep pace with rapidly evolving domains and requirements for collaboration and control as well as with the reference semantic web ontologies and standards. Knowledge organization systems (KOSs can provide an abstraction for publishing biological discoveries as Linked Data without complicating transactions with contextual minutia such as provenance and access control. We have previously described the Simple Sloppy Semantic Database (S3DB as an efficient model for creating knowledge organization systems using Linked Data best practices with explicit distinction between domain and instantiation and support for a permission control mechanism that automatically migrates between the two. In this report we present a domain specific language, the S3DB query language (S3QL, to operate on its underlying core model and facilitate management of Linked Data. Results Reflecting the data driven nature of our approach, S3QL has been implemented as an application programming interface for S3DB systems hosting biomedical data, and its syntax was subsequently generalized beyond the S3DB core model. This achievement is illustrated with the assembly of an S3QL query to manage entities from the Simple Knowledge Organization System. The illustrative use cases include gastrointestinal clinical trials, genomic characterization of cancer by The Cancer Genome Atlas (TCGA and molecular epidemiology of infectious diseases. Conclusions S3QL was found to provide a convenient mechanism to represent context for interoperation between public and private datasets hosted at biomedical research institutions and linked data formalisms.
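
The Linked Data principle underlying S3DB can be illustrated with a toy triple store and a single-pattern query. This is not S3QL syntax, and the prefixed identifiers below are invented for illustration only.

```python
# Sketch: Linked Data as (subject, predicate, object) triples, queried with
# a single triple pattern where variables start with "?". Invented URIs.
triples = {
    ("trial:T1", "rdf:type",   "s3db:Collection"),
    ("trial:T1", "dc:subject", "MRSA"),
    ("tcga:S7",  "rdf:type",   "s3db:Collection"),
    ("tcga:S7",  "dc:subject", "glioblastoma"),
}

def query(pattern, store):
    """Return bindings for one pattern, e.g. ("?c", "dc:subject", "MRSA")."""
    out = []
    for s, p, o in store:
        binding = {}
        for want, got in zip(pattern, (s, p, o)):
            if want.startswith("?"):
                binding[want] = got       # bind variable to this value
            elif want != got:
                break                     # constant mismatch: no match
        else:
            out.append(binding)
    return out

print(query(("?c", "dc:subject", "MRSA"), triples))  # [{'?c': 'trial:T1'}]
```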

  17. A semantic security framework for systems of systems

    NARCIS (Netherlands)

    Trivellato, Daniel; Zannone, Nicola; Glaundrup, Maurice; Skowronek, Jacek; Etalle, Sandro

    2013-01-01

    Systems of systems (SoS) are dynamic coalitions of distributed, autonomous and heterogeneous systems that collaborate to achieve a common goal. While offering several advantages in terms of scalability and flexibility, the SoS paradigm has a strong impact on systems interoperability and on the secur

  18. Intelligent Discovery for Learning Objects Using Semantic Web Technologies

    Science.gov (United States)

    Hsu, I-Ching

    2012-01-01

    The concept of learning objects has been applied in the e-learning field to promote the accessibility, reusability, and interoperability of learning content. Learning Object Metadata (LOM) was developed to achieve these goals by describing learning objects in order to provide meaningful metadata. Unfortunately, the conventional LOM lacks the…

  19. Utilizing Statistical Semantic Similarity Techniques for Ontology Mapping——with Applications to AEC Standard Models

    Institute of Scientific and Technical Information of China (English)

    Pan Jiayi; Chin-Pang Jack Cheng; Gloria T. Lau; Kincho H. Law

    2008-01-01

    The objective of this paper is to introduce three semi-automated approaches for ontology mapping using relatedness analysis techniques. In the architecture, engineering, and construction (AEC) industry, there exist a number of ontological standards to describe the semantics of building models. Although the standards share similar scopes of interest, the task of comparing and mapping concepts among standards is challenging due to their differences in terminologies and perspectives. Ontology mapping is therefore necessary to achieve information interoperability, which allows two or more information sources to exchange data and to re-use the data for further purposes. The attribute-based approach, corpus-based approach, and name-based approach presented in this paper adopt the statistical relatedness analysis techniques to discover related concepts from heterogeneous ontologies. A pilot study is conducted on IFC and CIS/2 ontologies to evaluate the approaches. Preliminary results show that the attribute-based approach outperforms the other two approaches in terms of precision and F-measure.
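
A minimal sketch of the name-based idea follows, with difflib's string similarity standing in for the paper's statistical relatedness measures; the IFC/CIS2 term lists and the threshold are invented for illustration.

```python
# Name-based relatedness sketch: score concept pairs from two ontologies
# by string similarity and keep the best match above a threshold.
from difflib import SequenceMatcher

def best_matches(source_terms, target_terms, threshold=0.6):
    matches = {}
    for s in source_terms:
        scored = [(SequenceMatcher(None, s.lower(), t.lower()).ratio(), t)
                  for t in target_terms]
        score, t = max(scored)            # highest-similarity candidate
        if score >= threshold:
            matches[s] = t
    return matches

ifc_terms  = ["IfcBeam", "IfcColumn", "IfcSlab"]   # illustrative names
cis2_terms = ["beam", "column", "plate"]
print(best_matches(ifc_terms, cis2_terms))
```

IfcSlab finds no candidate above the threshold, mimicking how terminology differences leave some concepts unmapped for a human to review.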

  20. Test protocols for advanced inverter interoperability functions :

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean; Gonzalez, Sigifredo; Ralph, Mark E.; Ellis, Abraham; Broderick, Robert Joseph

    2013-11-01

    Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of significantly influencing the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by the US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not now required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already apparent as

  1. Secure Interoperable Open Smart Grid Demonstration Project

    Energy Technology Data Exchange (ETDEWEB)

    Magee, Thoman

    2014-12-31

    The Consolidated Edison, Inc., of New York (Con Edison) Secure Interoperable Open Smart Grid Demonstration Project (SGDP), sponsored by the United States (US) Department of Energy (DOE), demonstrated that the reliability, efficiency, and flexibility of the grid can be improved through a combination of enhanced monitoring and control capabilities using systems and resources that interoperate within a secure services framework. The project demonstrated the capability to shift, balance, and reduce load where and when needed in response to system contingencies or emergencies by leveraging controllable field assets. The range of field assets includes curtailable customer loads, distributed generation (DG), battery storage, electric vehicle (EV) charging stations, building management systems (BMS), home area networks (HANs), high-voltage monitoring, and advanced metering infrastructure (AMI). The SGDP enables the seamless integration and control of these field assets through a common, cyber-secure, interoperable control platform, which integrates a number of existing legacy control and data systems, as well as new smart grid (SG) systems and applications. By integrating advanced technologies for monitoring and control, the SGDP helps target and reduce peak load growth, improves the reliability and efficiency of Con Edison’s grid, and increases the ability to accommodate the growing use of distributed resources. Con Edison is dedicated to lowering costs, improving reliability and customer service, and reducing its impact on the environment for its customers. These objectives also align with the policy objectives of New York State as a whole. To help meet these objectives, Con Edison’s long-term vision for the distribution grid relies on the successful integration and control of a growing penetration of distributed resources, including demand response (DR) resources, battery storage units, and DG. For example, Con Edison is expecting significant long-term growth of DG

  2. Advancing translational research with the Semantic Web

    Directory of Open Access Journals (Sweden)

    Marshall M Scott

    2007-05-01

    Full Text Available Abstract Background A fundamental goal of the U.S. National Institutes of Health (NIH "Roadmap" is to strengthen Translational Research, defined as the movement of discoveries in basic research to application at the clinical level. A significant barrier to translational research is the lack of uniformly structured data across related biomedical domains. The Semantic Web is an extension of the current Web that enables navigation and meaningful use of digital resources by automatic processes. It is based on common formats that support aggregation and integration of data drawn from diverse sources. A variety of technologies have been built on this foundation that, together, support identifying, representing, and reasoning across a wide range of biomedical data. The Semantic Web Health Care and Life Sciences Interest Group (HCLSIG, set up within the framework of the World Wide Web Consortium, was launched to explore the application of these technologies in a variety of areas. Subgroups focus on making biomedical data available in RDF, working with biomedical ontologies, prototyping clinical decision support systems, working on drug safety and efficacy communication, and supporting disease researchers navigating and annotating the large amount of potentially relevant literature. Results We present a scenario that shows the value of the information environment the Semantic Web can support for aiding neuroscience researchers. We then report on several projects by members of the HCLSIG, in the process illustrating the range of Semantic Web technologies that have applications in areas of biomedicine. Conclusion Semantic Web technologies present both promise and challenges. Current tools and standards are already adequate to implement components of the bench-to-bedside vision. On the other hand, these technologies are young. Gaps in standards and implementations still exist and adoption is limited by typical problems with early technology, such as the need

  3. A health analytics semantic ETL service for obesity surveillance.

    Science.gov (United States)

    Poulymenopoulou, M; Papakonstantinou, D; Malamateniou, F; Vassilacopoulos, G

    2015-01-01

    The increasingly large amount of data produced in healthcare (e.g. collected through health information systems such as electronic medical records - EMRs, or through novel data sources such as personal health records - PHRs, social media, and web resources) enables the creation of detailed records about people's health, sentiments and activities (e.g. physical activity, diet, sleep quality) that can be used in the public health area, among others. However, despite the transformative potential of big data in public health surveillance, there are several challenges in integrating big data. In this paper, the interoperability challenge is tackled and a semantic Extract Transform Load (ETL) service is proposed that seeks to semantically annotate big data to turn it into valuable data for analysis. This service is considered as part of a health analytics engine on the cloud that interacts with existing healthcare information exchange networks, like the Integrating the Healthcare Enterprise (IHE), PHRs, sensors, mobile applications, and other web resources to retrieve patient health, behavioral and daily activity data. The semantic ETL service aims at semantically integrating big data for use by analytic mechanisms. An illustrative implementation of the service on big data potentially relevant to human obesity enables using appropriate analytic techniques (e.g. machine learning, text mining) that are expected to assist in identifying patterns and contributing factors (e.g. genetic background, social, environmental) for this social phenomenon and, hence, drive health policy changes and promote healthy behaviors where residents live, work, learn, shop and play. PMID:25991273
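
The Transform step of such a semantic ETL service can be sketched as a mapping from source fields to shared ontology terms with provenance attached. The term URIs and field names below are illustrative assumptions, not the paper's actual vocabulary.

```python
# Sketch of semantic annotation during ETL: raw records from heterogeneous
# sources (PHR, sensor, EMR) are mapped to shared ontology terms. All term
# URIs and field names are invented for illustration.
ONTOLOGY = {
    "steps":       "activity:WalkingSteps",
    "bmi":         "clinical:BodyMassIndex",
    "sleep_hours": "activity:SleepDuration",
}

def transform(record, source):
    """Annotate each known field with an ontology term and its provenance."""
    out = []
    for field, value in record.items():
        if field in ONTOLOGY:             # unknown fields are dropped here
            out.append({"term": ONTOLOGY[field], "value": value, "source": source})
    return out

print(transform({"steps": 8200, "bmi": 27.4, "name": "x"}, source="phr"))
```

Annotated records from different sources then share one vocabulary, which is what lets downstream analytics query them uniformly.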


  5. Investigating the capabilities of semantic enrichment of 3D CityEngine data

    Science.gov (United States)

    Solou, Dimitra; Dimopoulou, Efi

    2016-08-01

    In recent years the development of technology and the lifting of several technical limitations have brought the third dimension to the fore. The complexity of urban environments and the strong need for land administration intensify the need for a three-dimensional cadastral system. Despite the progress in the field of geographic information systems and 3D modeling techniques, there is no fully digital 3D cadastre. Existing geographic information systems and the different methods of three-dimensional modeling allow for better management, visualization and dissemination of information. Nevertheless, these opportunities cannot be fully exploited because of deficiencies in standardization and interoperability in these systems. Within this context, CityGML was developed as an international standard of the Open Geospatial Consortium (OGC) for 3D city models' representation and exchange. CityGML defines geometry and topology for city modeling, also focusing on semantic aspects of 3D city information. The scope of CityGML is to reach common terminology, also addressing the imperative need for interoperability and data integration, taking into account the number of available geographic information systems and modeling techniques. The aim of this paper is to develop an application for managing semantic information of a model generated by procedural modeling. The model was initially implemented in ESRI's CityEngine software, and then imported into the ArcGIS environment. The final goal was the original model's semantic enrichment and its conversion to CityGML format. Semantic information management and interoperability proved feasible using the ESRI 3DCities Project tools, whose database structure supports adding semantic information to the CityEngine model and automatically converting it to CityGML for advanced analysis and visualization in different application areas.

  6. Semdrops: A Social Semantic Tagging Approach for Emerging Semantic Data

    OpenAIRE

    Torres, Diego; Diaz, Alicia; Skaf-Molli, Hala; Molli, Pascal

    2011-01-01

    Abstract -- This paper proposes a collective intelligence strategy for emerging semantic data. It presents a combination of social web practices with semantic web technologies to enrich existing web resources with semantic data. The paper introduces a social semantic tagging approach called Semdrops. Semdrops defines a conceptual model which is an extension of Gruber's tag model where the tag concept is extended to semantic tag. Semdrops is implemented as a Firefox add-on tool that turns the...

  7. The Founded Semantics and Constraint Semantics of Logic Rules

    OpenAIRE

    Liu, Yanhong A.; Stoller, Scott D.

    2016-01-01

    This paper describes a simple new semantics for logic rules, the founded semantics, and its straightforward extension to another simple new semantics, the constraint semantics. The new semantics support unrestricted negation, as well as unrestricted existential and universal quantifications. They are uniquely expressive and intuitive by allowing assumptions about the predicates and rules to be specified explicitly, are completely declarative and easy to understand, and relate cleanly to prior...

  8. ELN in the semantic era

    OpenAIRE

    Frey, Jeremy G.

    2006-01-01

    - The importance of semantics in human-computer and computer-computer communications
    - Capturing the laboratory processes and data in a semantically rich form at source
    - Implementing semantics: the use of the semantic web & grid
    - The importance of context in the use of ELNs
    - Publication and dissemination: using the information obtained with ELNs

  9. A Semantic Graph Query Language

    Energy Technology Data Exchange (ETDEWEB)

    Kaplan, I L

    2006-10-16

    Semantic graphs can be used to organize large amounts of information from a number of sources into one unified structure. A semantic query language provides a foundation for extracting information from the semantic graph. The graph query language described here provides a simple, powerful method for querying semantic graphs.
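
A simple but powerful graph query boils down to matching edge patterns joined on shared variables. The following toy matcher uses an invented syntax, not the query language described in the report; the data values are illustrative.

```python
# Sketch: conjunctive query over a semantic graph. Patterns are (s, p, o)
# triples; tokens starting with "?" are variables joined across patterns.
edges = [
    ("alice", "works_at", "LLNL"),
    ("bob",   "works_at", "LLNL"),
    ("alice", "authored", "report-42"),
]

def match(patterns, edges, binding=None):
    """Yield variable bindings satisfying all (s, p, o) patterns."""
    binding = binding or {}
    if not patterns:
        yield dict(binding)
        return
    (s, p, o), rest = patterns[0], patterns[1:]
    for es, ep, eo in edges:
        trial, ok = dict(binding), True
        for want, got in ((s, es), (p, ep), (o, eo)):
            if want.startswith("?"):
                if trial.setdefault(want, got) != got:
                    ok = False            # variable already bound elsewhere
                    break
            elif want != got:
                ok = False
                break
        if ok:
            yield from match(rest, edges, trial)

# Who works at LLNL and authored something?
print(list(match([("?x", "works_at", "LLNL"), ("?x", "authored", "?d")], edges)))
```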

  10. Wrapping and interoperating bioinformatics resources using CORBA.

    Science.gov (United States)

    Stevens, R; Miller, C

    2000-02-01

    Bioinformaticians seeking to provide services to working biologists are faced with the twin problems of distribution and diversity of resources. Bioinformatics databases are distributed around the world and exist in many kinds of storage forms, platforms and access paradigms. To provide adequate services to biologists, these distributed and diverse resources have to interoperate seamlessly within single applications. The Common Object Request Broker Architecture (CORBA) offers one technical solution to these problems. The key component of CORBA is its use of object orientation as an intermediate form to translate between different representations. This paper concentrates on an explanation of object orientation and how it can be used to overcome the problems of distribution and diversity by describing the interfaces between objects.
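
The object-orientation-as-intermediate-form idea can be sketched without any CORBA machinery: each diverse resource is wrapped behind one common interface, so client code is insulated from storage form and access paradigm. The class and method names below are illustrative, not from the paper.

```python
# Sketch: a common interface plays the role of a CORBA IDL interface, and
# each wrapper hides one resource's storage form and access paradigm.
from abc import ABC, abstractmethod

class SequenceResource(ABC):
    """Common interface seen by all clients (stands in for an IDL interface)."""
    @abstractmethod
    def fetch(self, accession: str) -> str: ...

class FlatFileBank(SequenceResource):
    """Wraps a local flat-file database held as a dict here."""
    def __init__(self, records):
        self._records = records
    def fetch(self, accession):
        return self._records[accession]

class RemoteBank(SequenceResource):
    """Would call a remote server in reality; stubbed for the sketch."""
    def fetch(self, accession):
        return f"remote-sequence-for-{accession}"

def analyse(resource: SequenceResource, accession: str) -> int:
    return len(resource.fetch(accession))   # client sees one interface only

print(analyse(FlatFileBank({"P12345": "MKTAYIAK"}), "P12345"))  # 8
print(analyse(RemoteBank(), "P12345"))
```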

  11. Interoperability of Standards for Robotics in CIME

    DEFF Research Database (Denmark)

    Kroszynski, Uri; Sørensen, Torben; Ludwig, Arnold;

    1997-01-01

    Esprit Project 6457 "Interoperability of Standards for Robotics in CIME (InterRob)" belongs to the Subprogramme "Integration in Manufacturing" of Esprit, the European Specific Programme for Research and Development in Information Technology supported by the European Commission. The first main goal of InterRob was to close the information chain between product design, simulation, programming, and robot control by developing standardized interfaces and their software implementation for the standards STEP (International Standard for the Exchange of Product model data, ISO 10303) and IRL (Industrial Robot Language, DIN 66312). This is a continuation of the previous Esprit projects CAD*I and NIRO, which developed substantial basics of STEP. The InterRob approach is based on standardized models for product geometry, kinematics, robotics, dynamics and control, hence on a coherent neutral information model...

  12. Towards Automatic Semantic Labelling of 3D City Models

    Science.gov (United States)

    Rook, M.; Biljecki, F.; Diakité, A. A.

    2016-10-01

    The lack of semantic information in many 3D city models is a considerable limiting factor in their use, as a lot of applications rely on semantics. Such information is not always available, since it is not collected at all times, it might be lost due to data transformation, or its lack may be caused by non-interoperability in data integration from other sources. This research is a first step in creating an automatic workflow that semantically labels a plain 3D city model, represented by a soup of polygons, with semantic and thematic information as defined in the CityGML standard. The first step involves the reconstruction of the topology, which is used in a region growing algorithm that clusters upward facing adjacent triangles. Heuristic rules, embedded in a decision tree, are used to compute a likeliness score for these regions that either represent the ground (terrain) or a RoofSurface. Regions with a high likeliness score for one of the two classes are used to create a decision space, which is used in a support vector machine (SVM). Next, topological relations are utilised to select seeds that function as a start in a region growing algorithm, to create regions of triangles of other semantic classes. The topological relationships of the regions are used in the aggregation of the thematic building features. Finally, the level of detail is detected to generate the correct output in CityGML. The results show an accuracy between 85 % and 99 % in the automatic semantic labelling on four different test datasets. The paper concludes by indicating problems and difficulties that imply the next steps in the research.
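
    The first stage, clustering upward-facing adjacent triangles by region growing, can be sketched as follows. The adjacency structure, normals, and threshold are invented for illustration; the later stages (SVM classification, CityGML output) are not shown.

```python
# Hedged sketch: region growing over adjacent upward-facing triangles.
from collections import deque

def upward(normal, threshold=0.9):
    # A triangle faces upward when the z-component of its normal dominates.
    return normal[2] > threshold

def grow_regions(normals, adjacency):
    """Cluster indices of mutually adjacent upward-facing triangles."""
    seen, regions = set(), []
    for start in range(len(normals)):
        if start in seen or not upward(normals[start]):
            continue
        region, queue = [], deque([start])
        seen.add(start)
        while queue:
            tri = queue.popleft()
            region.append(tri)
            for neighbour in adjacency[tri]:
                if neighbour not in seen and upward(normals[neighbour]):
                    seen.add(neighbour)
                    queue.append(neighbour)
        regions.append(sorted(region))
    return regions

normals = [(0, 0, 1.0), (0.1, 0, 0.99), (1.0, 0, 0), (0, 0.05, 0.99)]
adjacency = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(grow_regions(normals, adjacency))  # [[0, 1], [3]]: a wall splits two regions
```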

  13. A framework for semantic reconciliation of disparate earth observation thematic data

    Science.gov (United States)

    Durbha, S. S.; King, R. L.; Shah, V. P.; Younan, N. H.

    2009-04-01

    There is a growing demand for digital databases of topographic and thematic information for a multitude of applications in environmental management, and also in data integration and efficient updating of other spatially oriented data. These thematic data sets are highly heterogeneous in syntax, structure and semantics, as they are produced and provided by a variety of agencies having different definitions, standards and applications of the data. In this paper, we focus on the semantic heterogeneity in thematic information sources, as it has been widely recognized that semantic conflicts are responsible for the most serious data heterogeneity problems hindering efficient interoperability between heterogeneous information sources. In particular, we focus on the semantic heterogeneities present in the land cover classification schemes corresponding to the global land cover characterization data. We propose a framework (semantics enabled thematic data integration (SETI)) that describes in depth the methodology involved in the reconciliation of such semantic conflicts by adopting the emerging semantic web technologies. Ontologies were developed for the classification schemes, and a shared-ontology approach for integrating the application-level ontologies is described. We employ description logics (DL)-based reasoning on the terminological knowledge base developed for the land cover characterization, which enables querying and retrieval that goes beyond keyword-based searches.

  14. Arabic web pages clustering and annotation using semantic class features

    Directory of Open Access Journals (Sweden)

    Hanan M. Alghamdi

    2014-12-01

    Full Text Available Effectively managing the great amount of data on Arabic web pages and enabling the classification of relevant information are very important research problems. Studies on sentiment text mining have been very limited in the Arabic language because they need to involve deep semantic processing. Therefore, in this paper, we aim to retrieve machine-understandable data with the help of a Web content mining technique to detect covert knowledge within these data. We propose an approach to achieve clustering with semantic similarities. This approach comprises integrating k-means document clustering with semantic feature extraction and document vectorization to group Arabic web pages according to semantic similarities and then show the semantic annotation. The document vectorization helps to transform text documents into a semantic class probability distribution or semantic class density. To reach semantic similarities, the approach extracts the semantic class features and integrates them into the similarity weighting schema. The quality of the clustering result was evaluated using the purity and the mean intra-cluster distance (MICD) evaluation measures. We have evaluated the proposed approach on a set of common Arabic news web pages. We have acquired favorable clustering results that are effective in minimizing the MICD, increasing the purity and lowering the runtime.
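
    The vectorization-plus-clustering step can be sketched with a minimal k-means over documents represented as semantic-class probability distributions. The toy data, class count, and parameters below are invented; this is not the paper's feature extraction or weighting schema.

```python
# Minimal k-means sketch over semantic-class probability vectors.
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]  # random distinct seeds
    for _ in range(iters):
        # Assign each document to its nearest center (squared distance).
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)    # recompute centroids
    return labels

# Each row: P(semantic class | document) over 3 hypothetical classes.
docs = np.array([
    [0.90, 0.05, 0.05], [0.80, 0.10, 0.10],  # mostly class 0
    [0.10, 0.85, 0.05], [0.05, 0.90, 0.05],  # mostly class 1
])
labels = kmeans(docs, k=2)
print(labels)
```

    Documents dominated by the same semantic class end up in the same cluster, which is the grouping effect the abstract describes.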

  15. UML 2 Semantics and Applications

    CERN Document Server

    Lano, Kevin

    2009-01-01

    A coherent and integrated account of the leading UML 2 semantics work and the practical applications of UML semantics development With contributions from leading experts in the field, the book begins with an introduction to UML and goes on to offer in-depth and up-to-date coverage of: The role of semantics Considerations and rationale for a UML system model Definition of the UML system model UML descriptive semantics Axiomatic semantics of UML class diagrams The object constraint language Axiomatic semantics of state machines A coalgebraic semantic framework for reasoning about interaction des

  16. BIM Interoperability Limitations: Australian and Malaysian Rail Projects

    Directory of Open Access Journals (Sweden)

    Kenley Russell

    2016-01-01

    Full Text Available Building information modelling (BIM) is defined as a process involving the generation and management of digital representations of physical and functional characteristics of a facility. The purpose of interoperability in integrated or “open” BIM is to facilitate information exchange between different digital systems, models and tools. There has been effort towards data interoperability with the development of open source standards and object-oriented models, such as industry foundation classes (IFC) for vertical infrastructure. However, the lack of open data standards for information exchange for horizontal infrastructure limits the adoption and effectiveness of integrated BIM. The paper outlines two interoperability issues for the construction of rail infrastructure. The issues are presented in two case study reports, one from Australia and one from Malaysia. Each case study includes: a description of the project, the application of BIM in the project, a discussion of the promised BIM interoperability solution, plus the identification of the unresolved lack of interoperability for horizontal infrastructure project management. The Moreton Bay Rail project in Australia introduces general software interoperability issues. The Light Rail Extension project in Kuala Lumpur outlines an example of the integration problems related to two different location data structures. The paper highlights how the continuing lack of data interoperability limits utilisation of integrated BIM for horizontal infrastructure rail projects.

  17. Uniform Information Service Interoperation Framework among Heterogeneous Grids

    Directory of Open Access Journals (Sweden)

    Yongcai Tao

    2011-08-01

    Full Text Available Currently there is no practical standard for grid middleware: most grid platforms are built independently, and it is not easy to make them interoperate. Information service is one of the key components of a service-oriented grid system, and information service interoperation among heterogeneous grids is the first step towards grid interoperation. Existing grid interoperability projects mainly focus on immediate bridge mechanisms, which are intricate and of poor scalability. To address these issues, a uniform information service interoperation framework (UISIF) is proposed in this paper, which is based on the idea of mediated bridge mechanisms and adopts virtual layer and plug-in technology. There are two well-established grid systems in China, ChinaGrid and China National Grid, built with two different grid middleware, CGSP and VEGA. With UISIF, these two grid systems make information service interoperation a reality without changing the code of the current grid systems. The experiments also show that the information service interoperation has low information querying latency and high accuracy; moreover, UISIF has good scalability based on its hierarchical architecture, virtual layer and plug-in technology.
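
    The mediated-bridge idea can be sketched as per-grid plug-ins that translate native information records into one uniform schema exposed by a virtual layer. The record field names below are invented for illustration; they are not CGSP's or VEGA's actual schemas.

```python
# Sketch of a virtual layer with per-middleware plug-ins (hypothetical fields).
def cgsp_plugin(record):
    """Translate a (made-up) CGSP-native record into the uniform schema."""
    return {"service": record["svcName"], "host": record["node"]}

def vega_plugin(record):
    """Translate a (made-up) VEGA-native record into the uniform schema."""
    return {"service": record["name"], "host": record["endpoint"]}

PLUGINS = {"CGSP": cgsp_plugin, "VEGA": vega_plugin}

def uniform_query(sources):
    """Virtual layer: one query result schema across heterogeneous grids."""
    return [PLUGINS[grid](rec) for grid, rec in sources]

sources = [("CGSP", {"svcName": "compute", "node": "cg1"}),
           ("VEGA", {"name": "storage", "endpoint": "vg7"})]
print(uniform_query(sources))  # one uniform schema from two native formats
```

    Adding a third grid means registering one more plug-in, which is the scalability argument for mediation over pairwise bridges.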

  18. Reactive Kripke semantics

    CERN Document Server

    Gabbay, Dov M

    2013-01-01

    This text offers an extension to the traditional Kripke semantics for non-classical logics by adding the notion of reactivity. Reactive Kripke models change their accessibility relation as we progress in the evaluation process of formulas in the model. This feature makes the reactive Kripke semantics strictly stronger and more applicable than the traditional one. Here we investigate the properties and axiomatisations of this new and most effective semantics, and we offer a wide landscape of applications of the idea of reactivity. Applied topics include reactive automata, reactive grammars, rea

  19. Semantic Web Evaluation Challenge

    CERN Document Server

    2014-01-01

    This book constitutes the thoroughly refereed post conference proceedings of the first edition of the Semantic Web Evaluation Challenge, SemWebEval 2014, co-located with the 11th Extended Semantic Web conference, held in Anissaras, Crete, Greece, in May 2014. This book includes the descriptions of all methods and tools that competed at SemWebEval 2014, together with a detailed description of the tasks, evaluation procedures and datasets. The contributions are grouped in three areas: semantic publishing (sempub), concept-level sentiment analysis (ssa), and linked-data enabled recommender systems (recsys).

  20. Design and Implementation of an Interoperable Object Platform for Multi-Databases

    Institute of Scientific and Technical Information of China (English)

    GU Ning; XU Xuebiao; SHI Baile

    2000-01-01

    In this paper, the authors present the design and implementation of an Interoperable Object Platform for Multi-Databases (IOPMD). The aim of the system is to provide a uniform object view and a set of tools for object manipulation and query based on heterogeneous multiple data sources under client/server environment. The common object model is compatible with ODMG2.0 and OMG's CORBA, which provides main OO features such as OID, attribute, method, inheritance, reference, etc. Three types of interfaces, namely Vface, IOQL and C++ API, are given to provide the database programmer with tools and functionalities for application development. Nested transactions and compensating technology are adopted in transaction manager. In discussing some key implementation techniques, translation and mapping approaches from various schemata to a common object schema are proposed. Buffer management provides the data caching policy and consistency maintenance of cached data. Version management presents some operations based on the definitions in semantic version model, and introduces the implementation of the semantic version graph.

  1. Interoperability between biomedical ontologies through relation expansion, upper-level ontologies and automatic reasoning.

    Directory of Open Access Journals (Sweden)

    Robert Hoehndorf

    Full Text Available Researchers design ontologies as a means to accurately annotate and integrate experimental data across heterogeneous and disparate data- and knowledge bases. Formal ontologies make the semantics of terms and relations explicit such that automated reasoning can be used to verify the consistency of knowledge. However, many biomedical ontologies do not sufficiently formalize the semantics of their relations and are therefore limited with respect to automated reasoning for large scale data integration and knowledge discovery. We describe a method to improve automated reasoning over biomedical ontologies and identify several thousand contradictory class definitions. Our approach aligns terms in biomedical ontologies with foundational classes in a top-level ontology and formalizes composite relations as class expressions. We describe the semi-automated repair of contradictions and demonstrate expressive queries over interoperable ontologies. Our work forms an important cornerstone for data integration, automatic inference and knowledge discovery based on formal representations of knowledge. Our results and analysis software are available at http://bioonto.de/pmwiki.php/Main/ReasonableOntologies.

  2. Compactness theorems of fuzzy semantics

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The relationship among diverse fuzzy semantics and the corresponding logic consequence operators has been analyzed systematically. It has been proved, under certain conditions, that compactness and logical compactness of a fuzzy semantics are equivalent to compactness and continuity, respectively, of the logic consequence operator induced by the semantics. A general compactness theorem of fuzzy semantics has been established, which says that every fuzzy semantics defined on a free algebra with members corresponding to continuous functions is compact.

  3. Belief Semantics of Authorization Logic

    OpenAIRE

    Hirsch, Andrew K.; Clarkson, Michael R.

    2013-01-01

    Authorization logics have been used in the theory of computer security to reason about access control decisions. In this work, a formal belief semantics for authorization logics is given. The belief semantics is proved to subsume a standard Kripke semantics. The belief semantics yields a direct representation of principals' beliefs, without resorting to the technical machinery used in Kripke semantics. A proof system is given for the logic; that system is proved sound with respect to the beli...

  4. IoT interoperability: a hub-based approach

    OpenAIRE

    Blackstock, Michael; Lea, Rodger

    2014-01-01

    Interoperability in the Internet of Things is critical for emerging services and applications. In this paper we advocate the use of IoT ‘hubs’ to aggregate things using web protocols, and suggest a staged approach to interoperability. In the context of a UK government funded project involving 8 IoT projects to address cross-domain IoT interoperability, we introduce the HyperCat IoT catalogue specification. We then describe the tools and techniques we developed to adapt an existing data portal...

  5. Interoperability in Collaborative Processes: Requirements Characterisation and Proof Approach

    Science.gov (United States)

    Roque, Matthieu; Chapurlat, Vincent

    Interoperability problems which can occur during the collaboration between several enterprises can endanger this collaboration. Consequently, it is necessary to be able to anticipate these problems. The approach proposed in this paper is based on the specification of properties, representing interoperability requirements, and their analysis on enterprise models. Due to the conceptual limits of existing modeling languages, formalizing these requirements and translating them into properties requires adding conceptual enrichments to these languages. Finally, the analysis of the properties on enriched enterprise models, by formal checking techniques, aims to provide tools for reasoning on enterprise models in order to detect interoperability problems in an anticipative manner.

  6. Cloud portability and interoperability issues and current trends

    CERN Document Server

    Di Martino, Beniamino; Esposito, Antonio

    2015-01-01

    This book offers readers a quick, comprehensive and up-to-date overview of the most important methodologies, technologies, APIs and standards related to the portability and interoperability of cloud applications and services, illustrated by a number of use cases representing a variety of interoperability and portability scenarios. The lack of portability and interoperability between cloud platforms at different service levels is the main issue affecting cloud-based services today. The brokering, negotiation, management, monitoring and reconfiguration of cloud resources are challenging tasks

  7. A Semantic-Aware Data Management System for Seismic Engineering Research Projects and Experiments

    Directory of Open Access Journals (Sweden)

    Md. Rashedul Hasan

    2015-04-01

    Full Text Available The invention of the Semantic Web and related technologies is fostering a computing paradigm that entails a shift from databases to Knowledge Bases (KBs). At their core is the ontology, which plays a main role in enabling the reasoning power that can make implicit facts explicit in order to produce better results for users. In addition, KB-based systems provide mechanisms to manage information and its semantics, which can make systems semantically interoperable and as such able to exchange and share data between them. In order to overcome interoperability issues and to exploit the benefits offered by state-of-the-art technologies, we moved to a KB-based system. This paper presents the development of an earthquake engineering ontology with a focus on research project management and experiments. The developed ontology was validated by domain experts, published in RDF and integrated into WordNet. Data originating from scientific experiments such as cyclic and pseudo-dynamic tests were also published in RDF. We exploited the power of Semantic Web technologies, namely the Jena, Virtuoso and VirtGraph tools, in order to publish, store and manage RDF data, respectively. Finally, a system was developed with the full integration of ontology, experimental data and tools, to evaluate the effectiveness of the KB-based approach; it yielded favorable outcomes.
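
    Publishing experiment records as RDF can be sketched with a hand-rolled Turtle serializer (the project used Jena and Virtuoso; the namespace IRI and property names below are invented for illustration).

```python
# Hand-rolled sketch: one experiment record serialized as Turtle triples.
PREFIX = "@prefix eq: <http://example.org/earthquake#> .\n\n"  # hypothetical IRI

def to_turtle(subject, props):
    """Serialize one resource and its properties as a Turtle block."""
    body = " ;\n    ".join(
        f'eq:{p} "{v}"' if isinstance(v, str) else f"eq:{p} {v}"
        for p, v in props.items())
    return f"{PREFIX}eq:{subject}\n    {body} ."

doc = to_turtle("cyclicTest42", {
    "testType": "cyclic",
    "peakLoadKN": 250.5,
    "specimen": "RC column A",
})
print(doc)
```

    A real pipeline would emit this through an RDF library and load it into a triple store such as Virtuoso, but the triple structure is the same.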

  8. Live Social Semantics

    OpenAIRE

    Alani, Harith; Szomszor, Martin; Cattuto, Ciro; Van den Broeck, Wouter; Correndo, Gianluca; Barrat, Alain

    2009-01-01

    Social interactions are one of the key factors to the success of conferences and similar community gatherings. This paper describes a novel application that integrates data from the semantic web, online social networks, and a real-world contact sensing platform. This application was successfully deployed at ESWC09, and actively used by 139 people. Personal profiles of the participants were automatically generated using several Web 2.0 systems and semantic academic data sources, and integrated...

  9. The Semantic Web Languages

    OpenAIRE

    Giunchiglia, Fausto; Farazi, Feroz; Tanca, Letizia; Virgilio, Roberto

    2009-01-01

    The Semantic Web is basically an extension of the Web and of the Web-enabling database and Internet technology, and, as a consequence, the Semantic Web methodologies, representation mechanisms and logics strongly rely on those developed in databases. This is the motivation for many attempts to, more or less loosely, merge the two worlds like, for instance, the various proposals to use relational technology for storing web data or the use of ontologies for data integration. This article comes ...

  10. Semantics on Translation

    Institute of Scientific and Technical Information of China (English)

    李琦

    2014-01-01

    Semantics is the study of the meanings of words and sentences. The word is the most basic unit in every language, and the understanding of word meaning is the most important problem in translation. Therefore, the analysis of semantics provides a very direct approach to doing translation. In this paper, I'd like to focus on the three kinds of word meaning in translation, the ambiguities caused by word meaning, and how to deal with such ambiguities.

  11. Ontology-Based Semantic Cache in AOKB

    Institute of Scientific and Technical Information of China (English)

    郑红; 陆汝钤; 金芝; 胡思康

    2002-01-01

    When querying on a large-scale knowledge base, a major technique of improving performance is to preload knowledge to minimize the number of roundtrips to the knowledge base. In this paper, an ontology-based semantic cache is proposed for an agent and ontology-oriented knowledge base (AOKB). In AOKB, an ontology is the collection of relationships between a group of knowledge units (agents and/or other sub-ontologies). When loading some agent A, its relationships with other knowledge units are examined, and those who have a tight semantic tie with A will be preloaded at the same time, including agents and sub-ontologies in the same ontology where A is. The preloaded agents and ontologies are saved at a semantic cache located in the memory. Test results show that up to 50% reduction in running time is achieved.
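
    The preloading policy described in the abstract can be sketched as a cache that, on loading an agent, also fetches knowledge units with a tight semantic tie to it. The tie strengths, threshold, and names below are invented for illustration.

```python
# Toy sketch of semantic preloading: loading "A" also pulls in tightly tied units.
class SemanticCache:
    def __init__(self, relations, threshold=0.7):
        self.relations = relations   # agent -> {related agent: tie strength}
        self.threshold = threshold
        self.cache = {}

    def load(self, agent, fetch):
        if agent in self.cache:
            return self.cache[agent]              # cache hit: no roundtrip
        self.cache[agent] = fetch(agent)
        for other, tie in self.relations.get(agent, {}).items():
            if tie >= self.threshold and other not in self.cache:
                self.cache[other] = fetch(other)  # preload tightly tied unit
        return self.cache[agent]

relations = {"A": {"B": 0.9, "C": 0.3}}
fetched = []
def fetch(name):
    fetched.append(name)
    return f"<knowledge of {name}>"

cache = SemanticCache(relations)
cache.load("A", fetch)   # fetches A, preloads B (tie 0.9), skips C (tie 0.3)
cache.load("B", fetch)   # served from cache: no second roundtrip
print(fetched)  # ['A', 'B']
```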

  12. Towards Data Repository Interoperability: The Data Conservancy Data Packaging Specification

    Science.gov (United States)

    DiLauro, T.; Duerr, R.; Thessen, A. E.; Rippin, M.; Pralle, B.; Choudhury, G. S.

    2013-12-01

    description, the DCS instance will be able to provide default mappings for the directories and files within the package payload and enable support for deposited content at a lower level of service. Internally, the DCS will map these hybrid package serializations to its own internal business objects and their properties. Thus, this approach is highly extensible, as other packaging formats could be mapped in a similar manner. In addition, this scheme supports establishing the fixity of the payload while still supporting update of the semantic overlay data. This allows a data producer with scarce resources or an archivist who acquires a researcher's data to package the data for deposit with the intention of augmenting the resource description in the future. The Data Conservancy is partnering with the Sustainable Environment Actionable Data[4] project to test the interoperability of this new packaging mechanism. [1] Data Conservancy: http://dataconservancy.org/ [2] BagIt: https://datatracker.ietf.org/doc/draft-kunze-bagit/ [3] OAI-ORE: http://www.openarchives.org/ore/1.0/ [4] SEAD: http://sead-data.net/

  13. Study on Semantic Assets for Smart Appliances Interoperability : D-S4: FINAL REPORT

    NARCIS (Netherlands)

    Daniele, L.M.; Hartog, F.T.H. den; Roes, J.B.M.

    2015-01-01

    About two thirds of the energy consumed in buildings originates from household appliances. Nowadays, appliances are often intelligent and networked devices that form complete energy consuming, producing, and managing systems. Reducing energy consumption is therefore a matter of managing and optimizi

  14. Toward semantic interoperability of energy using and producing appliances in residential environments

    NARCIS (Netherlands)

    Hartog, F.T.H. den; Daniele, L.M.; Roes, J.B.M.

    2015-01-01

    About two thirds of the energy consumed in buildings originates from household appliances. Nowadays, appliances are often intelligent and networked devices that form complete energy consuming, producing, and managing systems. Reducing energy is therefore a matter of managing and optimizing the energy uti

  15. A Proof-of-Concept for Semantically Interoperable Federation of IoT Experimentation Facilities

    OpenAIRE

    Jorge Lanza; Luis Sanchez; David Gomez; Tarek Elsaleh; Ronald Steinke; Flavio Cirillo

    2016-01-01

    The Internet-of-Things (IoT) is unanimously identified as one of the main pillars of future smart scenarios. The potential of IoT technologies and deployments has been already demonstrated in a number of different application areas, including transport, energy, safety and healthcare. However, despite the growing number of IoT deployments, the majority of IoT applications tend to be self-contained, thereby forming application silos. A lightweight data centric integration and combination of the...

  16. Genericity versus expressivity - an exercise in semantic interoperable research information systems for Web Science

    NARCIS (Netherlands)

    Guéret, Christophe; Chambers, Tamy; Reijnhoudt, Linda; Most, Frank van der; Scharnhorst, Andrea

    2013-01-01

    The web does not only enable new forms of science, it also creates new possibilities to study science and new digital scholarship. This paper brings together multiple perspectives: from individual researchers seeking the best options to display their activities and market their skills on the academi

  17. Genericity versus expressivity - an exercise in semantic interoperable research information systems for Web Science

    OpenAIRE

    Guéret, Christophe; Chambers, Tamy; Reijnhoudt, Linda; Most, Frank van der; Scharnhorst, Andrea

    2013-01-01

    The web does not only enable new forms of science, it also creates new possibilities to study science and new digital scholarship. This paper brings together multiple perspectives: from individual researchers seeking the best options to display their activities and market their skills on the academic job market; to academic institutions, national funding agencies, and countries needing to monitor the science system and account for public money spending. We also address the research interests ...

  18. Clinical information modeling processes for semantic interoperability of electronic health records: systematic review and inductive analysis

    OpenAIRE

    Moreno-Conde, Alberto; Moner Cano, David; Da Cruz, Wellington Dimas; Santos, Marcelo R.; Maldonado Segura, José Alberto; Robles Viejo, Montserrat; KALRA, Dipak

    2015-01-01

    This is a pre-copyedited, author-produced PDF of an article accepted for publication in Journal of the American Medical Informatics Association following peer review. The version of record is available online at: http://dx.doi.org/10.1093/jamia/ocv008

  19. Semantic Interoperable Electronic Patient Records: The Unfolding of Consensus based Archetypes.

    Science.gov (United States)

    Pedersen, Rune; Wynn, Rolf; Ellingsen, Gunnar

    2015-01-01

    This paper is a status report from a large-scale openEHR-based EPR project from the North Norway Regional Health Authority, encouraged by the unfolding of a national repository for openEHR archetypes. Clinicians need to engage in, and be responsible for, the production of archetypes. The consensus processes have so far been challenged by a low number of active clinicians, a lack of critical specialties needed to reach consensus, and a cumbersome review process (3 or 4 review rounds) for each archetype. The goal is to have several clinicians from each specialty as a backup if one is unable to participate. Archetypes and their importance for structured data and sharing of information have to become more visible to clinicians through a sharpened information practice. PMID:25991124

  20. KYOTO: A Wiki for Establishing Semantic Interoperability for Knowledge Sharing Across Languages and Cultures

    OpenAIRE

    Marchetti, Andrea; Ronzano, Francesco; Tesconi, Maurizio; Vossen, Piek; Agirre, Eneko; Bond, Francis; Bosma, Wauter; Herold, Axel; Hicks, Amanda; Hsieh, Shu-Kai; Isahara, Hitoshi; Huang, Chu-Ren; Kanzaki, Kyoko; Rigau, German; Segers, Roxane

    2010-01-01

    KYOTO is an Asian-European project developing a community platform for modeling knowledge and finding facts across languages and cultures. The platform operates as a Wiki system that multilingual and multi-cultural communities can use to agree on the meaning of terms in specific domains. The Wiki is fed with terms that are automatically extracted from documents in different languages. The users can modify these terms and relate them across languages. The system generates complex, language-neu...

  1. Minimal-Length Interoperability Test Sequences Generation via Genetic Algorithm

    Institute of Scientific and Technical Information of China (English)

    ZHONG Ning; KUANG Jing-ming; HE Zun-wen

    2008-01-01

    A novel interoperability test sequences optimization scheme is proposed in which the genetic algorithm (GA) is used to obtain the minimal-length interoperability test sequences. During our work, the basic interoperability test sequences are generated based on the minimal-complete-coverage criterion, which removes the redundancy from conformance test sequences. Then the interoperability sequences minimization problem can be considered as an instance of the set covering problem, and the GA is applied to remove redundancy in interoperability transitions. The results show that compared to the conventional algorithm, the proposed algorithm is more practical to avoid the state space explosion problem, for it can reduce the length of the test sequences and maintain the same transition coverage.
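
    The set-cover formulation can be sketched with a minimal GA: a bit string selects candidate test sequences, and fitness rewards full transition coverage first and fewer sequences second. The transitions, candidate sequences, and GA parameters below are all invented toy data, not the paper's protocol model.

```python
# Minimal GA sketch for the set-cover view of test-sequence minimization.
import random

TRANSITIONS = {"t1", "t2", "t3", "t4", "t5"}
# candidate test sequence -> transitions it exercises (illustrative)
SEQS = {"s1": {"t1", "t2"}, "s2": {"t2", "t3"}, "s3": {"t3", "t4", "t5"},
        "s4": {"t1", "t4"}, "s5": {"t5"}}
NAMES = sorted(SEQS)

def fitness(bits):
    chosen = [NAMES[i] for i, b in enumerate(bits) if b]
    covered = set().union(*(SEQS[n] for n in chosen)) if chosen else set()
    missing = len(TRANSITIONS - covered)
    return missing * 100 + len(chosen)   # cover everything first, then shrink

def evolve(pop=30, gens=60, seed=1):
    rng = random.Random(seed)
    population = [[rng.randint(0, 1) for _ in NAMES] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness)
        parents = population[: pop // 2]         # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(NAMES))   # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(len(NAMES))
            child[i] ^= rng.random() < 0.2       # occasional bit-flip mutation
            children.append(child)
        population = parents + children
    return min(population, key=fitness)

best = evolve()
print([NAMES[i] for i, b in enumerate(best) if b])
```

    Elitism keeps the best covering subset ever found, so the result covers all transitions while the count of selected sequences shrinks over generations.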

  2. An Ontological Solution to Support Interoperability in the Textile Industry

    Science.gov (United States)

    Duque, Arantxa; Campos, Cristina; Jiménez-Ruiz, Ernesto; Chalmeta, Ricardo

    Significant developments in information and communication technologies and challenging market conditions have forced enterprises to adapt their way of doing business. In this context, providing mechanisms to guarantee interoperability among heterogeneous organisations has become a critical issue. Even though prolific research has already been conducted in the area of enterprise interoperability, we have found that enterprises still struggle to introduce fully interoperable solutions, especially, in terms of the development and application of ontologies. Thus, the aim of this paper is to introduce basic ontology concepts in a simple manner and to explain the advantages of the use of ontologies to improve interoperability. We will also present a case study showing the implementation of an application ontology for an enterprise in the textile/clothing sector.

  3. Patterns in Standards and Technologies for Economic Information Systems Interoperability

    Directory of Open Access Journals (Sweden)

    Vasile Irimia

    2012-06-01

    Full Text Available This paper presents results from a review of the current standards used for collaboration between economic information systems, including web services and service-oriented architecture, EDI, the ebXML framework, the RosettaNet framework, cXML, xCBL, UBL, BPMN, BPEL, WS-CDL, ASN.1, and others. Standards have a key role in promoting economic information system interoperability, and thus enable collaboration. Analyzing the current standards, technologies and applications used for economic information systems interoperability has revealed a common pattern that runs through all of them. From this pattern we construct a basic model of interoperability around which we relate and judge all standards, technologies and applications for economic information systems interoperability.

  4. A Proposed Information Architecture for Telehealth System Interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Warren, S.; Craft, R.L.; Parks, R.C.; Gallagher, L.K.; Garcia, R.J.; Funkhouser, D.R.

    1999-04-07

    Telemedicine technology is rapidly evolving. Whereas early telemedicine consultations relied primarily on video conferencing, consultations today may utilize video conferencing, medical peripherals, store-and-forward capabilities, electronic patient record management software, and/or a host of other emerging technologies. These remote care systems rely increasingly on distributed, collaborative information technology, in its many forms, during the care delivery process. While these leading-edge systems are bellwethers for highly advanced telemedicine, the remote care market today is still immature. Most telemedicine systems are custom-designed and do not interoperate with other commercial offerings. Users are limited to the set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. We propose a secure, object-oriented information architecture for telemedicine systems that promotes plug-and-play interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a lego-like fashion to achieve the desired device or system functionality. The architecture will support various ongoing standards work in the medical device arena.

  5. A Proposed Information Architecture for Telehealth System Interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Craft, R.L.; Funkhouser, D.R.; Gallagher, L.K.; Garcia, R.J.; Parks, R.C.; Warren, S.

    1999-04-20

    We propose an object-oriented information architecture for telemedicine systems that promotes secure "plug-and-play" interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a "lego-like" fashion to achieve the desired device or system functionality. Introduction: Telemedicine systems today rely increasingly on distributed, collaborative information technology during the care delivery process. While these leading-edge systems are bellwethers for highly advanced telemedicine, most are custom-designed and do not interoperate with other commercial offerings. Users are limited to the set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. This paper proposes a reference architecture for plug-and-play telemedicine systems that addresses these issues.

  6. MPEG-4 IPMP Extension for Interoperable Protection of Multimedia Content

    Directory of Open Access Journals (Sweden)

    Ming Ji

    2004-10-01

    Full Text Available To ensure secure content delivery, the Moving Picture Experts Group (MPEG) has dedicated significant effort to digital rights management (DRM) issues. MPEG is now moving from defining only hooks to proprietary systems (e.g., in MPEG-2 and MPEG-4 Version 1) to specifying a more encompassing standard in intellectual property management and protection (IPMP). MPEG feels that this is necessary in order to achieve MPEG's most important goal: interoperability. The design of the IPMP Extension framework also considers the complexity of the MPEG-4 standard and the diversity of its applications. This architecture leaves the details of the design of IPMP tools in the hands of application developers, while ensuring maximum flexibility and security. This paper first briefly describes the background of the development of the MPEG-4 IPMP Extension. It then presents an overview of the MPEG-4 IPMP Extension, including its architecture, the flexible protection signaling, and the secure messaging framework for the communication between the terminal and the tools. Two sample usage scenarios are also provided to illustrate how an MPEG-4 IPMP Extension compliant system works.

  7. AceWiki: A Natural and Expressive Semantic Wiki

    CERN Document Server

    Kuhn, Tobias

    2008-01-01

    We present AceWiki, a prototype of a new kind of semantic wiki using the controlled natural language Attempto Controlled English (ACE) for representing its content. ACE is a subset of English with a restricted grammar and a formal semantics. The use of ACE has two important advantages over existing semantic wikis. First, we can improve the usability and achieve a shallow learning curve. Second, ACE is more expressive than the formal languages of existing semantic wikis. Our evaluation shows that people who are not familiar with the formal foundations of the Semantic Web are able to deal with AceWiki after a very short learning phase and without the help of an expert.

  8. Frame semantics-based study of verbs across medical genres.

    Science.gov (United States)

    Wandji Tchami, Ornella; L'Homme, Marie-Claude; Grabar, Natalia

    2014-01-01

    The field of medicine gathers actors with different levels of expertise. These actors must interact, although their mutual understanding is not always completely successful. We propose to study corpora (with high and low levels of expertise) in order to observe their specificities. More specifically, we perform a contrastive analysis of verbs, and of the syntactic and semantic features of their participants, based on the Frame Semantics framework and the methodology implemented in FrameNet. In order to achieve this, we use an existing medical terminology to automatically annotate the semantic classes of the participants of verbs, which we assume are indicative of semantic roles. Our results indicate that verbs show similar or very close semantics in some contexts, while in other contexts they behave differently. These results are important for studying the understanding of medical information by patients and for improving the communication between patients and medical doctors.

  9. Semantics in NETMAR (open service NETwork for MARine environmental data)

    Science.gov (United States)

    Leadbetter, Adam; Lowry, Roy; Clements, Oliver

    2010-05-01

    Over recent years, there has been a proliferation of environmental data portals utilising a wide range of systems and services, many of which cannot interoperate. The European Union Framework 7 project NETMAR (that commenced February 2010) aims to provide a toolkit for building such portals in a coherent manner through the use of chained Open Geospatial Consortium Web Services (WxS), OPeNDAP file access and W3C standards controlled by a Business Process Execution Language workflow. As such, the end product will be configurable by user communities interested in developing a portal for marine environmental data, and will offer search, download and integration tools for a range of satellite, model and observed data from open ocean and coastal areas. Further processing of these data will also be available in order to provide statistics and derived products suitable for decision making in the chosen environmental domain. In order to make the resulting portals truly interoperable, the NETMAR programme requires a detailed definition of the semantics of the services being called and the data which are being requested. A key goal of the NETMAR programme is, therefore, to develop a multi-domain and multilingual ontology of marine data and services. This will allow searches across both human languages and across scientific domains. The approach taken will be to analyse existing semantic resources and provide mappings between them, gluing together the definitions, semantics and workflows of the WxS services. The mappings between terms aim to be more general than the standard "narrower than", "broader than" type seen in the thesauri or simple ontologies implemented by previous programmes. Tools for the development and population of ontologies will also be provided by NETMAR, as there will be instances in which existing resources cannot sufficiently describe newly encountered data or services.
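The term mappings this record describes can be sketched in a few lines. The following Python is a toy illustration, not NETMAR code: the vocabulary names, terms, and relation labels are invented, and the relation labels loosely follow the SKOS style ("exactMatch", "broaderThan").

```python
# Toy cross-vocabulary mapping table: (source term, relation, target term).
# All vocabularies and terms below are invented for illustration.
MAPPINGS = [
    ("vocabA:SeaSurfaceTemperature", "exactMatch",  "vocabB:SST"),
    ("vocabB:SST",                   "broaderThan", "vocabB:SkinSST"),
    ("vocabA:SeaSurfaceTemperature", "relatedTo",   "vocabC:OceanHeatContent"),
]

def matches(term, relations=("exactMatch",)):
    """Follow the selected relation types in both directions, transitively."""
    found, frontier = {term}, {term}
    while frontier:
        nxt = set()
        for source, rel, target in MAPPINGS:
            if rel in relations:
                if source in frontier and target not in found:
                    nxt.add(target)
                if target in frontier and source not in found:
                    nxt.add(source)
        found |= nxt
        frontier = nxt
    return found

# A search phrased in vocabA's term also retrieves data tagged with vocabB's term.
hits = matches("vocabA:SeaSurfaceTemperature")
```

Widening the `relations` argument to include "broaderThan" or "relatedTo" broadens the search in the way the record contrasts with plain thesaurus lookups.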

  10. Processing biological literature with customizable Web services supporting interoperable formats

    OpenAIRE

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specific...

  11. Ensuring Sustainable Data Interoperability Across the Natural and Social Sciences

    Science.gov (United States)

    Downs, R. R.; Chen, R. S.

    2015-12-01

    Both the natural and social science data communities are attempting to address the long-term sustainability of their data infrastructures in rapidly changing research, technological, and policy environments. Many parts of these communities are also considering how to improve the interoperability and integration of their data and systems across natural, social, health, and other domains. However, these efforts have generally been undertaken in parallel, with little thought about how different sustainability approaches may impact long-term interoperability from scientific, legal, or economic perspectives, or vice versa, i.e., how improved interoperability could enhance—or threaten—infrastructure sustainability. Scientific progress depends substantially on the ability to learn from the legacy of previous work available for current and future scientists to study, often by integrating disparate data not previously assembled. Digital data are less likely than scientific publications to be usable in the future unless they are managed by science-oriented repositories that can support long-term data access with the documentation and services needed for future interoperability. We summarize recent discussions in the social and natural science communities on emerging approaches to sustainability and relevant interoperability activities, including efforts by the Belmont Forum E-Infrastructures project to address global change data infrastructure needs; the Group on Earth Observations to further implement data sharing and improve data management across diverse societal benefit areas; and the Research Data Alliance to develop legal interoperability principles and guidelines and to address challenges faced by domain repositories. We also examine emerging needs for data interoperability in the context of the post-2015 development agenda and the expected set of Sustainable Development Goals (SDGs), which set ambitious targets for sustainable development, poverty reduction, and

  12. Supply chain business patterns definition for process interoperability

    OpenAIRE

    Yahia, Esma; Bigand, Michel; Bourey, Jean Pierre; Castelain, Emmanuel

    2009-01-01

    International audience; In the framework of international trading, new regulations are being drawn up concerning safety and in order to prohibit counterfeit goods. Due to the short delays imposed by Customs, the trend is still toward paperless trading. As a consequence, a better process and software interoperability is needed between the different actors of trading (customer, supplier, Customs...); a first step in software interoperability consists in process modeling. This paper presents a p...

  13. Client-based CardSpace-Shibboleth Interoperation

    OpenAIRE

    Al-Sinani, Haitham; Mitchell, Chris J

    2012-01-01

    Whilst the growing number of identity management systems have the potential to reduce the threat of identity attacks, major deployment problems remain because of the lack of interoperability between such systems. In this paper we propose a simple, novel scheme to provide interoperability between two of the most widely discussed identity systems, namely CardSpace and Shibboleth. In this scheme, CardSpace users are able to obtain an assertion token from a Shibboleth-enabled identity provider...

  14. A Collaborative System Software Solution for Modeling Business Flows Based on Automated Semantic Web Service Composition

    Directory of Open Access Journals (Sweden)

    Ion SMEUREANU

    2009-01-01

    Full Text Available Nowadays, business interoperability is one of the key factors for assuring competitive advantage for the participating business partners. In order to implement business cooperation, scalable, distributed and portable collaborative systems have to be implemented. This article presents some of the most widely used technologies in this field. Furthermore, it presents a software application architecture based on the Business Process Modeling Notation standard and automated semantic web service coupling for modeling business flows in a collaborative manner. The main business processes will be represented in a single, hierarchic flow diagram. Each element of the diagram will represent calls to semantic web services. The business logic (the business rules and constraints) will be structured with the help of OWL (Web Ontology Language). Moreover, OWL will also be used to create the semantic web service specifications.

  15. A semantically-aided approach for online annotation and retrieval of medical images.

    Science.gov (United States)

    Kyriazos, George K; Gerostathopoulos, Ilias Th; Kolias, Vassileios D; Stoitsis, John S; Nikita, Konstantina S

    2011-01-01

    The need for annotating the continuously increasing volume of medical image data is recognized by medical experts for a variety of purposes, whether in medical practice, research or education. The rich information content latent in medical images can be made explicit and formal with the use of well-defined ontologies. The evolution of the Semantic Web now offers a unique opportunity for a web-based, service-oriented approach. Remote access to the FMA and ICD-10 reference ontologies provides the ontological annotation framework. The proposed system utilizes this infrastructure to provide a customizable and robust annotation procedure. It also provides an intelligent search mechanism, indicating the advantages of semantic over keyword search. The common representation layer discussed facilitates interoperability between institutions and systems, while semantic content enables inference and knowledge integration.
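The advantage of semantic over keyword search that this record mentions comes from expanding a query concept to its subclasses before matching. A minimal, hypothetical Python sketch (the tiny hierarchy, image names, and concept labels are invented; they are not actual FMA or ICD-10 content):

```python
# Invented concept hierarchy and annotations, for illustration only.
SUBCLASSES = {
    "HeartDisease": ["MyocardialInfarction", "Cardiomyopathy"],
    "MyocardialInfarction": [],
    "Cardiomyopathy": [],
}

ANNOTATIONS = {
    "img_001.dcm": {"MyocardialInfarction"},
    "img_002.dcm": {"BoneFracture"},
}

def expand(concept):
    """Return the concept plus all of its (transitive) subclasses."""
    result = {concept}
    for child in SUBCLASSES.get(concept, []):
        result |= expand(child)
    return result

def semantic_search(concept):
    wanted = expand(concept)
    return sorted(img for img, tags in ANNOTATIONS.items() if tags & wanted)

def keyword_search(term):
    return sorted(img for img, tags in ANNOTATIONS.items() if term in tags)
```

A keyword search for "HeartDisease" misses the infarction image entirely, while the semantic search finds it through the ontology, which is the inference benefit the record points to.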

  16. The HDF Product Designer - Interoperability in the First Mile

    Science.gov (United States)

    Lee, H.; Jelenak, A.; Habermann, T.

    2014-12-01

    Interoperable data have been a long-time goal in many scientific communities. The recent growth in analysis, visualization and mash-up applications that expect data stored in a standardized manner has brought the interoperability issue to the fore. On the other hand, producing interoperable data is often regarded as a sideline task in a typical research team for which resources are not readily available. The HDF Group is developing a software tool aimed at lessening the burden of creating data in standards-compliant, interoperable HDF5 files. The tool, named HDF Product Designer, lowers the threshold needed to design such files by providing a user interface that combines the rich HDF5 feature set with applicable metadata conventions. Users can quickly devise new HDF5 files while at the same time seamlessly incorporating the latest best practices and conventions from their community. That is what the term interoperability in the first mile means: enabling generation of interoperable data in HDF5 files from the onset of their production. The tool also incorporates collaborative features, allowing team approach in the file design, as well as easy transfer of best practices as they are being developed. The current state of the tool and the plans for future development will be presented. Constructive input from interested parties is always welcome.

  17. Foundations of semantic web technologies

    CERN Document Server

    Hitzler, Pascal; Rudolph, Sebastian

    2009-01-01

    The Quest for Semantics: Building Models; Calculating with Knowledge; Exchanging Information; Semantic Web Technologies. Resource Description Language (RDF): Simple Ontologies in RDF and RDF Schema; Introduction to RDF; Syntax for RDF; Advanced Features; Simple Ontologies in RDF Schema; Encoding of Special Data Structures; An Example; RDF Formal Semantics; Why Semantics?; Model-Theoretic Semantics for RDF(S); Syntactic Reasoning with Deduction Rules; The Semantic Limits of RDF(S). Web Ontology Language (OWL): Ontologies in OWL; OWL Syntax and Intuitive Semantics; OWL Species; The Forthcoming OWL 2 Standard; OWL Formal Sem

  18. AliEn - EDG Interoperability in ALICE

    CERN Document Server

    Bagnasco, S; Buncic, P; Carminati, F; Cerello, P G; Saiz, P

    2003-01-01

    AliEn (ALICE Environment) is a GRID-like system for large scale job submission and distributed data management developed and used in the context of ALICE, the CERN LHC heavy-ion experiment. With the aim of exploiting upcoming Grid resources to run AliEn-managed jobs and store the produced data, the problem of AliEn-EDG interoperability was addressed and an in-terface was designed. One or more EDG (European Data Grid) User Interface machines run the AliEn software suite (Cluster Monitor, Storage Element and Computing Element), and act as interface nodes between the systems. An EDG Resource Broker is seen by the AliEn server as a single Computing Element, while the EDG storage is seen by AliEn as a single, large Storage Element; files produced in EDG sites are registered in both the EDG Replica Catalogue and in the AliEn Data Catalogue, thus ensuring accessibility from both worlds. In fact, both registrations are required: the AliEn one is used for the data management, the EDG one to guarantee the integrity and...

  19. Interoperable Data Sharing for Diverse Scientific Disciplines

    Science.gov (United States)

    Hughes, John S.; Crichton, Daniel; Martinez, Santa; Law, Emily; Hardman, Sean

    2016-04-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework using ontologies and ISO level archive and metadata registry reference models. This framework provides multi-level governance, evolves independent of implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation framework is populated through knowledge acquisition from discipline experts. It is also extended to meet specific discipline requirements. The result is a formalized and rigorous knowledge base that addresses data representation, integrity, provenance, context, quantity, and their relationships within the community. The contents of the knowledge base are translated and written to files in appropriate formats to configure system software and services, provide user documentation, validate ingested data, and support data analytics. This presentation will provide an overview of the framework, present the Planetary Data System's PDS4 as a use case that has been adopted by the international planetary science community, describe how the framework is being applied to other disciplines, and share some important lessons learned.

  20. Recent ARC developments: Through modularity to interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Smirnova, O; Cameron, D; Ellert, M; Groenager, M; Johansson, D; Kleist, J [NDGF, Kastruplundsgade 22, DK-2770 Kastrup (Denmark); Dobe, P; Joenemo, J; Konya, B [Lund University, Experimental High Energy Physics, Institute of Physics, Box 118, SE-22100 Lund (Sweden); Fraagaat, T; Konstantinov, A; Nilsen, J K; Saada, F Ould; Qiang, W; Read, A [University of Oslo, Department of Physics, P. O. Box 1048, Blindern, N-0316 Oslo (Norway); Kocan, M [Pavol Jozef Safarik University, Faculty of Science, Jesenna 5, SK-04000 Kosice (Slovakia); Marton, I; Nagy, Zs [NIIF/HUNGARNET, Victor Hugo 18-22, H-1132 Budapest (Hungary); Moeller, S [University of Luebeck, Inst. Of Neuro- and Bioinformatics, Ratzeburger Allee 160, D-23538 Luebeck (Germany); Mohn, B, E-mail: oxana.smirnova@hep.lu.s [Uppsala University, Department of Physics and Astronomy, Div. of Nuclear and Particle Physics, Box 535, SE-75121 Uppsala (Sweden)

    2010-04-01

    The Advanced Resource Connector (ARC) middleware introduced by NorduGrid is one of the basic Grid solutions used by scientists worldwide. While being well-proven in daily use by a wide variety of scientific applications at large-scale infrastructures like the Nordic DataGrid Facility (NDGF) and smaller scale projects, production ARC of today is still largely based on conventional Grid technologies and custom interfaces introduced a decade ago. In order to guarantee sustainability, true cross-system portability and standards-compliance based interoperability, the ARC community undertakes a massive effort of implementing modular Web Service (WS) approach into the middleware. With support from the EU KnowARC project, new components were introduced and the existing key ARC services got extended with WS technology based standard-compliant interfaces following a service-oriented architecture. Such components include the hosting environment framework, the resource-coupled execution service, the re-engineered client library, the self-healing storage solution and the peer-to-peer information system, to name a few. Gradual introduction of these new services and client tools into the production middleware releases is carried out together with NDGF and thus ensures a smooth transition to the next generation Grid middleware. Standard interfaces and modularity of the new component design are essential for ARC contributions to the planned Universal Middleware Distribution of the European Grid Initiative.

  1. The advanced microgrid. Integration and interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Bower, Ward Isaac [Ward Bower Innovations, LLC, Albuquerque, NM (United Staes); Ton, Dan T. [U.S. Dept. of Energy, Washington, DC (United States); Guttromson, Ross [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Glover, Steven F [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stamp, Jason Edwin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bhatnagar, Dhruv [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Reilly, Jim [Reily Associates, Pittston, PA (United States)

    2014-02-01

    This white paper focuses on "advanced microgrids," but sections do, out of necessity, reference today's commercially available systems and installations in order to clearly distinguish the differences and advances. Advanced microgrids have been identified as a necessary part of the modern electrical grid through two DOE microgrid workshops, the National Institute of Standards and Technology's Smart Grid Interoperability Panel, and other related sources. With their grid-interconnectivity advantages, advanced microgrids will improve system energy efficiency and reliability and provide enabling technologies for grid-independence to end-user sites. One popular definition that has evolved and is used in multiple references is that a microgrid is a group of interconnected loads and distributed-energy resources within clearly defined electrical boundaries that acts as a single controllable entity with respect to the grid. A microgrid can connect to and disconnect from the grid, enabling it to operate in either grid-connected or island mode. Further, an advanced microgrid can then be loosely defined as a dynamic microgrid.

  2. Recent ARC developments: Through modularity to interoperability

    International Nuclear Information System (INIS)

    The Advanced Resource Connector (ARC) middleware introduced by NorduGrid is one of the basic Grid solutions used by scientists worldwide. While being well-proven in daily use by a wide variety of scientific applications at large-scale infrastructures like the Nordic DataGrid Facility (NDGF) and smaller scale projects, production ARC of today is still largely based on conventional Grid technologies and custom interfaces introduced a decade ago. In order to guarantee sustainability, true cross-system portability and standards-compliance based interoperability, the ARC community undertakes a massive effort of implementing modular Web Service (WS) approach into the middleware. With support from the EU KnowARC project, new components were introduced and the existing key ARC services got extended with WS technology based standard-compliant interfaces following a service-oriented architecture. Such components include the hosting environment framework, the resource-coupled execution service, the re-engineered client library, the self-healing storage solution and the peer-to-peer information system, to name a few. Gradual introduction of these new services and client tools into the production middleware releases is carried out together with NDGF and thus ensures a smooth transition to the next generation Grid middleware. Standard interfaces and modularity of the new component design are essential for ARC contributions to the planned Universal Middleware Distribution of the European Grid Initiative.

  3. Image Semantic Automatic Annotation by Relevance Feedback

    Institute of Scientific and Technical Information of China (English)

    ZHANG Tong-zhen; SHEN Rui-min

    2007-01-01

    A large semantic gap exists between content-based image retrieval (CBIR) and high-level semantics; additional semantic information must therefore be attached to the images. This involves three aspects: the semantic representation model, semantic information building, and semantic retrieval techniques. In this paper, we introduce an associated semantic network and an automatic semantic annotation system. In the system, a semantic network model is employed as the semantic representation model; it uses semantic keywords, a linguistic ontology and low-level features in semantic similarity calculation. Through several rounds of users' relevance feedback, the semantic network is enriched automatically. To speed up the growth of the semantic network and obtain balanced annotation, semantic seeds and semantic loners are employed.
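The relevance-feedback enrichment this record describes can be illustrated with a toy update rule. The rule below is invented for the sketch (the paper's actual similarity calculation is richer): links between keywords and images that users confirm are strengthened, and links they reject are weakened.

```python
# Hypothetical relevance-feedback loop over keyword-image association weights.
weights = {}  # (keyword, image) -> association strength in [0, 1]

def feedback(keyword, image, relevant, lr=0.2):
    """Nudge the association toward 1.0 (relevant) or 0.0 (not relevant)."""
    w = weights.get((keyword, image), 0.5)  # unknown links start neutral
    target = 1.0 if relevant else 0.0
    weights[(keyword, image)] = w + lr * (target - w)

# Three users confirm one link; one user rejects another.
for _ in range(3):
    feedback("beach", "img7.jpg", relevant=True)
feedback("beach", "img9.jpg", relevant=False)
```

After a few rounds the confirmed link dominates ranking, which is how the semantic network grows without manual annotation of every image.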

  4. Key pillars of data interoperability in Earth Sciences - INSPIRE and beyond

    Science.gov (United States)

    Tomas, Robert; Lutz, Michael

    2013-04-01

    The well-known heterogeneity and fragmentation of data models, formats and controlled vocabularies of environmental data limit potential data users from utilising the wealth of environmental information available today across Europe. The main aim of INSPIRE1 is to improve this situation and give users the possibility to access, use and correctly interpret environmental data. Over the past years, a number of INSPIRE technical guidelines (TG) and implementing rules (IR) for interoperability have been developed, involving hundreds of domain experts from across Europe. The data interoperability specifications, which have been developed for all 34 INSPIRE spatial data themes2, are the central component of the TG and IR. Several of these themes are related to the earth sciences, e.g. geology (including hydrogeology, geophysics and geomorphology), mineral and energy resources, soil science, natural hazards, meteorology, oceanography, hydrology and land cover. The following main pillars for data interoperability and harmonisation have been identified during the development of the specifications. Conceptual data models describe the spatial objects and their properties and relationships for the different spatial data themes. To achieve cross-domain harmonization, the data models for all themes are based on a common modelling framework (the INSPIRE Generic Conceptual Model3) and managed in a common UML repository. Harmonised vocabularies (or code lists) are to be used in data exchange in order to overcome interoperability issues caused by heterogeneous free-text and/or multilingual content. Since a mapping to a harmonized vocabulary can be difficult, the INSPIRE data models typically allow the provision of more specific terms from local vocabularies in addition to the harmonized terms, utilizing either the extensibility options or additional terminological attributes. Encoding: currently, specific XML profiles of the Geography Markup Language (GML) are promoted as the standard
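The harmonised-vocabulary pillar described in this record amounts to mapping local free-text values onto a shared code list while keeping the local term alongside. A minimal Python sketch of that pattern (the code-list values and local terms below are invented examples, not an actual INSPIRE code list):

```python
# Invented harmonised code list and local-to-harmonised mapping.
HARMONISED = {"sandstone", "limestone", "granite"}
LOCAL_TO_HARMONISED = {"gres": "sandstone", "kalkstein": "limestone"}

def harmonise(local_term):
    """Attach a harmonised code where a mapping exists; keep the local term."""
    code = LOCAL_TO_HARMONISED.get(local_term.lower())
    return {
        "harmonisedValue": code,   # None when no mapping is known
        "localValue": local_term,  # always preserved, as the record describes
    }
```

Keeping the local value even when harmonisation fails mirrors the extensibility option the specifications allow: data providers lose no information while consumers gain a common search key.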

  5. Semantic Web Mining: Benefits, Challenges and Opportunities

    OpenAIRE

    Syeda Farha Shazmeen, Etyala Ramyasree

    2012-01-01

    Semantic Web Mining aims at combining the two areas of Semantic Web and Web Mining, by using semantics to improve mining and using mining to create semantics. Web Mining aims at discovering insights about the meaning of Web resources and their usage. In the Semantic Web, semantic information is represented through relations with other resources and is recorded in RDF, a semantic web technology that can be utilized to build efficient and scalable systems for the Cloud. The Semantic Web enriches the Worl...

  6. COEUS: “semantic web in a box” for biomedical applications

    Directory of Open Access Journals (Sweden)

    Lopes Pedro

    2012-12-01

    Full Text Available Abstract Background As the “omics” revolution unfolds, the growth in data quantity and diversity is bringing about the need for pioneering bioinformatics software, capable of significantly improving the research workflow. To cope with these computer science demands, biomedical software engineers are adopting emerging semantic web technologies that better suit the life sciences domain. The latter’s complex relationships are easily mapped into semantic web graphs, enabling a superior understanding of collected knowledge. Despite increased awareness of semantic web technologies in bioinformatics, their use is still limited. Results COEUS is a new semantic web framework, aiming at a streamlined application development cycle and following a “semantic web in a box” approach. The framework provides a single package including advanced data integration and triplification tools, base ontologies, a web-oriented engine and a flexible exploration API. Resources can be integrated from heterogeneous sources, including CSV and XML files or SQL and SPARQL query results, and mapped directly to one or more ontologies. Advanced interoperability features include REST services, a SPARQL endpoint and LinkedData publication. These enable the creation of multiple applications for web, desktop or mobile environments, and empower a new knowledge federation layer. Conclusions The platform, targeted at biomedical application developers, provides a complete skeleton ready for rapid application deployment, enhancing the creation of new semantic information systems. COEUS is available as open source at http://bioinformatics.ua.pt/coeus/.
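The "triplification" of heterogeneous sources that this record mentions can be reduced to a simple idea: each row of a tabular source becomes a set of subject-predicate-object triples. The sketch below is a rough illustration in Python, not COEUS's actual pipeline; the base URI, predicate names, and CSV columns are invented placeholders.

```python
# Rough sketch: turn CSV rows into RDF-style triples.
import csv
import io

CSV_DATA = "id,name,organism\nP1,BRCA1,human\nP2,TP53,human\n"

def triplify(csv_text, base="http://example.org/"):
    """Map each row to triples: (row URI, column URI, cell value)."""
    triples = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        subject = base + row.pop("id")  # the id column names the subject
        for column, value in row.items():
            triples.append((subject, base + column, value))
    return triples

triples = triplify(CSV_DATA)
```

Once data from CSV, XML or SQL sources are expressed as triples against shared ontology terms, they can be queried uniformly, which is what makes a SPARQL endpoint over the integrated knowledge possible.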

  7. Semantic Changes of Gerund

    Directory of Open Access Journals (Sweden)

    Zofija Babickienė

    2012-06-01

    Full Text Available In this article, semantic models of the gerund in the Lithuanian language are investigated, and their productivity and the reasons for their change in the Lithuanian language are identified. The tendency to use the gerund's semantic structure in noun constructions is typical not only of Greek or Latin but also of English, Russian, etc. Regular polysemy is regarded as semantic derivation, i.e. shifting from main meanings to derivative ones. The object of this investigation is the usage patterns of gerunds, which bear both the meaning of a verb and that of a noun. The examples for the present study have been gathered from the language of different Lithuanian dialects as well as from the Dictionary of the Lithuanian Language (different volumes), etc. The research results reveal that semantic changes of object and result are the most productive, whereas the mood or time semantic model proved to be less productive. The productivity of regular models depends on the fact that there are suffix derivatives which have the meaning of a result. The research shows that scientific style and the language of different dialects are rich in the use of the gerund.

  8. Semantic home video categorization

    Science.gov (United States)

    Min, Hyun-Seok; Lee, Young Bok; De Neve, Wesley; Ro, Yong Man

    2009-02-01

    Nowadays, a strong need exists for the efficient organization of an increasing amount of home video content. To create an efficient system for the management of home video content, it is required to categorize home video content in a semantic way. So far, a significant amount of research has already been dedicated to semantic video categorization. However, conventional categorization approaches often rely on unnecessary concepts and complicated algorithms that are not suited to the context of home video categorization. To overcome this problem, this paper proposes a novel home video categorization method that adopts semantic home photo categorization. To use home photo categorization in the context of home video, we segment video content into shots and extract key frames that represent each shot. To extract the semantics from key frames, we divide each key frame into ten local regions and extract low-level features. Based on the low-level features extracted for each local region, we can predict the semantics of a particular key frame. To verify the usefulness of the proposed home video categorization method, experiments were performed with 70 home video sequences, labeled with concepts from the MPEG-7 VCE2 dataset. For the home video sequences used, the proposed system produced a recall of 77% and an accuracy of 78%.

  9. Semantic Parameters of Split Intransitivity.

    Science.gov (United States)

    Van Valin, Jr., Robert D.

    1990-01-01

    This paper argues that split-intransitive phenomena are better explained in semantic terms. A semantic analysis is carried out in Role and Reference Grammar, which assumes the theory of verb classification proposed in Dowty 1979. (49 references) (JL)

  10. A semantically-aided architecture for a web-based monitoring system for carotid atherosclerosis.

    Science.gov (United States)

    Kolias, Vassileios D; Stamou, Giorgos; Golemati, Spyretta; Stoitsis, Giannis; Gkekas, Christos D; Liapis, Christos D; Nikita, Konstantina S

    2015-08-01

    Carotid atherosclerosis is a multifactorial disease, and its clinical diagnosis depends on the evaluation of heterogeneous clinical data, such as imaging exams, biochemical tests and the patient's clinical history. The lack of interoperability between Health Information Systems (HIS) does not allow physicians to acquire all the necessary data for the diagnostic process. In this paper, a semantically-aided architecture is proposed for a web-based monitoring system for carotid atherosclerosis that is able to gather and unify heterogeneous data with the use of an ontology and to create a common interface for data access, enhancing the interoperability of HIS. The architecture is based on an application ontology of carotid atherosclerosis that is used to (a) integrate heterogeneous data sources on the basis of semantic representation and ontological reasoning and (b) access the critical information using SPARQL query rewriting and ontology-based data access services. The architecture was tested over a carotid atherosclerosis dataset consisting of the imaging exams and the clinical profiles of 233 patients, using a set of complex queries constructed by the physicians. The proposed architecture was evaluated with respect to the complexity of the queries that the physicians could make and the retrieval speed. It gave promising results in terms of interoperability, data integration of heterogeneous sources in an ontological way, and expanded query and retrieval capabilities in HIS.
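The core idea of ontology-based query rewriting, as used in architectures like the one above, is that a query over a general ontology term is expanded into its subclasses before matching records. The tiny ontology and patient data below are invented purely for illustration; a real system would rewrite SPARQL against an OWL ontology.

```python
# Illustrative subclass hierarchy (child -> parent); not a real medical ontology.
SUBCLASS = {
    "stenosis": "arterial_finding",
    "plaque": "arterial_finding",
}

# Invented patient records standing in for heterogeneous HIS data.
PATIENTS = [
    {"id": 233, "finding": "plaque"},
    {"id": 101, "finding": "stenosis"},
    {"id": 55, "finding": "normal"},
]

def expand(term):
    """Return the term plus its known subclasses (one-level closure here)."""
    return {term} | {c for c, p in SUBCLASS.items() if p == term}

def query(term):
    # Rewriting step: the general term is expanded before matching records.
    wanted = expand(term)
    return [p["id"] for p in PATIENTS if p["finding"] in wanted]

print(query("arterial_finding"))  # matches both plaque and stenosis records
```

The physician asks for the general class; the rewriting layer does the work of knowing which concrete findings count.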

  11. e-Science and biological pathway semantics

    Directory of Open Access Journals (Sweden)

    Luciano Joanne S

    2007-05-01

    Full Text Available Abstract Background The development of e-Science presents a major set of opportunities and challenges for the future progress of biological and life science research. Major new tools are required, and corresponding demands are placed on the high-throughput data generated and used in these processes. Nowhere is the demand greater than in the semantic integration of these data. Semantic Web tools and technologies afford the chance to achieve this semantic integration. Since pathway knowledge is central to much of today's scientific research, it is a good test-bed for semantic integration. Within the context of biological pathways, the BioPAX initiative, part of a broader movement towards the standardization and integration of life science databases, forms a necessary prerequisite for the successful application of e-Science in health care and life science research. This paper examines whether BioPAX, an effort to overcome the barrier of disparate and heterogeneous pathway data sources, addresses the needs of e-Science. Results We demonstrate how BioPAX pathway data can be used to ask and answer some useful biological questions. We find that BioPAX comes close to meeting a broad range of e-Science needs, but certain semantic weaknesses mean that these goals are missed. We make a series of recommendations for re-modeling some aspects of BioPAX to better meet these needs. Conclusion Once these semantic weaknesses are addressed, it will be possible to integrate pathway information in a manner that would be useful in e-Science.

  12. The semantic priming project.

    Science.gov (United States)

    Hutchison, Keith A; Balota, David A; Neely, James H; Cortese, Michael J; Cohen-Shikora, Emily R; Tse, Chi-Shing; Yap, Melvin J; Bengson, Jesse J; Niemeyer, Dale; Buchanan, Erin

    2013-12-01

    Speeded naming and lexical decision data for 1,661 target words following related and unrelated primes were collected from 768 subjects across four different universities. These behavioral measures have been integrated with demographic information for each subject and descriptive characteristics for every item. Subjects also completed portions of the Woodcock-Johnson reading battery, three attentional control tasks, and a circadian rhythm measure. These data are available at a user-friendly Internet-based repository ( http://spp.montana.edu ). This Web site includes a search engine designed to generate lists of prime-target pairs with specific characteristics (e.g., length, frequency, associative strength, latent semantic similarity, priming effect in standardized and raw reaction times). We illustrate the types of questions that can be addressed via the Semantic Priming Project. These data represent the largest behavioral database on semantic priming and are available to researchers to aid in selecting stimuli, testing theories, and reducing potential confounds in their studies.

  13. Academic Research Library as Broker in Addressing Interoperability Challenges for the Geosciences

    Science.gov (United States)

    Smith, P., II

    2015-12-01

    Data capture is an important process in the research lifecycle. Complete descriptive and representative information about the data or database is necessary during data collection, whether in the field or in the research lab. The National Science Foundation's (NSF) Public Access Plan (2015) mandates that federally funded projects make their research data more openly available. Developing, implementing, and integrating metadata workflows into the research process of the data lifecycle facilitates improved data access while also addressing interoperability challenges for the geosciences, such as data description and representation. A lack of metadata or data curation can contribute to (1) semantic, (2) ontology, and (3) data integration issues within and across disciplinary domains and projects. Researchers on some EarthCube-funded projects have identified these issues as gaps. These gaps can contribute to data access, discovery, and integration problems between domain-specific and general data repositories. Academic research libraries have expertise in providing long-term discovery and access through the use of metadata standards and the provision of access to research data, datasets, and publications via institutional repositories. Metadata crosswalks, open archival information systems (OAIS), trusted repositories, the Data Seal of Approval, persistent URLs, and the linking of data, objects, resources, and publications in institutional repositories and digital content management systems are common components in the library discipline. These components contribute to a library perspective on data access and discovery that can benefit the geosciences. The USGS Community for Data Integration (CDI) has developed the Science Support Framework (SSF) for data management and integration within its community of practice, contributing to improved understanding of the Earth's physical and biological systems.
The USGS CDI SSF can be used as a reference model to map to Earth

  14. Words semantic orientation classification based on HowNet

    Institute of Scientific and Technical Information of China (English)

    LI Dun; MA Yong-tao; GUO Jian-li

    2009-01-01

    Based on text orientation classification, a new measurement approach to the semantic orientation of words is proposed. According to the integrated and detailed definitions of words in HowNet, seed sets including words with intense orientations were built up. The orientation similarity between the seed words and a given word is then calculated, using the sentiment weight priority, to recognize the semantic orientation of common words. Finally, a word's semantic orientation and its context are combined to recognize the given word's orientation. The experiments show that the measurement approach achieves better results for the orientation classification of common words and contributes particularly to the orientation classification of texts of large granularity.
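The seed-set idea above can be sketched compactly: a word's orientation is decided by comparing its similarity to positive versus negative seed words. Real systems derive similarity from HowNet sememes; the hand-made feature sets and Jaccard overlap below are stand-ins invented for illustration only.

```python
# Invented semantic-feature sets standing in for HowNet sememe definitions.
FEATURES = {
    "excellent": {"good", "quality"},
    "terrible": {"bad", "quality"},
    "superb": {"good", "quality", "degree"},
    "awful": {"bad", "quality", "degree"},
}
POS_SEEDS, NEG_SEEDS = ["excellent"], ["terrible"]

def sim(a, b):
    # Jaccard overlap of feature sets as a crude similarity measure.
    fa, fb = FEATURES[a], FEATURES[b]
    return len(fa & fb) / len(fa | fb)

def orientation(word):
    # Positive-seed similarity minus negative-seed similarity decides the sign.
    score = max(sim(word, s) for s in POS_SEEDS) - max(sim(word, s) for s in NEG_SEEDS)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(orientation("superb"))
```

The approach generalizes: only the seed sets and the similarity function need to change to move from this toy to a HowNet-backed classifier.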

  15. GEO Standard and Interoperability Forum (SIF) European Team

    Science.gov (United States)

    Nativi, Stefano

    2010-05-01

    The European GEO SIF has been initiated by the GIGAS project in an effort to better coordinate European requirements for GEO and GEOSS related activities, and is recognised by GEO as a regional SIF. To help advance the interoperability goals of the Global Earth Observing System of Systems (GEOSS), the Group on Earth Observations (GEO) Architecture and Data Committee (ADC) has established a Standards and Interoperability Forum (SIF) to support GEO organizations offering components and services to GEOSS. The SIF helps GEOSS contributors understand how to work with the GEOSS interoperability guidelines and how to enter their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) into the GEOSS registries. This will greatly facilitate the utility of GEOSS and encourage a significant increase in participation. To carry out its work most effectively, the SIF promotes the formation of Regional Teams. These teams help to organize and optimize the support coming from different parts of the world and to reach out to regional and multi-disciplinary scientific communities, allowing truly global representation in supporting GEOSS interoperability. A SIF European Team is foreseen. The main role of the SIF is facilitating interoperability and working with members and participating organizations as they offer data and information services to the users of GEOSS. In this framework, the purpose of having a European Regional Team is to increase efficiency in carrying out the work of the SIF. Experts can join the SIF European Team by registering at the SIF European Team wiki site: http://www.thegigasforum.eu/sif/

  16. Flow Logics and Operational Semantics

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nielson, Hanne Riis

    1998-01-01

    Flow logic is a "fast prototyping" approach to program analysis that shows great promise of being able to deal with a wide variety of languages and calculi for computation. However, seemingly innocent choices in the flow logic, as well as in the operational semantics, may inhibit proving the analysis correct. Our main conclusion is that an environment-based semantics is more flexible than either a substitution-based semantics or a semantics making use of structural congruences (like alpha-renaming).

  17. Temporal Representation in Semantic Graphs

    Energy Technology Data Exchange (ETDEWEB)

    Levandoski, J J; Abdulla, G M

    2007-08-07

    A wide range of knowledge discovery and analysis applications, ranging from business to biological, make use of semantic graphs when modeling relationships and concepts. Most of the semantic graphs used in these applications are assumed to be static pieces of information, meaning that the temporal evolution of concepts and relationships is not taken into account. Guided by the need for more advanced semantic graph queries involving temporal concepts, this paper surveys the existing work on temporal representations in semantic graphs.
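One common temporal representation for semantic graphs, of the kind such a survey covers, attaches a validity interval to each edge so the graph can be queried "as of" a time point. The entities and dates below are invented examples.

```python
# Edges carry (subject, predicate, object, valid_from, valid_until).
EDGES = [
    ("alice", "works_at", "acme", 2001, 2005),
    ("alice", "works_at", "globex", 2005, 2010),
    ("acme", "located_in", "springfield", 1990, 2010),
]

def snapshot(t):
    """Return the static semantic graph valid at time t (half-open intervals)."""
    return {(s, p, o) for s, p, o, start, end in EDGES if start <= t < end}

print(snapshot(2003))
```

A temporal query engine generalizes this: instead of materializing snapshots, it pushes the interval test into graph pattern matching, but the data model is the same.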

  18. An Open Source Tool to Test Interoperability

    Science.gov (United States)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering raw observed data to accessing portrayed, processed, quality-controlled data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of these interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and the management of errors. Testing these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. By following standards, interoperability between components increases while the time needed to develop new software decreases. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open source facility, available at SourceForge, that can be run via the command line, deployed in a web servlet container or integrated into a developer's environment via Maven. TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against schemas and Schematron-based assertions of any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. Some of these assertions include conformance of HTTP responses; conformance of GML-encoded data; proper values for elements and attributes in the XML; and correct error responses.
This presentation will provide an overview of TEAM Engine and an introduction to testing via the OGC testing web site and
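The kinds of assertions a compliance test makes against a service response (status code, content type, presence of required XML elements) can be sketched as follows. The canned response below is an invented example, not a real WFS reply, and these checks are far simpler than a CTL/TestNG suite; the intent is only to show the shape of conformance testing.

```python
import xml.etree.ElementTree as ET

# Invented service response standing in for an HTTP reply from a web service.
response = {
    "status": 200,
    "content_type": "text/xml",
    "body": "<FeatureCollection><member><name>river</name></member></FeatureCollection>",
}

def check_conformance(resp):
    """Return a list of assertion failures; an empty list means conformant."""
    failures = []
    if resp["status"] != 200:
        failures.append("expected HTTP 200")
    if "xml" not in resp["content_type"]:
        failures.append("expected an XML content type")
    try:
        root = ET.fromstring(resp["body"])
        if root.find("member") is None:
            failures.append("missing required <member> element")
    except ET.ParseError:
        failures.append("body is not well-formed XML")
    return failures

print(check_conformance(response))
```

A real test engine layers hundreds of such assertions, drives them from declarative test scripts, and reports per-assertion pass/fail, but each individual check reduces to this pattern.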

  19. Evolution of semantic systems

    CERN Document Server

    Küppers, Bernd-Olaf; Artmann, Stefan

    2013-01-01

    Complex systems in nature and society make use of information for the development of their internal organization and the control of their functional mechanisms. Alongside technical aspects of storing, transmitting and processing information, the various semantic aspects of information, such as meaning, sense, reference and function, play a decisive part in the analysis of such systems.With the aim of fostering a better understanding of semantic systems from an evolutionary and multidisciplinary perspective, this volume collects contributions by philosophers and natural scientists, linguists, i

  20. Enterprise semantic Web

    OpenAIRE

    Gutiérrez Alba, David

    2012-01-01

    This document is a journey through Semantic Web principles and Microsoft SharePoint in order to come to understand some of their advantages and disadvantages, and how Semantic Web principles can be blended into an enterprise solution like Microsoft SharePoint.

  1. Causal premise semantics.

    Science.gov (United States)

    Kaufmann, Stefan

    2013-08-01

    The rise of causality and the attendant graph-theoretic modeling tools in the study of counterfactual reasoning has had resounding effects in many areas of cognitive science, but it has thus far not permeated the mainstream in linguistic theory to a comparable degree. In this study I show that a version of the predominant framework for the formal semantic analysis of conditionals, Kratzer-style premise semantics, allows for a straightforward implementation of the crucial ideas and insights of Pearl-style causal networks. I spell out the details of such an implementation, focusing especially on the notions of intervention on a network and backtracking interpretations of counterfactuals.

  2. Semantic Search of Web Services

    Science.gov (United States)

    Hao, Ke

    2013-01-01

    This dissertation addresses the semantic search of Web services using natural language processing. We first survey various existing approaches, focusing on the fact that the high cost of current semantic annotation frameworks results in limited use of semantic search for large-scale applications. We then propose a vector space model based service…
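A vector space model for service search, in its most basic form, represents each service description and the query as term vectors and ranks services by cosine similarity. The service texts below are invented, and real systems add the NLP steps (tokenization, weighting, semantic expansion) omitted here.

```python
import math
from collections import Counter

# Invented service descriptions standing in for a Web service registry.
SERVICES = {
    "weather": "get current weather forecast for a city",
    "geocode": "convert a city name into geographic coordinates",
}

def vec(text):
    # Raw term counts; a real system would use TF-IDF or similar weighting.
    return Counter(text.split())

def cosine(a, b):
    shared = set(a) & set(b)
    num = sum(a[t] * b[t] for t in shared)
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def best_match(query):
    q = vec(query)
    return max(SERVICES, key=lambda name: cosine(q, vec(SERVICES[name])))

print(best_match("weather forecast tomorrow"))
```

The appeal of this approach over annotation-heavy frameworks is exactly what the abstract points at: it needs no hand-built semantic annotations, only the service descriptions themselves.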

  3. Semantic repository and ontology mapping

    NARCIS (Netherlands)

    J. Gracia; M. Trna; E. Lozano; T.T. Nguyen; A. Gómez-Pérez; C. Montaña; J. Liem

    2010-01-01

    This document discusses the core Semantic Technologies in DynaLearn: i) The semantic repository, which supports the online storage and access of qualitative reasoning models, ii) the grounding process, which establishes semantic equivalences between the concepts in the models and the concepts in a b

  4. A Timed Semantics for SDL

    DEFF Research Database (Denmark)

    Mørk, Simon; Godskesen, Jens Christian; Hansen, Michael Reichhardt;

    1996-01-01

    An alternative formal semantics for describing the temporal aspects of the ITU-T specification language SDL is proposed, based on the interval temporal logic Duration Calculus (DC). It is shown how DC can be used to give an SDL semantics with a precise treatment of temporal phenomena. The semantics...

  5. Semantic cognition or data mining?

    NARCIS (Netherlands)

    D. Borsboom; I. Visser

    2008-01-01

    We argue that neural networks for semantic cognition, as proposed by Rogers & McClelland (R&M), do not acquire semantics and therefore cannot be the basis for a theory of semantic cognition. The reason is that the neural networks simply perform statistical categorization procedures, and these do not

  6. A Conceptual Framework to Enhance the Interoperability of Observatories among Countries, Continents and the World

    Science.gov (United States)

    Loescher, H.; Fundamental Instrument Unit

    2013-05-01

    , GEO-BON, NutNet, etc.) and domestically (e.g., NSF-CZO, USDA-LTAR, DOE-NGEE, Soil Carbon Network, etc.), there is a strong and mutual desire to assure the interoperability of data. Interoperability is the degree to which each of the following is mapped between observatories (entities): i) science requirements linked with science questions, ii) traceability of measurements to nationally and internationally accepted standards, iii) how data products are derived, i.e., algorithms, procedures, and methods, and iv) the bioinformatics, which broadly include data formats, metadata, controlled vocabularies, and semantics. Here, we explore the rationale and focus areas for interoperability, the governance and work structures, example projects (NSF-NEON, EU-ICOS, and AU-TERN), and the emergent roles of scientists in these endeavors.

  7. A core observational data model for enhancing the interoperability of ontologically annotated environmental data

    Science.gov (United States)

    Schildhauer, M.; Bermudez, L. E.; Bowers, S.; Dibner, P. C.; Gries, C.; Jones, M. B.; McGuinness, D. L.; Cao, H.; Cox, S. J.; Kelling, S.; Lagoze, C.; Lapp, H.; Madin, J.

    2010-12-01

    Research in the environmental sciences often requires accessing diverse data, collected by numerous data providers over varying spatiotemporal scales, incorporating specialized measurements from a range of instruments. These measurements are typically documented using idiosyncratic, disciplinary specific terms, and stored in management systems ranging from desktop spreadsheets to the Cloud, where the information is often further decomposed or stylized in unpredictable ways. This situation creates major informatics challenges for broadly discovering, interpreting, and merging the data necessary for integrative earth science research. A number of scientific disciplines have recognized these issues, and been developing semantically enhanced data storage frameworks, typically based on ontologies, to enable communities to better circumscribe and clarify the content of data objects within their domain of practice. There is concern, however, that cross-domain compatibility of these semantic solutions could become problematic. We describe here our efforts to address this issue by developing a core, unified Observational Data Model, that should greatly facilitate interoperability among the semantic solutions growing organically within diverse scientific domains. Observational Data Models have emerged independently from several distinct scientific communities, including the biodiversity sciences, ecology, evolution, geospatial sciences, and hydrology, to name a few. Informatics projects striving for data integration within each of these domains had converged on identifying "observations" and "measurements" as fundamental abstractions that provide useful "templates" through which scientific data can be linked— at the structural, composited, or even cell value levels— to domain terms stored in ontologies or other forms of controlled vocabularies. The Scientific Observations Network, SONet (http://sonet.ecoinformatics.org) brings together a number of these observational
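The "observation/measurement" abstraction described above as the convergent core of these data models can be sketched as a small data structure. The field names below are illustrative, not SONet's actual schema: the key idea is that an observation names the entity observed (ideally via an ontology term) and carries measurements whose characteristics and units also point at controlled vocabularies.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Measurement:
    characteristic: str   # the property measured, e.g. "height"
    value: float
    unit: str             # ideally a term from a controlled vocabulary

@dataclass(frozen=True)
class Observation:
    entity: str           # what was observed, e.g. an ontology term for "tree"
    measurements: tuple   # the measurements taken on that entity

# Invented example: one observed tree with two measurements.
obs = Observation(
    entity="envo:tree",
    measurements=(
        Measurement("height", 12.3, "meter"),
        Measurement("dbh", 0.4, "meter"),
    ),
)
print(obs.entity, len(obs.measurements))
```

Because every dataset, whatever its native layout, can be decomposed into such observation records whose terms link to ontologies, the template serves as the cross-domain interoperability layer the abstract argues for.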

  8. Towards a semantic event-based service-oriented architecture

    OpenAIRE

    Pedrinaci, Carlos; Moran, Matthew; NORTON, Barry

    2006-01-01

    Service-Oriented Architecture (SOA) is commonly lauded as a silver bullet for Enterprise Application Integration, inter-organizational business processes implementation, and even as a general solution for the development of all complex Web-oriented applications. However, SOA without semantic descriptions of its data, processes and messaging models fails to achieve a truly flexible and dynamic infrastructure. In this paper we explain where semantics are necessary for SOA and present early work ...

  9. Semantator: semantic annotator for converting biomedical text to linked data.

    Science.gov (United States)

    Tao, Cui; Song, Dezhao; Sharma, Deepak; Chute, Christopher G

    2013-10-01

    More than 80% of biomedical data is embedded in plain text. The unstructured nature of these text-based documents makes it challenging to easily browse and query the data of interest in them. One approach to facilitate browsing and querying biomedical text is to convert the plain text to a linked web of data, i.e., converting data originally in free text to structured formats with defined meta-level semantics. In this paper, we introduce Semantator (Semantic Annotator), a semantic-web-based environment for annotating data of interest in biomedical documents, browsing and querying the annotated data, and interactively refining annotation results if needed. Through Semantator, information of interest can be annotated either manually or semi-automatically using plug-in information extraction tools. The annotated results are stored in RDF and can be queried using the SPARQL query language. In addition, semantic reasoners can be directly applied to the annotated data for consistency checking and knowledge inference. Semantator has been released online and was used by the biomedical ontology community, which provided positive feedback. Our evaluation results indicated that (1) Semantator can perform the annotation functionalities as designed; (2) Semantator can be adopted in real applications in clinical and translational research; and (3) the annotated results using Semantator can be easily used in Semantic-web-based reasoning tools for further inference.
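The pattern the paper describes (annotations stored as RDF triples, with reasoners applied for consistency checking) can be illustrated with a tiny disjointness check in the spirit of owl:disjointWith. The class names and document fragments below are invented, and a real system would use an OWL reasoner rather than this hand-rolled loop.

```python
# Invented disjointness axiom: no individual may be both a Drug and a Disease.
DISJOINT = {("Drug", "Disease")}

# Invented annotations: document fragments typed against an ontology.
annotations = [
    ("frag1", "rdf:type", "Drug"),
    ("frag1", "rdf:type", "Disease"),   # deliberately inconsistent
    ("frag2", "rdf:type", "Disease"),
]

def inconsistent_fragments(triples):
    """Flag subjects typed with two classes declared disjoint."""
    types = {}
    for s, p, o in triples:
        if p == "rdf:type":
            types.setdefault(s, set()).add(o)
    bad = set()
    for s, ts in types.items():
        for a, b in DISJOINT:
            if a in ts and b in ts:
                bad.add(s)
    return bad

print(inconsistent_fragments(annotations))
```

This is the payoff of representing annotations in Semantic Web standards: consistency checking falls out of the ontology's axioms rather than needing bespoke validation code.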

  10. Towards On-line Automated Semantic Scoring of English-Chinese Translation

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This paper describes and exemplifies a semantic scoring system for students' on-line English-Chinese translations. To achieve accurate assessment, the system adopts a comprehensive method that combines semantic scoring with keyword-matching scoring. Four kinds of words are identified after parsing: verbs, adjectives, adverbs and "the rest", including nouns, pronouns, idioms, prepositions, etc. The system treats words tagged with different parts of speech differently. It then calculates the semantic similarity between the words of the standard versions and those of the students' translations from the distinctive differences of the words' semantic features, with the aid of HowNet. The first semantic feature of verbs and the last semantic features of adjectives and adverbs are calculated. "The rest" is scored by keyword matching. The experimental results show that the semantic scoring system is applicable to the task of scoring students' on-line English-Chinese translations.
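The combined scoring idea above can be sketched in miniature: content words known to the lexicon are compared by semantic-feature overlap, while everything else falls back to exact keyword matching. The feature sets stand in for HowNet data and are invented for illustration; real scoring also needs alignment, POS tagging and per-POS feature selection, all omitted here.

```python
# Invented semantic-feature sets standing in for HowNet definitions.
FEATURES = {
    "buy": {"act", "obtain"},
    "purchase": {"act", "obtain"},
    "sell": {"act", "give"},
}

def word_score(ref, cand):
    if ref in FEATURES and cand in FEATURES:
        fa, fb = FEATURES[ref], FEATURES[cand]
        return len(fa & fb) / len(fa | fb)      # semantic similarity branch
    return 1.0 if ref == cand else 0.0          # keyword-matching branch

def sentence_score(reference, candidate):
    # Naive positional alignment, purely for illustration.
    pairs = list(zip(reference.split(), candidate.split()))
    return sum(word_score(r, c) for r, c in pairs) / len(pairs)

print(sentence_score("I buy books", "I purchase books"))
```

The point of the hybrid is visible even at this scale: "purchase" earns full credit against "buy" through shared semantic features, where pure keyword matching would score it zero.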

  11. Semantator: annotating clinical narratives with semantic web ontologies.

    Science.gov (United States)

    Song, Dezhao; Chute, Christopher G; Tao, Cui

    2012-01-01

    To facilitate clinical research, clinical data needs to be stored in a machine-processable and understandable way. Manually annotating clinical data is time-consuming. Automatic approaches (e.g., Natural Language Processing systems) have been adopted to convert such data into structured formats; however, the quality of such automatically extracted data may not always be satisfactory. In this paper, we propose Semantator, a semi-automatic tool for document annotation with Semantic Web ontologies. With a loaded free-text document and an ontology, Semantator supports the creation/deletion of ontology instances for any document fragment and the linking/disconnecting of instances with the properties in the ontology, and also enables automatic annotation by connecting to the NCBO Annotator and cTAKES. By representing annotations in Semantic Web standards, Semantator supports reasoning based upon the underlying semantics of the owl:disjointWith and owl:equivalentClass predicates. We present discussions based on user experiences of using Semantator.

  12. Semantic Web meets Integrative Biology: a survey.

    Science.gov (United States)

    Chen, Huajun; Yu, Tong; Chen, Jake Y

    2013-01-01

    Integrative Biology (IB) uses experimental or computational quantitative technologies to characterize biological systems at the molecular, cellular, tissue and population levels. IB typically involves the integration of data, knowledge and capabilities across disciplinary boundaries in order to solve complex problems. We identify a series of bioinformatics problems posed by interdisciplinary integration: (i) data integration that interconnects structured data across related biomedical domains; (ii) ontology integration that brings jargon, terminologies and taxonomies from various disciplines into a unified network of ontologies; (iii) knowledge integration that integrates disparate knowledge elements from multiple sources; (iv) service integration that builds applications out of services provided by different vendors. We argue that IB can benefit significantly from the integration solutions enabled by Semantic Web (SW) technologies. The SW enables scientists to share content beyond the boundaries of applications and websites, resulting in a web of data that is meaningful and understandable to any computer. In this review, we provide insight into how SW technologies can be used to build open, standardized and interoperable solutions for interdisciplinary integration on a global basis. We present a rich set of case studies in systems biology, integrative neuroscience, bio-pharmaceutics and translational medicine to highlight the technical features and benefits of SW applications in IB.

  13. Development of an Electronic Claim System Based on an Integrated Electronic Health Record Platform to Guarantee Interoperability

    OpenAIRE

    Kim, Hwa Sun; Cho, Hune; Lee, In Keun

    2011-01-01

    Objectives We design and develop an electronic claim system based on an integrated electronic health record (EHR) platform. This system is designed to be used for ambulatory care by office-based physicians in the United States. This is achieved by integrating various medical standard technologies for interoperability between heterogeneous information systems. Methods The developed system serves as a simple clinical data repository; it automatically fills out the Centers for Medicare and Medic...

  14. Operational Semantics of Termination Types

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nielson, Hanne Riis

    1996-01-01

    and algebraic data types. The well-founded orderings are defined by pattern matching against the definition of the algebraic data types. We prove that the analysis is semantically sound with respect to a big-step (or natural) operational semantics. We compare our approach based on operational semantics to one based on denotational semantics, and we identify the need for extending the semantic universe with low constructs whose sole purpose is to facilitate the proof. For dealing with partial correctness it suffices to consider approximations that are less defined than the desired fixed points; for dealing...

  15. Learning semantic query suggestions

    NARCIS (Netherlands)

    E. Meij; M. Bron; L. Hollink; B. Huurnink; M. de Rijke

    2009-01-01

    An important application of semantic web technology is recognizing human-defined concepts in text. Query transformation is a strategy often used in search engines to derive queries that are able to return more useful search results than the original query and most popular search engines provide faci

  16. Assertiveness through Semantics.

    Science.gov (United States)

    Zuercher, Nancy T.

    1983-01-01

    Suggests that connotations of assertiveness do not convey all of its meanings, particularly the components of positive feelings, communication, and cooperation. The application of semantics can help restore the balance. Presents a model for differentiating assertive behavior and clarifying the definition. (JAC)

  17. Verb Semantics and Lexical Selection

    CERN Document Server

    Wu, Z; Wu, Zhibiao; Palmer, Martha

    1994-01-01

    This paper will focus on the semantic representation of verbs in computer systems and its impact on lexical selection problems in machine translation (MT). Two groups of English and Chinese verbs are examined to show that lexical selection must be based on interpretation of the sentence as well as selection restrictions placed on the verb arguments. A novel representation scheme is suggested, and is compared to representations with selection restrictions used in transfer-based MT. We see our approach as closely aligned with knowledge-based MT approaches (KBMT), and as a separate component that could be incorporated into existing systems. Examples and experimental results will show that, using this scheme, inexact matches can achieve correct lexical selection.

  18. Efficient Proposed Framework for Semantic Search Engine using New Semantic Ranking Algorithm

    Directory of Open Access Journals (Sweden)

    M. M. El-gayar

    2015-08-01

    Full Text Available The amount of information grows by billions of database records every year, and there is an urgent need to search that information with a specialized tool: the search engine. Many search engines are available today, but their main challenge is that most cannot retrieve meaningful information intelligently. Semantic web technology is a solution that keeps data in a machine-readable format, helping machines match this data with related information based on meaning. In this paper, we introduce a proposed semantic framework comprising four phases: crawling, indexing, ranking and retrieval. This semantic framework operates over sorted RDF using an efficient proposed ranking algorithm and an enhanced crawling algorithm. The enhanced crawling algorithm crawls relevant forum content from the web with minimal overhead. The proposed ranking algorithm orders and evaluates semantically similar data so that retrieval becomes faster, easier and more accurate. We applied our work to a standard database and achieved 99 percent effectiveness on semantic performance in minimal time, with an error rate below 1 percent compared with other semantic systems.
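
    As an illustration of the retrieval idea only, not the paper's actual ranking algorithm, the following sketch scores candidate resources by Jaccard term overlap between the query and each resource's textual description; the documents and URIs are invented.

```python
# Minimal sketch (assumed, not the paper's algorithm) of a semantic-flavoured
# ranking step: candidates are scored by Jaccard overlap between query terms
# and the resource's description, then sorted for retrieval.

def rank(query, resources):
    """Return resources ordered by descending term-overlap score."""
    q_terms = set(query.lower().split())
    def score(res):
        r_terms = set(res["description"].lower().split())
        return len(q_terms & r_terms) / len(q_terms | r_terms)  # Jaccard index
    return sorted(resources, key=score, reverse=True)

docs = [
    {"uri": "ex:doc1", "description": "semantic web search engine"},
    {"uri": "ex:doc2", "description": "cooking recipes for pasta"},
]
ranked = rank("semantic search", docs)
print([d["uri"] for d in ranked])  # ex:doc1 ranks first
```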

  19. Bootstrapping Object Coreferencing on the Semantic Web

    Institute of Scientific and Technical Information of China (English)

    Wei Hu; Yu-Zhong Qu; Xing-Zhi Sun

    2011-01-01

    An object on the Semantic Web is likely to be denoted with several URIs by different parties. Object coreferencing is a process to identify "equivalent" URIs of objects for achieving a better Data Web. In this paper, we propose a bootstrapping approach for object coreferencing on the Semantic Web. For an object URI, we firstly establish a kernel that consists of semantically equivalent URIs from the same-as, (inverse) functional properties and (max-)cardinalities, and then extend the kernel with respect to the textual descriptions (e.g., labels and local names) of URIs. We also propose a trustworthiness-based method to rank the coreferent URIs in the kernel as well as a similarity-based method for ranking the URIs in the extension of the kernel. We implement the proposed approach, called ObjectCoref, on a large-scale dataset that contains 76 million URIs collected by the Falcons search engine until 2008. The evaluation on precision, relative recall and response time demonstrates the feasibility of our approach. Additionally, we apply the proposed approach to investigate the popularity of the URI alias phenomenon on the current Semantic Web.
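
    One ingredient of the kernel-building step, grouping URIs connected by same-as assertions, can be sketched with union-find. This is an assumed simplification, not the ObjectCoref implementation, and the URIs are invented.

```python
# Sketch of grouping URIs asserted equivalent (e.g. via owl:sameAs) with
# union-find, so each object's kernel of coreferent URIs can be read off.
# This is an illustrative simplification, not ObjectCoref itself.

def coreference_kernels(same_as_pairs):
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for a, b in same_as_pairs:
        parent[find(a)] = find(b)        # union the two groups
    kernels = {}
    for uri in list(parent):
        kernels.setdefault(find(uri), set()).add(uri)
    return list(kernels.values())

pairs = [("dbpedia:Paris", "fr.dbpedia:Paris"),
         ("fr.dbpedia:Paris", "geonames:2988507"),
         ("dbpedia:Berlin", "wikidata:Q64")]
print(coreference_kernels(pairs))
```

    The transitive closure of pairwise same-as links yields one kernel per real-world object, which a trustworthiness-based ranking could then order.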

  20. Restructuring an EHR system and the Medical Markup Language (MML) standard to improve interoperability by archetype technology.

    Science.gov (United States)

    Kobayashi, Shinji; Kume, Naoto; Yoshihara, Hiroyuki

    2015-01-01

    In 2001, we developed an EHR system for regional healthcare information inter-exchange and to provide individual patient data to patients. This system was adopted in three regions in Japan. We also developed the Medical Markup Language (MML) standard for inter- and intra-hospital communications. The system was built on a legacy platform, however, and had not been appropriately maintained or updated to meet clinical requirements. To reduce future maintenance costs, we reconstructed the EHR system using archetype technology on the Ruby on Rails platform, and generated MML-equivalent forms from archetypes. The system was deployed as a cloud-based system for preliminary use as a regional EHR. The system now has the capability to keep up with new requirements, maintaining semantic interoperability with archetype technology. It is also more flexible than the legacy EHR system. PMID:26262183

  1. Metaworkflows and Workflow Interoperability for Heliophysics

    Science.gov (United States)

    Pierantoni, Gabriele; Carley, Eoin P.

    2014-06-01

    Heliophysics is a relatively new branch of physics that investigates the relationship between the Sun and the other bodies of the solar system. To investigate such relationships, heliophysicists can rely on various tools developed by the community. Some of these tools are on-line catalogues that list events (such as Coronal Mass Ejections, CMEs) and their characteristics as they were observed on the surface of the Sun or on the other bodies of the Solar System. Other tools offer on-line data analysis and access to images and data catalogues. During their research, heliophysicists often perform investigations that need to coordinate several of these services and to repeat these complex operations until the phenomena under investigation are fully analyzed. Heliophysicists combine the results of these services; this service orchestration is best suited to workflows. This approach has been investigated in the HELIO project. The HELIO project developed an infrastructure for a Virtual Observatory for Heliophysics and implemented service orchestration using TAVERNA workflows. HELIO developed a set of workflows that proved to be useful but lacked flexibility and re-usability. The TAVERNA workflows also needed to be executed directly in the TAVERNA workbench, which forced all users to learn how to use the workbench. Within the SCI-BUS and ER-FLOW projects, we have started an effort to re-think and re-design the heliophysics workflows with the aim of fostering re-usability and ease of use. We base our approach on two key concepts, that of meta-workflows and that of workflow interoperability. We have divided the produced workflows into three different layers. The first layer is Basic Workflows, developed both in the TAVERNA and WS-PGRADE languages. They are building blocks that users compose to address their scientific challenges. They implement well-defined Use Cases that usually involve only one service. The second layer is Science Workflows, usually developed in TAVERNA. They

  2. Moving Controlled Vocabularies into the Semantic Web

    Science.gov (United States)

    Thomas, R.; Lowry, R. K.; Kokkinaki, A.

    2015-12-01

    Having placed Linked Data tooling over a single SPARQL end point [1], the obvious future development for this system is to support semantic interoperability outside NVS by incorporating federated SPARQL end points in the USA and Australia during the ODIP II project. [2] [1] https://vocab.nerc.ac.uk/sparql [2] https://www.bodc.ac.uk/data/codes_and_formats/vocabulary_search/

  3. Privacy for Semantic Web Mining using Advanced DSA – Spatial LBS Case Study

    Directory of Open Access Journals (Sweden)

    Dr. D. Sravan Kumar

    2010-05-01

    Full Text Available The Web Services paradigm promises to enable rich, flexible and dynamic interoperation of highly distributed, heterogeneous network-enabled services. The idea of Web Services Mining is that it makes use of findings in the field of data mining and applies them to the world of Web Services. The emerging concept of Semantic Web Services aims at more sophisticated Web Services technologies: on the basis of Semantic Description Frameworks, intelligent mechanisms are envisioned for discovery, composition, and contracting of Web Services. The aim of the semantic web is not only to support access to information on the web but also to support its usage. The Geospatial Semantic Web is an augmentation to the Semantic Web that adds geospatial abstractions, as well as related reasoning, representation and query mechanisms. Web Service security represents a key requirement for today's distributed, interconnected digital world and for the new generations, Web 2.0 and the Semantic Web. To date, the problem of security has been investigated largely in the context of standardization efforts; personal judgments are usually made based on the sensitivity of the information and the reputation of the party to which the information is to be disclosed. On the privacy front, this means that privacy invasion would net more high-quality and sensitive personal information. In this paper, we implement a case study on the integrated privacy issues of Spatial Semantic Web Services Mining. We first improve the privacy of the Geospatial Semantic Layer. Finally, we implement a Location Based System and improve its digital signature capability using advanced Digital Signature Standards.

  4. OGC and Grid Interoperability in enviroGRIDS Project

    Science.gov (United States)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, as does the Grid-oriented technology that is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures by providing the basic and extended features of both technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues they introduce (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of geospatial heterogeneous distributed data within a distributed environment, the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web services interoperability with the Grid environment and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as: the relations between the computational grid and

  5. Interoperable and standard e-Health solution over Bluetooth.

    Science.gov (United States)

    Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J

    2010-01-01

    The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards seems to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution. This includes the ISO/IEEE 11073 standard for interoperability of medical devices in the patient environment and the EN 13606 standard for interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for further transfer to the healthcare system.

  6. RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service

    Science.gov (United States)

    Yang, Chao; Chen, Nengcheng; Di, Liping

    2012-10-01

    Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate the distributed sensor Web services to meet the requirement of a complex Earth observation scenario. A RESTFul based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTFul based workflows interoperation system can describe, publish, discover, access and coordinate heterogeneous Geoprocessing workflows.
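
    The Atom-based description of a workflow resource mentioned above might look like the following sketch; the element values and workflow identifier are invented for illustration, not taken from the paper's system.

```python
# Hypothetical sketch: a workflow resource described as an Atom entry, since
# the abstract says Atom protocols describe and manage workflow resources.
# The id, title and link values are invented.
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"
ET.register_namespace("", ATOM)

def workflow_entry(wf_id, title, href):
    """Serialize one workflow resource as an Atom entry with an edit link."""
    entry = ET.Element(f"{{{ATOM}}}entry")
    ET.SubElement(entry, f"{{{ATOM}}}id").text = wf_id
    ET.SubElement(entry, f"{{{ATOM}}}title").text = title
    ET.SubElement(entry, f"{{{ATOM}}}link", rel="edit", href=href)
    return ET.tostring(entry, encoding="unicode")

xml = workflow_entry("urn:example:wf:no2-monitor",
                     "NO2 volcanic plume workflow",
                     "http://example.org/workflows/no2")
print(xml)
```

    Publishing each XPDL or BPEL workflow behind such an entry lets heterogeneous engines discover and manage workflows through one uniform RESTful feed.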

  7. Data Access, Discovery and Interoperability in the European Context

    Science.gov (United States)

    Genova, Francoise

    2015-12-01

    European Virtual Observatory (VO) activities have been coordinated by a series of projects funded by the European Commission. Three pillars were identified: support to data providers for implementation of their data in the VO framework; support to the astronomical community in their usage of VO-enabled data and tools; and technological work for updating the VO framework of interoperability standards and tools. A new phase is beginning with the ASTERICS cluster project. The ASTERICS Work Package "Data Access, Discovery and Interoperability" aims at making the data from the ESFRI projects and their pathfinders available for discovery and usage, interoperable in the VO framework, and accessible with VO-enabled common tools. VO teams and representatives of ESFRI and pathfinder projects and of EGO/VIRGO are engaged together in the Work Package. ESO is associated to the project, which is also working closely with ESA. The three pillars identified for coordinating European VO activities are all tackled.

  8. Operational Interoperability Challenges on the Example of GEOSS and WIS

    Science.gov (United States)

    Heene, M.; Buesselberg, T.; Schroeder, D.; Brotzer, A.; Nativi, S.

    2015-12-01

    The following poster highlights the operational interoperability challenges using the example of the Global Earth Observation System of Systems (GEOSS) and the World Meteorological Organization Information System (WIS). At the heart of both systems is a catalogue of earth observation data, products and services, but with different metadata management concepts. While WIS applies strong governance with its own metadata profile to its hundreds of thousands of metadata records, GEOSS adopted a more open approach for its ten million records. Furthermore, the development of WIS, as an operational system, follows a roadmap with committed downwards compatibility, while the GEOSS development process is more agile. The poster discusses how interoperability can be reached across these different metadata management concepts and how a proxy concept helps to couple two systems that follow different development methodologies. Furthermore, the poster highlights the importance of monitoring and backup concepts as a verification method for operational interoperability.

  9. A framework for interoperability of BPEL-based workflows

    Institute of Scientific and Technical Information of China (English)

    Li Xitong; Fan Yushun; Huang Shuangxi

    2008-01-01

    With the prevalence of service-oriented architecture (SOA), web services have become the dominant technology for constructing workflow systems. As a workflow is the composition of a series of interrelated web services which realize its activities, the interoperability of workflows can be treated as the composition of web services. To address this, a framework for interoperability of business process execution language (BPEL)-based workflows is presented, comprising three phases: transformation, conformance testing and execution. The core components of the framework are proposed, with emphasis on how these components promote interoperability. In particular, dynamic binding and re-composition of workflows in terms of web service testing are presented. An example of business-to-business (B2B) collaboration is also provided to illustrate how to perform composition and conformance testing.

  10. Integrating Ontology into Semantic File Systems

    OpenAIRE

    Ngo, Ba-Hung; Bac, Christian; SILBER-CHAUSSUMIER, Frédérique

    2007-01-01

    Semantic file systems enhance standard file systems with the ability of file searching based on file semantics. In this paper, we propose to integrate the support for ontologies into a file system to build efficient semantic file systems whose file semantics can be shared between users, applications and semantic file systems themselves. We call it ontology-based file system. We identify three existing types of file semantics: property-based, content-based and context-based semantics and adopt...

  11. Context-Aware Adaptive Hybrid Semantic Relatedness in Biomedical Science

    Science.gov (United States)

    Emadzadeh, Ehsan

    Text mining of biomedical literature and clinical notes is a very active field of research in biomedical science. Semantic analysis is one of the core modules of different Natural Language Processing (NLP) solutions. Methods for calculating the semantic relatedness of two concepts can be very useful in solving problems such as relationship extraction, ontology creation and question answering [1--6]. Several techniques exist for calculating the semantic relatedness of two concepts, utilizing different knowledge sources and corpora. So far, researchers have attempted to find the best hybrid method for each domain by combining semantic relatedness techniques and data sources manually. In this work, attempts were made to eliminate the need for manually combining semantic relatedness methods for new contexts or resources by proposing an automated method that finds the combination of semantic relatedness techniques and resources achieving the best semantic relatedness score in every context. This may help the research community find the best hybrid method for each context given the available algorithms and resources.
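
    The automated-combination idea can be sketched as a small grid search over convex weightings of several relatedness measures against a gold standard. This is an assumption about the general approach, not the dissertation's method, and all scores below are invented.

```python
# Sketch (assumed, not the dissertation's algorithm): pick the convex
# combination of relatedness measures that best fits a small gold standard.
import itertools

def best_combination(measures, gold, steps=5):
    """Grid-search convex weights over the measures; return (weights, error).
    measures: list of dicts mapping concept-pair -> score in [0, 1]
    gold:     dict mapping concept-pair -> reference score in [0, 1]
    """
    grid = [i / steps for i in range(steps + 1)]
    best = (None, float("inf"))
    for w in itertools.product(grid, repeat=len(measures)):
        if abs(sum(w) - 1.0) > 1e-9:
            continue  # keep only convex combinations
        err = sum(abs(sum(wi * m[p] for wi, m in zip(w, measures)) - g)
                  for p, g in gold.items())
        if err < best[1]:
            best = (w, err)
    return best

path_based = {("cell", "tissue"): 0.9, ("cell", "car"): 0.4}
corpus_based = {("cell", "tissue"): 0.6, ("cell", "car"): 0.1}
gold = {("cell", "tissue"): 0.9, ("cell", "car"): 0.4}
weights, err = best_combination([path_based, corpus_based], gold)
print(weights, err)
```

    In this toy setup the path-based measure matches the gold standard exactly, so the search assigns it all the weight; a real system would fit against a benchmark such as a clinician-rated concept-pair set.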

  12. An Interoperable Framework to Access In-Situ OPeNDAP Data

    Science.gov (United States)

    Li, W.; Yang, C.; Li, Z.; Li, J.; Zhu, H.; Xie, J.

    2008-12-01

    A huge amount of in-situ ocean observation and hydrology-related data are made available to scientists through a uniform access interface, the OPeNDAP interface. However, there are few interoperable clients that support the interface, and existing clients only provide data access to a specific OPeNDAP server rather than employing flexible data access mechanisms. Moreover, current data visualization is limited to 2-D, which is not very intuitive for end users. To overcome these shortcomings, we developed a linkage and a client to provide a compatible and interactive data access and visualization interface for both gridded and sequence data from multiple remote OPeNDAP servers providing NetCDF, HDF5 and other data formats. In this system, 1) semantic techniques are employed to fully understand the data structures, attributes and knowledge of data from different OPeNDAP servers, and a semantic mapping table defining the usage conventions helps parse the given metadata description files; 2) after selecting the variable, time interval and spatial extent, the request constructor is started to organize the constraint expression for subsetting the datasets; 3) a multi-threading-enabled downloading mechanism downloads the subset datasets in the intermediate format, DODS, simultaneously. Once all the datasets are downloaded, an applet-based Java plug-in supports 3-D visualization by rendering the data with an extended version of NASA's World Wind. If the data form a time sequence, an animation is automatically generated and displayed within World Wind. Meanwhile, a KML file is generated automatically for users to visualize the data in Google Earth.
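
    The constraint expression assembled by a request constructor like the one described above can be sketched as follows. The OPeNDAP bracket syntax var[start:stride:stop] is the standard subsetting form, while the server URL, variable name and index ranges here are invented.

```python
# Sketch of building an OPeNDAP constraint-expression URL for subsetting a
# gridded variable. Server URL, variable and index ranges are invented.
from urllib.parse import quote

def constraint_url(base, variable, *ranges):
    """Build '<base>.dods?var[start:stride:stop]...' for a gridded variable."""
    spec = variable + "".join(f"[{a}:{s}:{b}]" for a, s, b in ranges)
    return f"{base}.dods?{quote(spec, safe='[]:')}"

url = constraint_url("http://example.org/opendap/sst", "sst",
                     (0, 1, 10),     # time indices
                     (20, 1, 40),    # latitude indices
                     (100, 1, 140))  # longitude indices
print(url)
```

    A client only transfers the hyperslab it asked for, which is what makes multi-threaded subset downloads from several servers practical.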

  13. Interoperable eHealth Platform for Personalized Smart Services

    DEFF Research Database (Denmark)

    Mihaylov, Mihail Rumenov; Mihovska, Albena Dimitrova; Kyriazakos, Sofoklis;

    2015-01-01

    personalized context-aware applications to serve the user's needs. This paper proposes the use of advised sensing, context-aware and cloud-based lifestyle reasoning to design an innovative eHealth platform that supports highly personalized smart services to primary users. The architecture of the platform has...... been designed in accordance with the interoperability requirements and standards as proposed by ITU-T and the Continua Alliance. In particular, we define the interface dependencies and functional requirements needed to allow eCare and eHealth vendors to manufacture interoperable sensors, ambient and home...

  14. Interoperable Archetypes With a Three Folded Terminology Governance.

    Science.gov (United States)

    Pederson, Rune; Ellingsen, Gunnar

    2015-01-01

    The use of openEHR archetypes increases the interoperability of clinical terminology and, in doing so, improves the availability of clinical terminology for both primary and secondary purposes. Where clinical terminology is employed in the EPR system, research reports conflicting results on the use of structuring and standardization as measurements of success. In order to elucidate this concept, this paper focuses on the effort to establish a national repository for openEHR-based archetypes in Norway, where clinical terminology could be included with a three-fold benefit for interoperability. PMID:26262236

  15. Establishing Interoperability of a Blog Archive through Linked Open Data

    DEFF Research Database (Denmark)

    Kalb, Hendrik; Lazaridou, Paraskevi; Trier, Matthias

    2013-01-01

    The digital cultural heritage is partly preserved through web archiving activities. The BlogForever platform is a web archiving platform that aims specifically at the preservation of the blogosphere. This focus enables exploitation of the blog structure for sophisticated access capabilities...... on archived data. However, interoperability among BlogForever archives, as well as with other digital libraries, is necessary in order to avoid silos of data. In this paper, we reveal some of our efforts to establish interoperability through the application of Linked Open Data....

  16. 76 FR 4102 - Smart Grid Interoperability Standards; Supplemental Notice of Technical Conference

    Science.gov (United States)

    2011-01-24

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Smart Grid Interoperability Standards; Supplemental Notice of Technical... Technical Conference on Smart Grid Interoperability Standards will be held on Monday, January 31,...

  17. 75 FR 63462 - Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid...

    Science.gov (United States)

    2010-10-15

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid Interoperability Standards October 7, 2010. 1. The Energy Independence and Security Act...

  18. 75 FR 21011 - National Protection and Programs Directorate; Statewide Communication Interoperability Plan...

    Science.gov (United States)

    2010-04-22

    ... SECURITY National Protection and Programs Directorate; Statewide Communication Interoperability Plan... concerning New Information Collection Request, Statewide Communication Interoperability Plan Implementation... January 5, 2010, at 75 FR 417, for a 60-day public comment period. DHS received no comments. The...

  19. 75 FR 417 - National Protection and Programs Directorate; Statewide Communication Interoperability Plan...

    Science.gov (United States)

    2010-01-05

    ... SECURITY National Protection and Programs Directorate; Statewide Communication Interoperability Plan...: Statewide Communication Interoperability Plan Implementation Report. Form: Not Applicable. OMB Number: 1670... Emergency Communications Grant Program (IECGP) (6 U.S.C. 579) comply with the Statewide...

  20. Report on the IFIP WG5.8 International Workshop on Enterprise Interoperability (IEWI 2008)

    NARCIS (Netherlands)

    Sinderen, van M.J.; Johnson, P.; Kutvonen, L.

    2008-01-01

    Enterprise interoperability is a growing research topic, rooted in various sub-disciplines from computer science and business management. Enterprise interoperability addresses intra- and inter-organizational collaboration and is characterized by the objective of aligning business level and technolog

  1. Semantic Web integration of Cheminformatics resources with the SADI framework

    Directory of Open Access Journals (Sweden)

    Chepelev Leonid L

    2011-05-01

    Full Text Available Abstract Background The diversity and the largely independent nature of chemical research efforts over the past half century are, most likely, the major contributors to the current poor state of chemical computational resource and database interoperability. While open software for chemical format interconversion and database entry cross-linking has partially addressed database interoperability, computational resource integration is hindered by the great diversity of software interfaces, languages, access methods, and platforms, among others. This has, in turn, translated into limited reproducibility of computational experiments and the need for application-specific computational workflow construction and semi-automated enactment by human experts, especially where emerging interdisciplinary fields, such as systems chemistry, are pursued. Fortunately, the advent of the Semantic Web, and the very recent introduction of RESTful Semantic Web Services (SWS), may present an opportunity to integrate all of the existing computational and database resources in chemistry into a machine-understandable, unified system that draws on the entirety of the Semantic Web. Results We have created a prototype framework of Semantic Automated Discovery and Integration (SADI) SWS that exposes the QSAR descriptor functionality of the Chemistry Development Kit. Since each of these services has formal ontology-defined input and output classes, and each service consumes and produces RDF graphs, clients can automatically reason about the services and the available reference information necessary to complete a given overall computational task specified through a simple SPARQL query. We demonstrate this capability by carrying out QSAR analysis backed by a simple formal ontology to determine whether a given molecule is drug-like. Further, we discuss parameter-based control over the execution of SADI SWS. Finally, we demonstrate the value of computational resource
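
    The discovery step the abstract describes, chaining services by their ontology-defined input and output classes, can be sketched as simple forward chaining. The class names and services below are invented for illustration, not SADI's actual registry.

```python
# Hypothetical sketch: services advertise ontology-defined input/output
# classes; a client chains services whose output class feeds the next one's
# input until the requested class is produced. All names are invented.

services = [
    {"name": "parseSMILES",    "input": "ex:SMILESString",  "output": "ex:Molecule"},
    {"name": "qsarDescriptor", "input": "ex:Molecule",      "output": "ex:DescriptorSet"},
    {"name": "drugLikeness",   "input": "ex:DescriptorSet", "output": "ex:DrugLikenessScore"},
]

def plan(start_class, goal_class):
    """Greedy forward chaining from start_class to goal_class."""
    chain, current = [], start_class
    while current != goal_class:
        nxt = next((s for s in services if s["input"] == current), None)
        if nxt is None:
            return None  # no service consumes the current class
        chain.append(nxt["name"])
        current = nxt["output"]
    return chain

print(plan("ex:SMILESString", "ex:DrugLikenessScore"))
```

    Because each service declares its classes formally, the plan can be derived automatically from a query for the goal class rather than hand-built by an expert.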

  2. Development of high performance scientific components for interoperability of computing packages

    Energy Technology Data Exchange (ETDEWEB)

    Gulabani, Teena Pratap [Iowa State Univ., Ames, IA (United States)

    2008-01-01

    Three major high-performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software designs of each package. A chemistry algorithm is hard and time-consuming to develop; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinventing the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component-Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  3. Dynamic Slicing: a Generic Analysis Based on a Natural Semantics Format

    OpenAIRE

    Gouranton, Valérie; Le Métayer, Daniel

    1998-01-01

    Slicing analyses have been proposed for different programming languages. Rather than defining a new analysis from scratch for each programming language, we would like to specify such an analysis once for all, in a language-independent way, and then specialise it for different programming languages. In order to achieve this goal, we propose a notion of natural semantics format and a dynamic slicing analysis format. The natural semantics format formalises a class of natural semantics and the an...

  4. Semantics - Supportive Element for the Cooperative Evaluation of Geographical and Historical Information

    OpenAIRE

    Karmacharya, Ashish; Kohr, Tobias; Cruz, Christophe; Bruhn, Kai-Christian; Boochs, Frank

    2013-01-01

    The emergence of the Semantic Web and its underlying knowledge technologies has brought changes in data handling. Transferring expert knowledge to machines through knowledge formalization provides the support required for managing huge datasets like the information in the World Wide Web. In the field of geospatial technology, semantic technologies not only enable a higher degree of data integration but also infer semantics to discover new a...

  5. SEMANTIC TECHNIQUES FOR IOT DATA AND SERVICE MANAGEMENT: ONTOSMART SYSTEM

    Directory of Open Access Journals (Sweden)

    L. Nachabe

    2016-08-01

    Full Text Available By 2020 more than 50 billion devices will be connected over the Internet. Every device will be connected to anything, anyone, anytime and anywhere in the world of the Internet of Things, or IoT. This network will generate tremendous amounts of unstructured or semi-structured data that should be shared between different devices/machines for advanced and automated service delivery benefiting the user's daily life. Thus, mechanisms for data interoperability and automatic service discovery and delivery should be offered. Although many approaches have been suggested in the state of the art, none of them provides a fully interoperable, light, flexible and modular Sensing/Actuating-as-a-service architecture. Therefore, this paper introduces a new semantic multi-agent architecture named OntoSmart for IoT data and service management through the service-oriented paradigm. It proposes a sensor/actuator- and scenario-independent, flexible, context-aware and distributed architecture for IoT systems, in particular smart home systems.
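
    The automatic service discovery such an architecture calls for can be sketched as a toy registry of advertised device capabilities. This is a hypothetical illustration, not the OntoSmart API; device names and descriptor fields are invented:

```python
# Toy semantic service registry: device agents advertise sensing
# capabilities as (quantity, unit) descriptors; a client discovers
# providers by the quantity it needs, not by device identity.

registry = []

def advertise(device, quantity, unit):
    registry.append({"device": device, "quantity": quantity, "unit": unit})

def discover(quantity):
    """Return all devices offering a sensing service for the quantity."""
    return [e["device"] for e in registry if e["quantity"] == quantity]

advertise("thermostat-1", "temperature", "celsius")
advertise("lamp-3", "illuminance", "lux")

discover("temperature")   # ["thermostat-1"]
```

    A real system would describe capabilities with a shared ontology rather than bare strings, so that, for example, a request for "temperature" could also match a device advertising a subclass such as "ambient-temperature".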

  6. On the Unification of Process Semantics: Logical Semantics

    CERN Document Server

    Romero-Hernández, David; 10.4204/EPTCS.62.4

    2011-01-01

    We continue the task of obtaining a unifying view of process semantics by considering in this case the logical characterization of the semantics. We start from the classic linear time-branching time spectrum developed by R.J. van Glabbeek. He provided a logical characterization of most of the semantics in his spectrum, but without following a unique pattern. In this paper, we present a uniform logical characterization of all the semantics in the enlarged spectrum. The common structure of the formulas that constitute the corresponding logics gives us a much clearer picture of the spectrum, clarifying the relations between the different semantics, and allows us to develop generic proofs of some general properties of the semantics.

  7. A structured alternative to Prolog with simple compositional semantics

    CERN Document Server

    Porto, António

    2011-01-01

    Prolog's very useful expressive power is not captured by traditional logic programming semantics, due mainly to the cut and goal and clause order. Several alternative semantics have been put forward, exposing operational details of the computation state. We propose instead to redesign Prolog around structured alternatives to the cut and clauses, keeping the expressive power and computation model but with a compositional denotational semantics over much simpler states-just variable bindings. This considerably eases reasoning about programs, by programmers and tools such as a partial evaluator, with safe unfolding of calls through predicate definitions. An if-then-else across clauses replaces most uses of the cut, but the cut's full power is achieved by an until construct. Disjunction, conjunction and until, along with unification, are the primitive goal types with a compositional semantics yielding sequences of variable-binding solutions. This extends to programs via the usual technique of a least fixpoint con...
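
    The compositional semantics described here, with goals mapping variable bindings to sequences of solution bindings and disjunction and conjunction as primitive combinators, can be sketched in Python. This is an illustrative miniature in the style of minimal logic-programming kernels, not the authors' system:

```python
# A goal is a function from a substitution (variable bindings) to a
# lazy sequence of solution substitutions; combinators compose goals.
import itertools

class Var:
    def __init__(self, name): self.name = name
    def __repr__(self): return f"?{self.name}"

def walk(t, s):
    """Follow bindings until t is a value or an unbound variable."""
    while isinstance(t, Var) and t in s:
        t = s[t]
    return t

def unify(a, b):
    """Primitive goal: succeed (at most once) if a and b unify."""
    def goal(s):
        u, v = walk(a, s), walk(b, s)
        if isinstance(u, Var):
            yield {**s, u: v}
        elif isinstance(v, Var):
            yield {**s, v: u}
        elif u == v:
            yield s
    return goal

def conj(g1, g2):
    """Sequential conjunction: feed each solution of g1 into g2."""
    return lambda s: (s2 for s1 in g1(s) for s2 in g2(s1))

def disj(g1, g2):
    """Ordered disjunction, like trying clause alternatives in order."""
    return lambda s: itertools.chain(g1(s), g2(s))

X = Var("X")
g = conj(disj(unify(X, 1), unify(X, 2)), unify(X, 2))
solutions = [walk(X, s) for s in g({})]
# solutions == [2]: the X=1 branch fails against unify(X, 2)
```

    The point of such a semantics is exactly what the abstract claims: because each goal denotes a function over plain binding sequences, goals can be unfolded and reasoned about compositionally, without exposing an operational computation state.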

  8. Practical Semantic Astronomy

    Science.gov (United States)

    Graham, Matthew; Gray, N.; Burke, D.

    2010-01-01

    Many activities in the era of data-intensive astronomy are predicated upon some transference of domain knowledge and expertise from human to machine. The semantic infrastructure required to support this is no longer a pipe dream of computer science but a set of practical engineering challenges, more concerned with deployment and performance details than AI abstractions. The application of such ideas promises to help in such areas as contextual data access, exploiting distributed annotation and heterogeneous sources, and intelligent data dissemination and discovery. In this talk, we will review the status and use of semantic technologies in astronomy, particularly to address current problems in astroinformatics, with such projects as SKUA and AstroCollation.

  9. Semantic Gaps Are Dangerous

    DEFF Research Database (Denmark)

    Ejstrup, Michael; le Fevre Jakobsen, Bjarne

    Semantic gaps are dangerous. Language adapts to the environment where it serves as a tool for communication. Language is a social agreement, and we all have to stick to both grammaticalized and non-grammaticalized rules in order to pass information about the world around us. As such language develops...... impolite language and tends to create dangerous relations, where especially language creates problems and trouble that could be avoided if we had better language tools at hand. But we do not have these tools of communication, and we are in a situation today where media, and especially digital and social media......, supported by new possibilities of migration, create dangerous situations. How can we avoid these accidental gaps in language, and especially the gaps in semantic and metaphoric tools? Do we have to keep silent and stop discussing certain issues, or do we have other ways to get access to sufficient language tools...

  10. INTEROPERABILITY, TRUST BASED INFORMATION SHARING PROTOCOL AND SECURITY: DIGITAL GOVERNMENT KEY ISSUES

    Directory of Open Access Journals (Sweden)

    Md.Headayetullah

    2010-06-01

    Full Text Available Improved interoperability between public and private organizations is of key significance to making digital government successful. Digital government interoperability, information sharing protocols and security are considered the key issues for achieving a refined stage of digital government. Seamless interoperability is essential to share information between diverse and widely dispersed organisations in several network environments using computer-based tools. Digital government must ensure security for its information systems, including computers and networks, to provide better service to citizens. Governments around the world are increasingly turning to information sharing and integration to solve problems in programs and policy areas. Problems of global concern such as disease detection and control, terrorism, immigration and border control, illegal drug trafficking, and more demand information sharing, harmonization and cooperation among government agencies within a country and across national borders. A number of daunting challenges remain for the development of an efficient information sharing protocol. A secure and trusted information-sharing protocol is required to enable users to interact and share information easily and securely across many diverse networks and databases globally. This article presents (1) a literature review of digital government security and interoperability and (2) a key research issue: a trust-based information sharing protocol for seamless interoperability among diverse government organizations and agencies around the world. While trust-based information access is well studied in the literature, existing secure information sharing technologies and protocols cannot offer enough incentives for government agencies to share information among themselves without harming their own national interest. To overcome the drawbacks of the existing technology, an innovative and efficient trust-based security protocol is proposed in this

  11. Postmarketing Safety Study Tool: A Web Based, Dynamic, and Interoperable System for Postmarketing Drug Surveillance Studies

    Directory of Open Access Journals (Sweden)

    A. Anil Sinaci

    2015-01-01

    Full Text Available Postmarketing drug surveillance is a crucial aspect of the clinical research activities in pharmacovigilance and pharmacoepidemiology. Successful utilization of available Electronic Health Record (EHR) data can complement and strengthen postmarketing safety studies. In terms of the secondary use of EHRs, access to and analysis of patient data across different domains are a critical factor; we address this data interoperability problem between EHR systems and clinical research systems in this paper. We demonstrate that this problem can be solved at an upper level with the use of common data elements in a standardized fashion, so that clinical researchers can work with different EHR systems independently of the underlying information model. The Postmarketing Safety Study Tool lets clinical researchers extract data from different EHR systems by designing data collection set schemas through common data elements. The tool interacts with a semantic metadata registry through the IHE data element exchange profile. The Postmarketing Safety Study Tool and its supporting components have been implemented and deployed on the central data warehouse of the Lombardy region, Italy, which contains anonymized records of about 16 million patients with over 10 years of longitudinal data on average. Clinical researchers at Roche validated the tool with real-life use cases.

  12. Metaphor and Lexical Semantics

    OpenAIRE

    Michael Glanzberg

    2008-01-01

    ABSTRACT: This paper shows that several sorts of expressions cannot be interpreted metaphorically, including determiners, tenses, etc. Generally, functional categories cannot be interpreted metaphorically, while lexical categories can. This reveals a semantic property of functional categories, and it shows that metaphor can be used as a probe for investigating them. It also reveals an important linguistic constraint on metaphor. The paper argues this constraint applies to the interface betwee...

  13. Insensitive Enough Semantics

    Directory of Open Access Journals (Sweden)

    Richard Vallée

    2006-06-01

    Full Text Available According to some philosophers, sentences like (1) “It is raining” and (2) “John is ready” are context sensitive sentences even if they do not contain indexicals or demonstratives. That view initiated a context sensitivity frenzy. Cappelen and Lepore (2005) summarize the frenzy by the slogan “Every sentence is context sensitive” (Insensitive Semantics, p. 6, note 5). They suggest a view they call Minimalism, according to which the truth conditions of utterances of sentences like (1)/(2) are exactly what Convention T gives you. I will distinguish different propositions, and refocus semantics on sentences. As distinct from what the protagonists in the ongoing debate think, I argue that the content or truth conditions of utterances of both context sensitive sentences and sentences like (1)/(2) are not interesting from a semantic point of view, and that the problem sentences like (1)/(2) raise is not about context sensitivity or context insensitivity of sentences, but the relevance of the content of utterances.

  14. Complex Semantic Networks

    Science.gov (United States)

    Teixeira, G. M.; Aguiar, M. S. F.; Carvalho, C. F.; Dantas, D. R.; Cunha, M. V.; Morais, J. H. M.; Pereira, H. B. B.; Miranda, J. G. V.

    Verbal language is a dynamic mental process. Ideas emerge by means of the selection of words from subjective and individual characteristics throughout the oral discourse. The goal of this work is to characterize the complex network of word associations that emerges from an oral discourse on a discourse topic. To this end, the concepts of associative incidence and fidelity were elaborated; they represent the probability of occurrence of pairs of words in the same sentence over the whole oral discourse. Semantic networks of word associations were constructed, where words are represented as nodes and edges are created when the incidence-fidelity index between pairs of words exceeds a numerical limit (0.001). Twelve oral discourses were studied. The networks generated from these oral discourses present the typical behavior of complex networks; their indices were calculated and their topologies characterized. The indices of these networks obtained for each incidence-fidelity limit exhibit a critical value at which the semantic network has maximum conceptual information and minimum residual associations. Semantic networks generated at this incidence-fidelity limit depict a pattern of hierarchical classes that represent the different contexts used in the oral discourse.
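
    The construction can be sketched as follows. This is an illustrative reimplementation under simplifying assumptions (the pair index is taken as a simple relative co-occurrence frequency), not the authors' code, and the example threshold is arbitrary:

```python
# Build a word-association network from sentences: nodes are words,
# and an edge joins two words when their probability of co-occurring
# in the same sentence exceeds a threshold.
from itertools import combinations
from collections import Counter

def cooccurrence_network(sentences, threshold=0.001):
    pair_counts = Counter()
    total_pairs = 0
    for sent in sentences:
        words = sorted(set(sent.lower().split()))
        for pair in combinations(words, 2):
            pair_counts[pair] += 1
            total_pairs += 1
    # Keep an edge when the pair's relative frequency beats the threshold.
    edges = {p for p, c in pair_counts.items() if c / total_pairs > threshold}
    nodes = {w for p in edges for w in p}
    return nodes, edges

discourse = [
    "semantic networks model word associations",
    "word associations emerge in oral discourse",
]
nodes, edges = cooccurrence_network(discourse, threshold=0.05)
# only ("associations", "word") occurs in both sentences, so it is
# the single pair frequent enough to survive this threshold
```

    Sweeping the threshold and recomputing network indices at each value is how one would then look for the critical limit the abstract describes.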

  15. Towards sustainability: An interoperability outline for a Regional ARC based infrastructure in the WLCG and EGEE infrastructures

    CERN Document Server

    Field, L; Johansson, D; Kleist, J

    2010-01-01

    Interoperability of grid infrastructures is becoming increasingly important with the emergence of large-scale grid infrastructures based on national and regional initiatives. To achieve interoperability of grid infrastructures, the adaptation and bridging of many different systems and services need to be tackled. A grid infrastructure offers services for authentication, authorization, accounting, monitoring and operation, besides the services for handling data and computations. This paper presents an outline of the work done to integrate the Nordic Tier-1 and Tier-2 sites, which for the compute part are based on the ARC middleware, into the WLCG grid infrastructure co-operated by the EGEE project. In particular, a thorough description of the integration of the compute services is presented.

  16. Exploring Interoperability as a Multidimensional Challenge for Effective Emergency Response

    Science.gov (United States)

    Santisteban, Hiram

    2010-01-01

    Purpose. The purpose of this research was to further an understanding of how the federal government is addressing the challenges of interoperability for emergency response or crisis management (FEMA, 2009) by informing the development of standards through the review of current congressional law, commissions, studies, executive orders, and…

  17. Interoperability Gap Challenges for Learning Object Repositories & Learning Management Systems

    Science.gov (United States)

    Mason, Robert T.

    2011-01-01

    An interoperability gap exists between Learning Management Systems (LMSs) and Learning Object Repositories (LORs). Learning Objects (LOs) and the associated Learning Object Metadata (LOM) that is stored within LORs adhere to a variety of LOM standards. A common LOM standard found in LORs is the Sharable Content Object Reference Model (SCORM)…

  18. Ontologies and datasets for energy management system interoperability

    OpenAIRE

    Weise, Mathias; Poveda Villalón, María; García Castro, Raúl; Euzenat, Jérôme; Priego, Luz Maria; Fies, Bruno; Cavallaro, Andrea; Peters-Anders, Jan; Zoi Tsagkari, Kleopatra

    2015-01-01

    weise2015a; This document presents a final report of the work carried out as part of work package 2 of the READY4SmartCitiesproject (R4SC), whose goal it is to identify the knowledge and data resources that support interoperability for energymanagement systems. The document is divided into two parts.

  19. Global Interoperability of Broadband Networks (GIBN): Project Overview

    Science.gov (United States)

    DePaula, Ramon P.

    1998-01-01

    Various issues associated with the Global Interoperability of Broadband Networks (GIBN) are presented in viewgraph form. Specific topics include GIBN principles, objectives and goals, and background. GIBN/NASA status, the Transpacific High Definition Video experiment, GIBN experiment selection criteria, satellite industry involvement, and current experiments associated with GIBN are also discussed.

  20. Putting the School Interoperability Framework to the Test

    Science.gov (United States)

    Mercurius, Neil; Burton, Glenn; Hopkins, Bill; Larsen, Hans

    2004-01-01

    The Jurupa Unified School District in Southern California recently partnered with Microsoft, Dell and the Zone Integration Group for the implementation of a School Interoperability Framework (SIF) database repository model throughout the district (Magner 2002). A two-week project--the Integrated District Education Applications System, better known…

  1. Managing Uncertainty: The Road Towards Better Data Interoperability

    NARCIS (Netherlands)

    Keulen, van Maurice; Herschel, M.

    2012-01-01

    Data interoperability encompasses the many data management activities needed for effective information management in anyone's or any organization's everyday work such as data cleaning, coupling, fusion, mapping, and information extraction. It is our conviction that a significant amount of money and

  2. Interoperable transactions in business models: A structured approach

    NARCIS (Netherlands)

    Weigand, H.; Verharen, E.; Dignum, F.P.M.

    1996-01-01

    Recent database research has given much attention to the specification of "flexible" transactions that can be used in interoperable systems. Starting from a quite different angle, Business Process Modelling has approached the area of communication modelling as well (the Language/Action perspective).

  3. Documentation and Reporting of Nutrition - Interoperability, Standards, Practice and Procedures.

    Science.gov (United States)

    Rotegård, Ann Kristin

    2016-01-01

    Interoperability, fragmentation, standardization and data integrity are key challenges in efforts to improve documentation, streamline reporting and ensure quality of care. This workshop aims at demonstrating and discussing health politics and solutions aimed at improving nutritional status in the elderly. PMID:27332331

  4. System Interoperability Study for Healthcare Information System with Web Services

    Directory of Open Access Journals (Sweden)

    J. K. Zhang

    2007-01-01

    Full Text Available This paper describes the use of a new distributed middleware technology, Web Services, in the proposed Healthcare Information System (HIS) to address the issue of system interoperability raised by existing healthcare information systems. With the development of HISs, hospitals and healthcare institutes have been building their own HISs for processing massive healthcare data, such as systems built for hospitals under the NHS (National Health Service) to manage patients’ records. Nowadays many healthcare providers are willing to integrate their systems’ functions and data for information sharing. This has raised concerns about data transmission, data security and network limitations. Among these issues, system and language interoperability are among the most obvious, since data and application integration is not an easy task due to differences in programming languages, system platforms and Database Management Systems (DBMSs) used within different systems. As a new distributed middleware technology, Web Services bring an ideal solution to the issue of system and language interoperability. Web Services have proven very successful in many commercial applications (e.g. Amazon.com, Dell, etc.); however, healthcare information systems are a different matter. As a result, the Web Service-based Integrated Healthcare Information System (WSIHIS) is proposed not only to address the interoperability issue of existing HISs but also to introduce this new technology into the healthcare environment.

  5. Information and documentation - Thesauri and interoperability with other vocabularies

    DEFF Research Database (Denmark)

    Lykke, Marianne; Dalbin, Sylvie; Smedt, Johan De;

    ISO 25964-2:2013 is applicable to thesauri and other types of vocabulary that are commonly used for information retrieval. It describes, compares and contrasts the elements and features of these vocabularies that are implicated when interoperability is needed. It gives recommendations...... for the establishment and maintenance of mappings between multiple thesauri, or between thesauri and other types of vocabularies....

  6. Metadata behind the interoperability of wireless sensor networks

    NARCIS (Netherlands)

    Ballari, D.E.; Wachowicz, M.; Manso-Callejo, M.A.

    2009-01-01

    Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to intr

  7. The Next Generation of Interoperability Agents in Healthcare

    Directory of Open Access Journals (Sweden)

    Luciana Cardoso

    2014-05-01

    Full Text Available Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools for interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems; the functions provided by the systems that handle interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in the Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA.

  8. Attention trees and semantic paths

    Science.gov (United States)

    Giusti, Christian; Pieroni, Goffredo G.; Pieroni, Laura

    2007-02-01

    In the last few decades several techniques for image content extraction, often based on segmentation, have been proposed. It has been suggested that under the assumption of very general image content, segmentation becomes unstable and classification becomes unreliable. According to recent psychological theories, certain image regions attract the attention of human observers more than others and, generally, the image's main meaning appears concentrated in those regions. Initially, regions attracting our attention are perceived as a whole and hypotheses on their content are formulated; subsequently the components of those regions are carefully analyzed and a more precise interpretation is reached. It is interesting to observe that an image decomposition process performed according to these psychological visual attention theories might present advantages with respect to a traditional segmentation approach. In this paper we propose an automatic procedure generating image decomposition based on the detection of visual attention regions. A new clustering algorithm taking advantage of the Delaunay-Voronoi diagrams for achieving the decomposition target is proposed. By applying that algorithm recursively, starting from the whole image, a transformation of the image into a tree of related meaningful regions is obtained (Attention Tree). Subsequently, a semantic interpretation of the leaf nodes is carried out by using a structure of Neural Networks (Neural Tree) assisted by a knowledge base (Ontology Net). Starting from leaf nodes, paths toward the root node across the Attention Tree are attempted. The task of the path consists in relating the semantics of each child-parent node pair and, consequently, in merging the corresponding image regions. The relationship detected in this way between two tree nodes generates, as a result, the extension of the interpreted image area through each step of the path.
The construction of several Attention Trees has been performed and partial

  9. Agent Based Knowledge Management Solution using Ontology, Semantic Web Services and GIS

    Directory of Open Access Journals (Sweden)

    Andreea DIOSTEANU

    2009-01-01

    Full Text Available The purpose of our research is to develop an agent based knowledge management application framework using a specific type of ontology that is able to facilitate semantic web service search and automatic composition. This solution can later on be used to develop complex solutions for location based services, supply chain management, etc. This application for modeling knowledge highlights the importance of agent interaction that leads to efficient enterprise interoperability. Furthermore, it proposes an "agent communication language" ontology that extends the OWL Lite standard approach and makes it more flexible in retrieving proper data for identifying the agents that can best communicate and negotiate.

  10. SEMANTIC WEB (CREATING AND QUERYING

    Directory of Open Access Journals (Sweden)

    Vidya S. Dandagi

    2016-01-01

    Full Text Available The Semantic Web is a system that allows machines to understand complex human requests and to reply according to their meaning. Semantics is the study of the meanings of linguistic expressions; it is a main branch of contemporary linguistics. Semantics concerns the meaning of words, texts or phrases and the relations between them. RDF provides essential support to the Semantic Web: it was created to represent distributed information, and applications can create RDF and process it in an adaptive manner. Knowledge representation is done using RDF standards and is machine-understandable. This paper describes the creation of a semantic web using RDF, and the retrieval of accurate results using the SPARQL query language.
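
    The triple-plus-pattern-matching model that RDF and SPARQL embody can be illustrated with a toy example. Plain Python stands in here for an RDF store and a SPARQL engine (in practice one would use a library such as rdflib); the `ex:` facts are invented:

```python
# RDF reduces knowledge to subject-predicate-object triples; a
# SPARQL-style query is pattern matching with variables (here,
# strings starting with "?").
triples = {
    ("ex:Dante", "ex:wrote", "ex:Commedia"),
    ("ex:Commedia", "ex:genre", "ex:Poetry"),
    ("ex:Dante", "rdf:type", "ex:Author"),
}

def match(pattern, triples):
    """Yield variable bindings for one triple pattern."""
    for t in triples:
        binding = {}
        for p, v in zip(pattern, t):
            if p.startswith("?"):
                if binding.get(p, v) != v:
                    break                # same variable, conflicting value
                binding[p] = v
            elif p != v:
                break                    # constant does not match
        else:
            yield binding

# Analogue of: SELECT ?work WHERE { ex:Dante ex:wrote ?work }
works = [b["?work"] for b in match(("ex:Dante", "ex:wrote", "?work"), triples)]
# works == ["ex:Commedia"]
```

    A full SPARQL engine generalizes this by joining the binding sets of several patterns, which is what lets queries traverse a graph of distributed facts.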

  11. Semantic Shift in Plant Names

    OpenAIRE

    DOSKOČILOVÁ, Iveta

    2014-01-01

    The aim of the present work is to identify and list English plant names coined by semantic shift, namely by metaphor, metonymy or synecdoche, and to carry out a detailed categorisation of individual semantic categories based on different tendencies within them and interpretation of the results. The theoretical part of my work focuses on different approaches to semantic shift and its categories. It is followed by the practical part which deals individually with metaphor, metonymy and synecdoch...

  12. Survey of semantic modeling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Smith, C.L.

    1975-07-01

    The analysis of the semantics of programming languages has been attempted with numerous modeling techniques. By providing a brief survey of these techniques together with an analysis of their applicability for answering semantic issues, this report attempts to illuminate the state of the art in this area. The intent is to be illustrative rather than thorough in the coverage of semantic models. A bibliography is included for the reader who is interested in pursuing this area of research in more detail.

  13. System semantics of explanatory dictionaries

    OpenAIRE

    Volodymyr Shyrokov

    2015-01-01

    System semantics of explanatory dictionaries. Some semantic properties of the language that follow from the structure of the lexicographical systems of big explanatory dictionaries are considered. Hyperchains and hypercycles are defined as a particular kind of automorphism of the lexicographical system of an explanatory dictionary. Some semantic consequences following from the principles of lexicographic closure and lexicographic completeness are investigated using the hyperchains and hype...

  14. Semantic multimedia analysis and processing

    CERN Document Server

    Spyrou, Evaggelos; Mylonas, Phivos

    2014-01-01

    Broad in scope, Semantic Multimedia Analysis and Processing provides a complete reference of techniques, algorithms, and solutions for the design and the implementation of contemporary multimedia systems. Offering a balanced, global look at the latest advances in semantic indexing, retrieval, analysis, and processing of multimedia, the book features the contributions of renowned researchers from around the world. Its contents are based on four fundamental thematic pillars: 1) information and content retrieval, 2) semantic knowledge exploitation paradigms, 3) multimedia personalization, and 4)

  15. OMOGENIA: A Semantically Driven Collaborative Environment

    Science.gov (United States)

    Liapis, Aggelos

    Ontology creation can be thought of as a social procedure. Indeed the concepts involved in general need to be elicited from communities of domain experts and end-users by teams of knowledge engineers. Many problems in ontology creation appear to resemble certain problems in software design, particularly with respect to the setup of collaborative systems. For instance, the resolution of conceptual conflicts between formalized ontologies is a major engineering problem as ontologies move into widespread use on the semantic web. Such conflict resolution often requires human collaboration and cannot be achieved by automated methods with the exception of simple cases. In this chapter we discuss research in the field of computer-supported cooperative work (CSCW) that focuses on classification and which throws light on ontology building. Furthermore, we present a semantically driven collaborative environment called OMOGENIA as a natural way to display and examine the structure of an evolving ontology in a collaborative setting.

  16. Semantic Content Filtering with Wikipedia and Ontologies

    CERN Document Server

    Malo, Pekka; Ahlgren, Oskar; Wallenius, Jyrki; Korhonen, Pekka

    2010-01-01

    The use of domain knowledge is generally found to improve query efficiency in content filtering applications. In particular, tangible benefits have been achieved when using knowledge-based approaches within more specialized fields, such as medical free texts or legal documents. However, the problem is that sources of domain knowledge are time-consuming to build and equally costly to maintain. As a potential remedy, recent studies on Wikipedia suggest that this large body of socially constructed knowledge can be effectively harnessed to provide not only facts but also accurate information about semantic concept-similarities. This paper describes a framework for document filtering, where Wikipedia's concept-relatedness information is combined with a domain ontology to produce semantic content classifiers. The approach is evaluated using Reuters RCV1 corpus and TREC-11 filtering task definitions. In a comparative study, the approach shows robust performance and appears to outperform content classifiers based on ...
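
    The scoring idea, rating a document by the semantic relatedness of its terms to a topic's concepts, can be sketched like this. The values in the relatedness table are invented stand-ins for Wikipedia-derived concept similarities, not real data:

```python
# Knowledge-based content filtering sketch: accept a document for a
# topic when the average relatedness of its terms to the topic's
# concepts clears a threshold.

relatedness = {                 # (term, topic) -> similarity, made up
    ("stock", "finance"): 0.9,
    ("dividend", "finance"): 0.8,
    ("goal", "finance"): 0.1,
}

def topic_score(doc, topic):
    terms = doc.lower().split()
    if not terms:
        return 0.0
    return sum(relatedness.get((t, topic), 0.0) for t in terms) / len(terms)

def accept(doc, topic, threshold=0.2):
    return topic_score(doc, topic) >= threshold

accept("stock dividend announced", "finance")   # True under these values
```

    The benefit over a bag-of-words classifier is that the relatedness table can recognize terms never seen in training documents, provided the background knowledge source covers them; a domain ontology can further constrain which concepts count toward a topic.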

  17. Requirements Development for Interoperability Simulation Capability for Law Enforcement

    Energy Technology Data Exchange (ETDEWEB)

    Holter, Gregory M.

    2004-05-19

    The National Counterdrug Center (NCC) was initially authorized by Congress in FY 1999 appropriations to create a simulation-based counterdrug interoperability training capability. As the lead organization for Research and Analysis to support the NCC, the Pacific Northwest National Laboratory (PNNL) was responsible for developing the requirements for this interoperability simulation capability. These requirements were structured to address the hardware and software components of the system, as well as the deployment and use of the system. The original set of requirements was developed through a process of conducting a user-based survey of requirements for the simulation capability, coupled with an analysis of similar development efforts. The user-based approach ensured that existing concerns with respect to interoperability within the law enforcement community would be addressed. Law enforcement agencies within the designated pilot area of Cochise County, Arizona, were surveyed using interviews and ride-alongs during actual operations. The results of this survey were then accumulated, organized, and validated with the agencies to ensure the accuracy of the results. These requirements were then supplemented by adapting operational requirements from existing systems to ensure system reliability and operability. The NCC adopted a development approach providing incremental capability through the fielding of a phased series of progressively more capable versions of the system. This allowed for feedback from system users to be incorporated into subsequent revisions of the system requirements, and also allowed the addition of new elements as needed to adapt the system to broader geographic and geopolitical areas, including areas along the southwest and northwest U.S. borders. This paper addresses the processes used to develop and refine requirements for the NCC interoperability simulation capability, as well as the response of the law enforcement community to the use of

  18. System semantics of explanatory dictionaries

    Directory of Open Access Journals (Sweden)

    Volodymyr Shyrokov

    2015-11-01

Full Text Available Some semantic properties of the language that follow from the structure of the lexicographical systems of large explanatory dictionaries are considered. Hyperchains and hypercycles are defined as a particular kind of automorphism of the lexicographical system of an explanatory dictionary. Some semantic consequences following from the principles of lexicographic closure and lexicographic completeness are investigated using the hyperchain and hypercycle formalism. The connection between the hypercycle properties of the lexicographical system semantics and Gödel's incompleteness theorem is discussed.

  19. Semantic Representatives of the Concept

    Directory of Open Access Journals (Sweden)

    Elena N. Tsay

    2013-01-01

Full Text Available In the article, the concept, one of the principal notions of cognitive linguistics, is investigated. Considering the concept as a cultural phenomenon with language realization and ethnocultural peculiarities, a description of the concept "happiness" is presented. The lexical-semantic paradigm of the concept of happiness correlates with a great number of lexical and semantic variants. The work reveals semantic representatives of the concept of happiness covering supreme spiritual values, and gives a semantic interpretation of their functioning in Biblical discourse.

  20. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation

    Directory of Open Access Journals (Sweden)

    Wilkinson Mark D

    2011-10-01

    Full Text Available Abstract Background The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. 
Finally, we show that, using SADI, data dynamically generated from Web services
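The SADI pattern described above — a service that consumes instances of an "input" OWL class and produces instances of an "output" class — can be illustrated with a minimal sketch. This is not the real SADI codebase or its vocabularies; the class names, URIs, and the plain-tuple triple representation are invented for the example.

```python
# Illustrative sketch of the SADI design pattern using plain tuples as RDF
# triples. All names below (ex:NamedGene, ex:AnnotatedGene, ...) are
# hypothetical, not the actual SADI or bioinformatics ontologies.
RDF_TYPE = "rdf:type"
IN_CLASS = "ex:NamedGene"       # the OWL class the service consumes
OUT_CLASS = "ex:AnnotatedGene"  # the OWL class the service produces

def sadi_service(input_triples):
    """Find every instance of IN_CLASS in the request and return triples
    that retype it as OUT_CLASS and attach the service's output property."""
    output = set()
    for s, p, o in input_triples:
        if p == RDF_TYPE and o == IN_CLASS:
            output.add((s, RDF_TYPE, OUT_CLASS))
            output.add((s, "ex:annotation", "demo annotation"))
    return output

# A request graph containing one instance of the input class.
request = {("ex:BRCA1", RDF_TYPE, IN_CLASS)}
response = sadi_service(request)
```

Because both the input and output are described as OWL classes, a client can discover such a service automatically by matching the class its data instantiates against the class the service declares it consumes — the property that enables the automatic workflow chaining the abstract describes.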

  1. Interoperability of knowledge organization systems: the state of the art

    Directory of Open Access Journals (Sweden)

    Ana M. Martínez Tamayo

    2011-06-01

Full Text Available The interoperability between knowledge organization systems (KOS) has become very important in recent years, in order to facilitate simultaneous searches across several databases or to merge different databases into one. The new standards for KOS design and development, the American Z39.19:2005 and the British BS 8723-4:2007, include detailed recommendations for interoperability. A new ISO standard on thesauri and interoperability, 25964-1, is also in preparation and will complement them. The available technology provides tools for interoperability, e.g. formats and functional requirements for subject authorities, as well as the Semantic Web tools RDF/OWL, SKOS Core and XML. On the other hand, it is currently very hard to design and develop new KOS due to economic constraints, so interoperability makes it possible to take advantage of existing KOS. This paper reviews the basic concepts, the models and methods recommended by the standards, as well as numerous documented experiences of interoperability between KOS.
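The SKOS mapping properties the abstract mentions can be sketched very simply: a mapping table between two vocabularies lets one query be expanded so it matches records indexed with either KOS. The vocabulary names and concepts below are invented for illustration.

```python
# Hedged sketch of KOS interoperability via SKOS-style mappings.
# "vocabA"/"vocabB" and their concepts are hypothetical examples.
mappings = {
    # source-scheme concept -> (SKOS mapping relation, target-scheme concept)
    "vocabA:Automobiles": ("skos:exactMatch", "vocabB:Cars"),
    "vocabA:Lorries":     ("skos:closeMatch", "vocabB:Trucks"),
}

def translate_query(term, mappings):
    """Expand a search term from vocabulary A with its mapped concept in
    vocabulary B, enabling a simultaneous search across both databases."""
    terms = [term]
    if term in mappings:
        relation, target = mappings[term]
        terms.append(target)
    return terms

expanded = translate_query("vocabA:Automobiles", mappings)
```

In a real deployment the mapping table would itself be published as RDF using the SKOS vocabulary (skos:exactMatch, skos:closeMatch, skos:broadMatch, ...), so that any SKOS-aware tool can consume it.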

  2. Domain Ontology, an Instrument of Semantic Web Knowledge Management in e-Learning

    Directory of Open Access Journals (Sweden)

    Anatoly Jasonovich Gladun

    2012-11-01

Full Text Available Domain ontology is proposed as an instrument for assessing e-learning course results. Compared with traditional testing, this approach reflects the structure of students' knowledge more objectively. The use of ontologies for knowledge representation provides interoperability of the testing systems used. Domain knowledge can be found and reused in the Semantic Web world. Domain ontology development is based on the principles of ontological analysis. The method of its use for evaluating learners' knowledge is based on matching the learner's ontology against the reference ontology proposed by the trainer/tutor. A multi-agent system for e-learning (MeL) has been defined and a prototype system has been developed. As a case study, the domain ontology and its Semantic Web representation have been applied in two university e-learning courses.
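The matching step can be pictured with a toy example. This is not the authors' algorithm, only a hedged sketch of the idea: represent both ontologies as sets of concept relations and score the learner by how much of the tutor's reference ontology they reproduced.

```python
# Hypothetical sketch: ontologies as sets of (concept, relation, concept)
# triples; the relations below are invented for the example.
reference = {
    ("OWL", "is_a", "OntologyLanguage"),
    ("RDF", "is_a", "DataModel"),
    ("OntologyLanguage", "builds_on", "DataModel"),
}
learner = {
    ("OWL", "is_a", "OntologyLanguage"),
    ("RDF", "is_a", "DataModel"),
}

def knowledge_score(learner, reference):
    """Fraction of the reference ontology's relations the learner reproduced."""
    return len(learner & reference) / len(reference)

score = knowledge_score(learner, reference)  # 2 of 3 relations matched
```

A real system would need semantic rather than exact matching (synonymous labels, subsumption between concepts), which is where the shared domain ontology earns its keep.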

  3. Developing Semantic Business Model for VO Construction on Semantic Grid

    Institute of Scientific and Technical Information of China (English)

    CHU Wang; QIAN Depei

    2006-01-01

    This paper combines Semantic Web technology with business modeling and yields a semantic business model described in terms of roles and relationships. The semantic business model can be used to discover grid services by means of automation tools. The gap between business goals and grid services is bridged by role relationships and their compositions, so that virtual organization evolution is supported effectively. The semantic business model can support virtual organization validation at the design stage rather than at run time. Designers can animate their business model and make an initial assessment of which interactions should occur between roles and in what order. Users can verify whether the grid service compositions satisfy the business goals.

  4. Semantics and the crowd

    Institute of Scientific and Technical Information of China (English)

    Mark GREAVES

    2012-01-01

    One of the principal scientific challenges that drives my group is to understand the character of formal knowledge on the Web. By formal knowledge, I mean information that is represented on the Web in something other than natural language text—typically, as machine-readable Web data with a formal syntax and a specific, intended semantics. The Web provides a major counterpoint to our traditional artificial intelligence (AI) based accounts of formal knowledge. Most symbolic AI systems are designed to address sophisticated logical inference over coherent conceptual knowledge, and thus the underlying research is focused on characterizing formal properties such as entailment relations, time/space complexity of inference, monotonicity, and expressiveness. In contrast, the Semantic Web allows us to explore formal knowledge in a very different context, where data representations exist in a constantly changing, large-scale, highly distributed network of loosely-connected publishers and consumers, and are governed by a Web-derived set of social practices for discovery, trust, reliability, and use. We are particularly interested in understanding how large-scale Semantic Web data behaves over longer time periods: the way by which its producers and consumers shift their requirements over time; how uniform resource identifiers (URIs) are used to dynamically link knowledge together; and the overall lifecycle of Web data from publication, to use, integration with other knowledge, evolution, and eventual deprecation. We believe that understanding formal knowledge in this Web context is the key to bringing existing AI insights and knowledge bases to the level of scale and utility of the current hypertext Web.

  5. Universal semantic communication

    CERN Document Server

    Juba, Brendan

    2011-01-01

    Is meaningful communication possible between two intelligent parties who share no common language or background? In this work, a theoretical framework is proposed in which it is possible to address when and to what extent such semantic communication is possible: such problems can be rigorously addressed by explicitly focusing on the goals of the communication. Under this framework, it is possible to show that for many goals, communication without any common language or background is possible using universal protocols. This work should be accessible to anyone with an undergraduate-level knowledge

  6. Communication of Semantic Properties

    DEFF Research Database (Denmark)

    Lenau, Torben Anker; Boelskifte, Per

    2004-01-01

    … but by the specific way that the materials are used in the product. Selection of materials is therefore often done by looking at similar products. The product as well as its constitutive materials possesses a number of technical properties like strength, stiffness and hardness. Furthermore the product possesses … a number of semantic properties associated with the meaning we read from the form, colour, texture and sound of the product. The purpose of working with these properties can be to make the use of the product more self-evident, to form or enhance the cultural meaning of the product and to give the product …

  7. Semantic Roles and Grammatical Relations.

    Science.gov (United States)

    Van Valin, Robert D., Jr.

    The nature of semantic roles and grammatical relations are explored from the perspective of Role and Reference Grammar (RRG). It is proposed that unraveling the relational aspects of grammar involves the recognition that semantic roles fall into two types, thematic relations and macroroles, and that grammatical relations are not universal and are…

  8. Semantic Reasoning for Scene Interpretation

    DEFF Research Database (Denmark)

    Jensen, Lars Baunegaard With; Baseski, Emre; Pugeault, Nicolas;

    2008-01-01

    its potential by two applications. As a first application, we localize lane structures by the semantic descriptors and their relations in a Bayesian framework. As the second application, which is in the context of vision based grasping, we show how the semantic relations can be associated to actions...

  9. Semantic annotation for biological information retrieval system.

    Science.gov (United States)

    Oshaiba, Mohamed Marouf Z; El Houby, Enas M F; Salah, Akram

    2015-01-01

    Online literature is increasing at a tremendous rate, and the biological domain is one of the fastest growing. Biological researchers face the problem of finding what they are searching for effectively and efficiently. The aim of this research is to find documents that contain any combination of biological process and/or molecular function and/or cellular component. This research proposes a framework that helps researchers retrieve meaningful documents related to their asserted terms based on the Gene Ontology (GO). The system utilizes GO by semantically decomposing it into three subontologies (cellular component, biological process, and molecular function). The researcher has the flexibility to choose search terms from any combination of the three subontologies. Document annotation is used in this research to create an index of biological terms in documents and so speed the searching process. Query expansion is used to infer terms semantically related to the asserted terms; it increases the meaningfulness of search results by exploiting term synonyms and term relationships. The system uses a ranking method to order the retrieved documents by ranking weight. The proposed system achieves researchers' needs to find documents that fit the asserted terms semantically.
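The query-expansion and ranking steps can be sketched in a few lines. The synonym table and scoring rule below are invented for illustration; they are not taken from the Gene Ontology or from the authors' system, which uses GO's actual synonym and relationship data.

```python
# Hedged sketch of synonym-based query expansion and simple match-count
# ranking. The synonym table is hypothetical example data.
synonyms = {
    "apoptosis": {"programmed cell death"},
    "nucleus": {"cell nucleus"},
}

def expand(terms):
    """Add known synonyms to the asserted search terms."""
    expanded = set(terms)
    for t in terms:
        expanded |= synonyms.get(t, set())
    return expanded

def rank(documents, terms):
    """Order documents by how many expanded terms they mention,
    dropping documents that match none."""
    terms = expand(terms)
    scored = [(sum(t in doc.lower() for t in terms), doc) for doc in documents]
    return [doc for score, doc in sorted(scored, reverse=True) if score > 0]

docs = [
    "A study of programmed cell death in the cell nucleus",
    "Sequencing hardware benchmarks",
]
results = rank(docs, ["apoptosis", "nucleus"])
```

A production system would weight matches by subontology and relationship type rather than counting raw string hits, but the pipeline shape — expand, match, rank — is the same.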

  10. Semantic Mediation via Access Broker: the OWS-9 experiment

    Science.gov (United States)

    Santoro, Mattia; Papeschi, Fabrizio; Craglia, Massimo; Nativi, Stefano

    2013-04-01

    Even with the use of common data models standards to publish and share geospatial data, users may still face semantic inconsistencies when they use Spatial Data Infrastructures - especially in multidisciplinary contexts. Several semantic mediation solutions exist to address this issue; they span from simple XSLT documents to transform from one data model schema to another, to more complex services based on the use of ontologies. This work presents the activity done in the context of the OGC Web Services Phase 9 (OWS-9) Cross Community Interoperability to develop a semantic mediation solution by enhancing the GEOSS Discovery and Access Broker (DAB). This is a middleware component that provides harmonized access to geospatial datasets according to client applications preferred service interface (Nativi et al. 2012, Vaccari et al. 2012). Given a set of remote feature data encoded in different feature schemas, the objective of the activity was to use the DAB to enable client applications to transparently access the feature data according to one single schema. Due to the flexible architecture of the Access Broker, it was possible to introduce a new transformation type in the configured chain of transformations. In fact, the Access Broker already provided the following transformations: Coordinate Reference System (CRS), spatial resolution, spatial extent (e.g., a subset of a data set), and data encoding format. A new software module was developed to invoke the needed external semantic mediation service and harmonize the accessed features. In OWS-9 the Access Broker invokes a SPARQL WPS to retrieve mapping rules for the OWS-9 schemas: USGS, and NGA schema. The solution implemented to address this problem shows the flexibility and extensibility of the brokering framework underpinning the GEO DAB: new services can be added to augment the number of supported schemas without the need to modify other components and/or software modules. 
Moreover, all other transformations (CRS
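The extensibility claim — that a new semantic-mediation step slots into the broker's existing chain of transformations without touching the others — is essentially a pipeline pattern. The sketch below is hypothetical (the function names and data fields are invented, not the GEO DAB API), but it shows the design choice.

```python
# Hedged sketch of a broker-style transformation chain. Each step adapts the
# accessed data toward the client's preferred form; a new step (semantic
# mediation) is appended without modifying the existing ones.
def reproject(data):
    """CRS transformation (placeholder logic)."""
    data["crs"] = "EPSG:4326"
    return data

def reencode(data):
    """Data-encoding-format transformation (placeholder logic)."""
    data["format"] = "GML"
    return data

def mediate_schema(data):
    """The newly added semantic-mediation step: map the source feature
    schema to the client's target schema (rules would come from a
    mapping service, e.g. a SPARQL-backed WPS)."""
    data["schema"] = "target-schema"
    return data

TRANSFORMATION_CHAIN = [reproject, reencode, mediate_schema]

def access(data, chain=TRANSFORMATION_CHAIN):
    for step in chain:
        data = step(data)
    return data

result = access({"schema": "USGS", "crs": "EPSG:26912", "format": "SHP"})
```

Because each step only sees and returns the data record, adding support for a new schema means registering one more function, which mirrors the report's point that no other broker components needed modification.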

  11. The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications

    Directory of Open Access Journals (Sweden)

    Katayama Toshiaki

    2011-08-01

    Full Text Available Abstract Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in (i) a workflow to annotate 100,000 sequences from an invertebrate species; (ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; (iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; (iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: (i) the absence of several useful data or analysis functions in the Web service "space"; (ii) the lack of documentation of methods; (iii) lack of

  12. Interoperability between Publications, Reference Data and Visualisation Tools

    Science.gov (United States)

    Allen, Mark G.; Ocvirk, Pierre; Genova, Francoise

    2015-08-01

    Astronomy research is becoming more and more inter-connected, and there is a high expectation for our publications, reference data and tools to be interoperable. Publications are the hard-earned final results of scientific endeavour, and technology allows us to enable publications as usable resources, going beyond their traditional role as a readable document. There is strong demand for simple access to the data associated with publications, and for links and references in publications to be strongly connected to online resources and usable in visualisation tools. We highlight the capabilities of the CDS reference services for interoperability between the reference data obtained from publications, the connections between Journal and literature services, and the combination of these data and information in Aladin and other CDS services. (In support of the abstract submitted by P. Ocvirk)

  13. "Pre-Semantic" Cognition Revisited: Critical Differences between Semantic Aphasia and Semantic Dementia

    Science.gov (United States)

    Jefferies, Elizabeth; Rogers, Timothy T.; Hopper, Samantha; Lambon Ralph, Matthew A.

    2010-01-01

    Patients with semantic dementia show a specific pattern of impairment on both verbal and non-verbal "pre-semantic" tasks, e.g., reading aloud, past tense generation, spelling to dictation, lexical decision, object decision, colour decision and delayed picture copying. All seven tasks are characterised by poorer performance for items that are…

  14. Evolution of Business Interoperability in the Automotive Industry

    OpenAIRE

    Wende, Kristin; Legner, Christine

    2006-01-01

    In recent years, the established roles in the automotive industry have undergone changes: Automakers which have traditionally executed control over the entire value chain are now increasingly focusing on branding and distribution. At the same time, tier-1 suppliers are becoming vehicle integrators. This paper analyses how new forms of cooperation impact the required level of business interoperability. The comparison of two cases, a traditional OEM-supplier relationship and an innovative form ...

  15. Proceedings International Workshop on Component and Service Interoperability

    CERN Document Server

    Cámara, Javier; Salaün, Gwen; 10.4204/EPTCS.37

    2010-01-01

    This volume contains the proceedings of WCSI 2010, the International Workshop on Component and Service Interoperability. WCSI 2010 was held in Malaga (Spain) on June 29th, 2010 as a satellite event of the TOOLS 2010 Federated Conferences. The papers published in this volume tackle different issues that are currently central to our community, namely definition of expressive interface languages, formal models and approaches to software composition and adaptation, interface-based compatibility and substitutability, and verification techniques for distributed software.

  16. Towards Responsive Open Learning Environments: the ROLE Interoperability framework

    OpenAIRE

    Govaerts S.; Verbert K.; Dahrendorf D.; Ullrich C.; Schmidt M.; Werkle M.; Chatterjee A.; Nussbaumer A.; Renzel D.; Scheffel M.

    2011-01-01

    In recent years, research on mash-up technologies for learning environments has gained interest. The overall goal is to enrich or replace traditional learning management systems (LMS) with mash-ups of widgets and services that can be easily combined and configured to fit the learners' needs. This paper presents the implemented prototype of the ROLE interoperability framework together with a business and an educational case study. The framework provides a common technical infrastructure to assemble widget...

  17. PyMOOSE: Interoperable Scripting in Python for MOOSE

    OpenAIRE

    Subhasis Ray; Bhalla, Upinder S

    2008-01-01

    Python is emerging as a common scripting language for simulators. This opens up many possibilities for interoperability in the form of analysis, interfaces, and communications between simulators. We report the integration of Python scripting with the Multi-scale Object Oriented Simulation Environment (MOOSE). MOOSE is a general-purpose simulation system for compartmental neuronal models and for models of signaling pathways based on chemical kinetics. We show how the Python-scripting version ...

  18. Promoting interoperability: the case for discipline-specific PSAPS

    OpenAIRE

    Walsh, Thomas Michael

    2014-01-01

    Approved for public release; distribution is unlimited. Given that public safety answering points (PSAPs or 9-1-1 dispatch centers) are undergoing a process of consolidation, should that consolidation occur as a function of simple geographic proximity or discipline? This thesis investigated the differences among dispatch disciplines, the effect of dispatching on interoperability, case studies investigating the operations of several different models of PSAP consolidation, and the...

  19. Secure and interoperable communication infrastructures for PPDR organisations

    Science.gov (United States)

    Müller, Wilmuth; Marques, Hugo; Pereira, Luis; Rodriguez, Jonathan; Brouwer, Frank; Bouwers, Bert; Politis, Ilias; Lykourgiotis, Asimakis; Ladas, Alexandros; Adigun, Olayinka; Jelenc, David

    2016-05-01

    The growing number of events affecting public safety and security (PS&S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on the agencies and organisations responsible for PS&S. In order to respond to such events in a timely and adequate manner, Public Protection and Disaster Relief (PPDR) organisations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies such as TETRA, TETRAPOL or P25 do not currently provide broadband capability, nor are such technologies expected to be upgraded in the future. This presents a major limitation in supporting new services and information flows. Furthermore, there is no known standard that addresses the interoperability of these technologies. In this contribution, the design of a next-generation communication infrastructure for PPDR organisations which fulfils the requirements of secure and seamless end-to-end communication and interoperable information exchange within the deployed communication networks is presented. Based on the Enterprise Architecture of PPDR organisations, a next-generation PPDR network that is backward compatible with legacy communication technologies is designed and implemented, capable of providing security, privacy, seamless mobility, QoS and reliability support for mission-critical Private Mobile Radio (PMR) voice and broadband data services. The designed solution provides a robust, reliable, and secure mobile broadband communications system for a wide variety of PMR applications and services on PPDR broadband networks, including the ability of inter-system, inter-agency and cross-border operations, with emphasis on interoperability between users in PMR and LTE.

  20. OGC® Engineering Report for the OWS Shibboleth Interoperability Experiment

    OpenAIRE

    Higgins, Christopher

    2012-01-01

    This document reports on outcomes from the OGC Web Services Shibboleth Interoperability Experiment (OSI). The main objective of OSI was to advance the use of Shibboleth (an open source implementation of SAML) as a means of protecting OWS. In the process, OSI helped develop further understanding of this approach to establishing trusted federations of OWS. This report documents these findings and is intended to be of use to those interested in how Shibboleth/SAML access management federations m...