WorldWideScience

Sample records for ontology mapping approach

  1. A Bayesian Network Approach to Ontology Mapping

    Pan, Rong; Ding, Zhongli; Yu, Yang; Peng, Yun

    2005-01-01

    This paper presents our ongoing effort to develop a principled methodology for automatic ontology mapping based on BayesOWL, a probabilistic framework we developed for modeling uncertainty in semantic web...

  2. Ontology Mapping Neural Network: An Approach to Learning and Inferring Correspondences among Ontologies

    Peng, Yefei

    2010-01-01

    An ontology mapping neural network (OMNN) is proposed in order to learn and infer correspondences among ontologies. It extends the Identical Elements Neural Network (IENN)'s ability to represent and map complex relationships. The learning dynamics of simultaneous (interlaced) training of similar tasks interact at the shared connections of the…

  3. Survey on Ontology Mapping

    Zhu, Junwu

    To create a sharable semantic space for terms from different domain ontologies or knowledge systems, ontology mapping has become a hot research topic in the Semantic Web community. In this paper, the motivating factors behind ontology mapping research are given first, and then five dominant theories and methods, such as information access technology, machine learning, linguistics, structure graphs and similarity, are illustrated according to their technology class. The contributions of these theories and methods are summarized in detail before the new requirements are analysed and a longer view is taken. Finally, this paper suggests designing a group of semantic connectors with transfer (migration) learning ability for OWL 2 extended with constraints, together with an axiom-based theory of ontology mapping, so as to provide a new methodology for ontology mapping.

  4. Mapping between the OBO and OWL ontology languages.

    Tirmizi, Syed Hamid; Aitken, Stuart; Moreira, Dilvan A; Mungall, Chris; Sequeda, Juan; Shah, Nigam H; Miranker, Daniel P

    2011-03-07

    Ontologies are commonly used in biomedicine to organize concepts to describe domains such as anatomies, environments, experiments and taxonomies. NCBO BioPortal currently hosts about 180 different biomedical ontologies. These ontologies have been mainly expressed in either the Open Biomedical Ontology (OBO) format or the Web Ontology Language (OWL). OBO emerged from the Gene Ontology, and supports most of the biomedical ontology content. In comparison, OWL is a Semantic Web language, and is supported by the World Wide Web Consortium together with integral query languages, rule languages and distributed infrastructure for information interchange. These features are highly desirable for the OBO content as well. A convenient method for leveraging these features for OBO ontologies is to transform OBO ontologies to OWL. We have developed a methodology for translating OBO ontologies to OWL using the organization of the Semantic Web itself to guide the work. The approach reveals that the constructs of OBO can be grouped together to form a layer cake similar to that of the Semantic Web. Thus we were able to decompose the problem into two parts. Most OBO constructs have an easy and obvious equivalent construct in OWL. A small subset of OBO constructs requires deeper consideration. We have defined transformations for all constructs in an effort to foster a standard common mapping between OBO and OWL. Our mapping produces OWL-DL, a Description Logics-based subset of OWL with desirable computational properties for efficiency and correctness. Our Java implementation of the mapping is part of the official Gene Ontology project source. Our transformation system provides a lossless roundtrip mapping for OBO ontologies, i.e. an OBO ontology may be translated to OWL and back without loss of knowledge. In addition, it provides a roadmap for bridging the gap between the two ontology languages in order to enable the use of ontology content in a language-independent manner.
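
    As a rough sketch of the core of such a translation (an OBO is_a tag becoming an rdfs:subClassOf axiom between owl:Class terms), the Python fragment below applies the idea to one hand-written stanza using rdflib; the term identifiers, labels and namespace are invented for illustration and the paper's full mapping is not reproduced.

        # Hedged sketch: turning a minimal, hand-written OBO stanza into OWL axioms
        # with rdflib. The stanza, identifiers and namespace are hypothetical examples.
        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import OWL, RDF, RDFS

        OBO = Namespace("http://purl.obolibrary.org/obo/")

        stanza = {                      # parsed form of one [Term] stanza
            "id": "XX:0000002",
            "name": "example child term",
            "is_a": "XX:0000001",
        }

        def obo_id_to_uri(obo_id):
            # OBO identifiers conventionally become URIs by replacing ':' with '_'
            return OBO[obo_id.replace(":", "_")]

        g = Graph()
        term = obo_id_to_uri(stanza["id"])
        parent = obo_id_to_uri(stanza["is_a"])
        g.add((term, RDF.type, OWL.Class))
        g.add((parent, RDF.type, OWL.Class))
        g.add((term, RDFS.label, Literal(stanza["name"])))
        g.add((term, RDFS.subClassOf, parent))      # OBO is_a -> rdfs:subClassOf

        print(g.serialize(format="turtle"))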

  5. Merged ontology for engineering design: Contrasting empirical and theoretical approaches to develop engineering ontologies

    Ahmed, Saeema; Storga, M

    2009-01-01

    This paper presents a comparison of two previous and separate efforts to develop an ontology in the engineering design domain, together with an ontology proposal from which ontologies for a specific application may be derived. The research contrasts an empirical, user-centered approach to developing the ontology engineering design integrated taxonomies (EDIT) with a theoretical approach in which concepts and relations are elicited from engineering design theories to form a design ontology (DO). The limitations and advantages of each approach are discussed. The research methodology adopted is to map...

  6. A Cognitive Support Framework for Ontology Mapping

    Falconer, Sean M.; Storey, Margaret-Anne

    Ontology mapping is the key to data interoperability in the semantic web. This problem has received a lot of research attention; however, the emphasis has been mostly on automating the mapping process, even though the creation of mappings often involves the user. As industry interest in semantic web technologies grows and the number of widely adopted semantic web applications increases, we must begin to support the user. In this paper, we combine data gathered from background literature, theories of cognitive support and decision making, and an observational case study to propose a theoretical framework for cognitive support in ontology mapping tools. We also describe a tool called CogZ that is based on this framework.

  7. An ontological approach to domain engineering

    Falbo, R.A.; Guizzardi, G.; Duarte, K.

    2002-01-01

    Domain engineering aims to support systematic reuse, focusing on modeling common knowledge in a problem domain. Ontologies have also been pointed as holding great promise for software reuse. In this paper, we present ODE (Ontology-based Domain Engineering), an ontological approach for domain

  8. Ontology mapping specification in description logics for cooperative ...

    Furthermore, the resolution of differences among ontologies is necessary to process queries or use web services in distributed heterogeneous environments. Mapping discovery is a key issue to allow efficient resolution of heterogeneity. We develop an architecture for mapping different systems associated with ontologies.

  9. The mouse-human anatomy ontology mapping project.

    Hayamizu, Terry F; de Coronado, Sherri; Fragoso, Gilberto; Sioutos, Nicholas; Kadin, James A; Ringwald, Martin

    2012-01-01

    The overall objective of the Mouse-Human Anatomy Project (MHAP) was to facilitate the mapping and harmonization of anatomical terms used for mouse and human models by Mouse Genome Informatics (MGI) and the National Cancer Institute (NCI). The anatomy resources designated for this study were the Adult Mouse Anatomy (MA) ontology and the set of anatomy concepts contained in the NCI Thesaurus (NCIt). Several methods and software tools were identified and evaluated, then used to conduct an in-depth comparative analysis of the anatomy ontologies. Matches between mouse and human anatomy terms were determined and validated, resulting in a highly curated set of mappings between the two ontologies that has been used by other resources. These mappings will enable linking of data from mouse and human. As the anatomy ontologies have been expanded and refined, the mappings have been updated accordingly. Insights are presented into the overall process of comparing and mapping between ontologies, which may prove useful for further comparative analyses and ontology mapping efforts, especially those involving anatomy ontologies. Finally, issues concerning further development of the ontologies, updates to the mapping files, and possible additional applications and significance were considered. DATABASE URL: http://obofoundry.org/cgi-bin/detail.cgi?id=ma2ncit.
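
    The MHAP mappings themselves were manually curated; purely as an illustration of the kind of first-pass lexical matching such a comparison might start from, the Python sketch below pairs terms from two small, invented term lists after simple label normalization (the identifiers and labels are not actual MA or NCIt content).

        # Hedged sketch: first-pass lexical matching between two anatomy term lists.
        # The term lists below are invented; real work would load MA and NCIt terms.
        import re

        def normalize(label):
            # lowercase, strip punctuation and surrounding whitespace
            return re.sub(r"[^a-z0-9 ]", "", label.lower()).strip()

        mouse_terms = {"MA:X1": "Heart", "MA:X2": "left lung", "MA:X3": "forelimb"}
        human_terms = {"NCIt:Y1": "heart", "NCIt:Y2": "Left Lung", "NCIt:Y3": "arm"}

        human_index = {normalize(lbl): tid for tid, lbl in human_terms.items()}

        candidate_mappings = []
        for m_id, m_label in mouse_terms.items():
            h_id = human_index.get(normalize(m_label))
            if h_id is not None:
                candidate_mappings.append((m_id, h_id, m_label))

        for m_id, h_id, label in candidate_mappings:
            print(f"{m_id} <-> {h_id}  ({label})")   # candidates still need expert review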

  10. Towards natural language question generation for the validation of ontologies and mappings.

    Ben Abacha, Asma; Dos Reis, Julio Cesar; Mrabet, Yassine; Pruski, Cédric; Da Silveira, Marcos

    2016-08-08

    The increasing number of open-access ontologies and their key role in several applications such as decision-support systems highlight the importance of their validation. Human expertise is crucial for the validation of ontologies from a domain point of view. However, the growing number of ontologies and their fast evolution over time make manual validation challenging. We propose a novel semi-automatic approach based on the generation of natural language (NL) questions to support the validation of ontologies and their evolution. The proposed approach includes the automatic generation, factorization and ordering of NL questions from medical ontologies. The final validation and correction is performed by submitting these questions to domain experts and automatically analyzing their feedback. We also propose a second approach for the validation of mappings impacted by ontology changes. The method exploits the context of the changes to propose correction alternatives presented as Multiple Choice Questions. This research provides a question optimization strategy to maximize the validation of ontology entities with a reduced number of questions. We evaluate our approach for the validation of three medical ontologies. We also evaluate the feasibility and efficiency of our mappings validation approach in the context of ontology evolution. These experiments are performed with different versions of SNOMED-CT and ICD9. The obtained experimental results suggest the feasibility and adequacy of our approach to support the validation of interconnected and evolving ontologies. Results also suggest that taking into account RDFS and OWL entailment helps reduce the number of questions and validation time. The application of our approach to validate mapping evolution also shows the difficulty of adapting mapping evolution over time and highlights the importance of semi-automatic validation.
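
    The paper's generation, factorization and ordering pipeline is not reproduced here; the following Python sketch only illustrates the basic idea of turning asserted axioms into yes/no questions for a domain expert, using invented concept labels and axioms.

        # Hedged sketch: generating simple yes/no validation questions from
        # subclass assertions. Labels and axioms are invented examples.
        subclass_axioms = [
            ("viral pneumonia", "pneumonia"),
            ("pneumonia", "lung disease"),
        ]

        def question_from_subclass(child, parent):
            return f"Is every case of '{child}' also a case of '{parent}'?"

        questions = [question_from_subclass(c, p) for c, p in subclass_axioms]

        for q in questions:
            answer = "yes"          # in practice, collected from a domain expert
            print(q, "->", answer)  # a 'no' answer flags the axiom for correction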

  11. An ontological approach to logistics

    Daniele, L.M.; Ferreira Pires, Luis; Zelm, M.; van Sinderen, Marten J.; Doumeingts, G.

    2013-01-01

    In today’s global market, the competitiveness of enterprises is strongly dictated by their ability to collaborate with other enterprises. Ontologies enable common understanding of concepts and have been acknowledged as a powerful means to foster collaboration, both within the boundaries of an

  12. Validating Domain Ontologies: A Methodology Exemplified for Concept Maps

    Steiner, Christina M.; Albert, Dietrich

    2017-01-01

    Ontologies play an important role as knowledge domain representations in technology-enhanced learning and instruction. Represented in form of concept maps they are commonly used as teaching and learning material and have the potential to enhance positive educational outcomes. To ensure the effective use of an ontology representing a knowledge…

  13. Spatial Data Integration Using Ontology-Based Approach

    Hasani, S.; Sadeghi-Niaraki, A.; Jelokhani-Niaraki, M.

    2015-12-01

    In today's world, the need for spatial data has become so crucial that many organizations have begun to produce such data themselves. In some circumstances, the need for integrated data in real time requires a sustainable mechanism for real-time integration; a case in point is disaster management, which requires obtaining real-time data from various sources of information. One of the main challenges in such situations is the high degree of heterogeneity between different organizations' data. To solve this issue, we introduce an ontology-based method to provide sharing and integration capabilities for existing databases. In addition to resolving semantic heterogeneity, our proposed method also provides better access to information. Our approach consists of three steps: in the first step, the objects in a relational database are identified, the semantic relationships between them are modelled, and the ontology of each database is created. In the second step, this ontology is inserted into the database, and the relationship of each ontology class is inserted into a newly created column in the database tables. The last step consists of a platform based on service-oriented architecture, which allows integration of data using the concept of ontology mapping. The proposed approach, in addition to being fast and low cost, makes the process of data integration easy; the data remain unchanged, and the approach thus takes advantage of the legacy applications provided.

  14. SPATIAL DATA INTEGRATION USING ONTOLOGY-BASED APPROACH

    S. Hasani

    2015-12-01

    In today's world, the need for spatial data has become so crucial that many organizations have begun to produce such data themselves. In some circumstances, the need for integrated data in real time requires a sustainable mechanism for real-time integration; a case in point is disaster management, which requires obtaining real-time data from various sources of information. One of the main challenges in such situations is the high degree of heterogeneity between different organizations' data. To solve this issue, we introduce an ontology-based method to provide sharing and integration capabilities for existing databases. In addition to resolving semantic heterogeneity, our proposed method also provides better access to information. Our approach consists of three steps: in the first step, the objects in a relational database are identified, the semantic relationships between them are modelled, and the ontology of each database is created. In the second step, this ontology is inserted into the database, and the relationship of each ontology class is inserted into a newly created column in the database tables. The last step consists of a platform based on service-oriented architecture, which allows integration of data using the concept of ontology mapping. The proposed approach, in addition to being fast and low cost, makes the process of data integration easy; the data remain unchanged, and the approach thus takes advantage of the legacy applications provided.

  15. Fund Finder: A case study of database-to-ontology mapping

    Barrasa Rodríguez, Jesús; Corcho, Oscar; Gómez-Pérez, A.

    2003-01-01

    The mapping between databases and ontologies is a basic problem when trying to "upgrade" deep web content to the semantic web. Our approach suggests the declarative definition of mappings as a way to achieve domain independence and reusability. A specific language (expressive enough to cover some real-world mapping situations such as lightly structured databases or non-first-normal-form ones) is defined for this purpose. Along with this mapping description language, the ODEMapster processor is in ...
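
    Neither the paper's mapping language nor the ODEMapster processor is shown here; the Python sketch below merely illustrates the declarative style, with a hypothetical mapping table driving the translation of database rows into ontology instances via rdflib (schema, URIs and data are invented).

        # Hedged sketch: a declarative table->class / column->property mapping applied
        # to database rows. The schema, URIs and example data are invented.
        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import RDF

        EX = Namespace("http://example.org/funding#")

        mapping = {
            "table": "funding_opportunity",
            "class": EX.FundingOpportunity,
            "key": "id",
            "columns": {"title": EX.hasTitle, "budget": EX.hasBudget},
        }

        rows = [  # rows as they might come back from a SQL cursor
            {"id": "42", "title": "SME innovation grant", "budget": "50000"},
        ]

        g = Graph()
        for row in rows:
            subject = EX[f"{mapping['table']}_{row[mapping['key']]}"]
            g.add((subject, RDF.type, mapping["class"]))
            for column, prop in mapping["columns"].items():
                g.add((subject, prop, Literal(row[column])))

        print(g.serialize(format="turtle"))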

  16. Recognizing lexical and semantic change patterns in evolving life science ontologies to inform mapping adaptation.

    Dos Reis, Julio Cesar; Dinh, Duy; Da Silveira, Marcos; Pruski, Cédric; Reynaud-Delaître, Chantal

    2015-03-01

    Mappings established between life science ontologies require significant efforts to maintain them up to date due to the size and frequent evolution of these ontologies. As a consequence, automatic methods for applying modifications to mappings are in high demand. The accuracy of such methods relies on the available description of the evolution of ontologies, especially regarding concepts involved in mappings. However, from one ontology version to another, a further understanding of ontology changes relevant for supporting mapping adaptation is typically lacking. This research work defines a set of change patterns at the level of concept attributes, and proposes original methods to automatically recognize instances of these patterns based on the similarity between attributes denoting the evolving concepts. This investigation evaluates the benefits of the proposed methods and the influence of the recognized change patterns on the selection of strategies for mapping adaptation. The summary of the findings is as follows: (1) the Precision (>60%) and Recall (>35%) achieved by comparing manually identified change patterns with the automatic ones; (2) a set of potential impacts of recognized change patterns on the way mappings are adapted. We found that the detected correlations cover ∼66% of the mapping adaptation actions with a positive impact; and (3) the influence of the similarity coefficient calculated between concept attributes on the performance of the recognition algorithms. The experimental evaluations conducted with real life science ontologies showed the effectiveness of our approach to accurately characterize ontology evolution at the level of concept attributes. This investigation confirmed the relevance of the proposed change patterns to support decisions on mapping adaptation. Copyright © 2014 Elsevier B.V. All rights reserved.
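
    As a rough illustration of recognizing attribute-level change patterns from similarity scores (the paper's own patterns and thresholds are not reproduced), the Python sketch below compares a concept's label across two ontology versions with a token-set Jaccard coefficient; the pattern names and thresholds are invented.

        # Hedged sketch: classifying how a concept's label changed between two
        # ontology versions using token-set Jaccard similarity.
        # Pattern names and thresholds are illustrative, not those of the paper.
        def jaccard(a, b):
            ta, tb = set(a.lower().split()), set(b.lower().split())
            return len(ta & tb) / len(ta | tb) if ta | tb else 1.0

        def change_pattern(old_label, new_label):
            sim = jaccard(old_label, new_label)
            if sim == 1.0:
                return "unchanged"
            if sim >= 0.5:
                return "label_revision"      # small edit, mapping likely still valid
            return "label_replacement"       # major change, mapping needs review

        print(change_pattern("myocardial infarction", "myocardial infarction"))
        print(change_pattern("myocardial infarction", "acute myocardial infarction"))
        print(change_pattern("myocardial infarction", "heart attack"))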

  17. Two obvious intuitions : Ontology-mapping needs background knowledge and approximation

    Van Harmelen, Frank

    2007-01-01

    Ontology mapping (or: ontology alignment, or integration) is one of the most active areas of Semantic Web research. An increasing number of ontologies have become available in recent years, and if the Semantic Web is to be taken seriously, the problem of ontology mapping must be solved. Numerous

  18. NCBO Ontology Recommender 2.0: an enhanced approach for biomedical ontology recommendation.

    Martínez-Romero, Marcos; Jonquet, Clement; O'Connor, Martin J; Graybeal, John; Pazos, Alejandro; Musen, Mark A

    2017-06-07

    Ontologies and controlled terminologies have become increasingly important in biomedical research. Researchers use ontologies to annotate their data with ontology terms, enabling better data integration and interoperability across disparate datasets. However, the number, variety and complexity of current biomedical ontologies make it cumbersome for researchers to determine which ones to reuse for their specific needs. To overcome this problem, in 2010 the National Center for Biomedical Ontology (NCBO) released the Ontology Recommender, which is a service that receives a biomedical text corpus or a list of keywords and suggests ontologies appropriate for referencing the indicated terms. We developed a new version of the NCBO Ontology Recommender. Called Ontology Recommender 2.0, it uses a novel recommendation approach that evaluates the relevance of an ontology to biomedical text data according to four different criteria: (1) the extent to which the ontology covers the input data; (2) the acceptance of the ontology in the biomedical community; (3) the level of detail of the ontology classes that cover the input data; and (4) the specialization of the ontology to the domain of the input data. Our evaluation shows that the enhanced recommender provides higher quality suggestions than the original approach, providing better coverage of the input data, more detailed information about their concepts, increased specialization for the domain of the input data, and greater acceptance and use in the community. In addition, it provides users with more explanatory information, along with suggestions of not only individual ontologies but also groups of ontologies to use together. It also can be customized to fit the needs of different ontology recommendation scenarios. Ontology Recommender 2.0 suggests relevant ontologies for annotating biomedical text data. It combines the strengths of its predecessor with a range of adjustments and new features that improve its reliability
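
    The recommender's actual scoring functions are not reproduced here; the Python sketch below only illustrates how four per-criterion scores of the kind described above might be combined into a single weighted ranking value, with invented weights and scores.

        # Hedged sketch: combining coverage, acceptance, detail and specialization
        # scores into one ranking value. Weights and scores are invented examples.
        WEIGHTS = {"coverage": 0.55, "acceptance": 0.15, "detail": 0.15, "specialization": 0.15}

        candidate_ontologies = {
            "ONTO_A": {"coverage": 0.80, "acceptance": 0.60, "detail": 0.70, "specialization": 0.50},
            "ONTO_B": {"coverage": 0.65, "acceptance": 0.90, "detail": 0.40, "specialization": 0.30},
        }

        def aggregate(scores):
            # weighted sum over the four criteria
            return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

        ranking = sorted(candidate_ontologies.items(), key=lambda kv: aggregate(kv[1]), reverse=True)
        for name, scores in ranking:
            print(f"{name}: {aggregate(scores):.3f}")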

  19. Australia’s National Health Programs: An Ontological Mapping

    Arkalgud Ramaprasad

    2016-11-01

    Australia has a large number of health program initiatives whose comprehensive assessment will help refine and redefine priorities by highlighting areas of emphasis, under-emphasis, and non-emphasis. The objectives of our research are to: (a) systematically map all the programs onto an ontological framework, and (b) systemically analyse their relative emphases at different levels of granularity. We mapped all the health program initiatives onto an ontology with five dimensions, namely: (a) Policy-scope, (b) Policy-focus, (c) Outcomes, (d) Type of care, and (e) Population served. Each dimension is expanded into a taxonomy of its constituent elements. Each combination of elements from the five dimensions is a possible policy initiative component. There are 30,030 possible components encapsulated in the ontology. It includes, for example: (a) National financial policies on accessibility of preventive care for family, and (b) Local-urban regulatory policies on cost of palliative care for individual-aged. Four of the authors mapped all of Australia’s health programs and initiatives onto the ontology. Visualizations of the data are used to highlight the relative emphases in the program initiatives. The dominant emphasis of the program initiatives is: [National] [educational, personnel-physician, information] policies on [accessibility, quality] of [preventive, wellness] care for the [community]. However, although (a) information is emphasized, technology is not; and (b) accessibility and quality are emphasized, while cost and satisfaction are not. The ontology and the results of the mapping can help systematically reassess and redirect the relative emphases of the programs and initiatives from a systemic perspective.

  20. A priorean approach to time ontologies

    Øhrstrøm, Peter; Schärfe, Henrik

    2004-01-01

    Any non-trivial top-level ontology should take temporal notions into account. The details of how this should be done, however, are frequently debated. In this paper it is argued that "the four grades of tense-logical involvement" suggested by A.N. Prior form a useful framework for discussing how various temporal notions are related in a top-level ontology. Furthermore, a number of modern ontologies are analysed with respect to their incorporation of temporal notions. It is argued that all of them correspond to Prior's first and second grade, and that none of them reflect the views which Prior's third and fourth grade represent. Finally, the paper deals with Prior's ideas on a tensed ontology and it is argued that a logic based on the third grade will be useful in the further development of tensed ontology...

  1. Knowledge Representation in Patient Safety Reporting: An Ontological Approach

    Liang Chen

    2016-10-01

    Purpose: The current development of patient safety reporting systems is criticized for loss of information and low data quality due to the lack of a unified domain knowledge base and text processing functionality. To improve patient safety reporting, the present paper suggests an ontological representation of patient safety knowledge. Design/methodology/approach: We propose a framework for constructing an ontological knowledge base of patient safety. The present paper describes our design, implementation, and evaluation of the ontology at its initial stage. Findings: We describe the design and initial outcomes of the ontology implementation. The evaluation results demonstrate the clinical validity of the ontology by means of a self-developed survey measurement. Research limitations: The proposed ontology was developed and evaluated using a small number of information sources. Presently, US data are used, but they are not essential for the ultimate structure of the ontology. Practical implications: The goal of improving patient safety can be aided through investigating patient safety reports and providing actionable knowledge to clinical practitioners. As such, constructing a domain-specific ontology for patient safety reports serves as a cornerstone in information collection and text mining methods. Originality/value: The use of ontologies provides an abstracted representation of semantic information and enables a wealth of applications in a reporting system. Therefore, constructing such a knowledge base is recognized as a high priority in health care.

  2. Mapping the entangled ontology of science teachers' lived experience

    Daugbjerg, Peer S.; de Freitas, Elizabeth; Valero, Paola

    2015-09-01

    In this paper we investigate how the bodily activity of teaching, along with the embodied aspect of lived experience, relates to science teachers' ways of dealing with bodies as living organisms which are both the subject matter as well as the site or vehicle of learning. More precisely, the following questions are pursued: (1) In what ways do primary science teachers refer to the lived and living body in teaching and learning? (2) In what ways do primary science teachers tap into past experiences in which the body figured prominently in order to teach students about living organisms? We draw on the relational ontology and intra-action of Karen Barad (J Women Cult Soc 28(3): 801, 2003) as she argues for a "relational ontology" that sees a relation as a dynamic flowing entanglement of a matter and meaning. We combine this with the materialist phenomenological studies of embodiment by SungWon Hwang and Wolff-Michael Roth (Scientific and mathematical bodies, Sense Publishers, Rotterdam, 2011), as they address how the teachers and students are present in the classroom with/in their "living and lived bodies". Our aim is to use theoretical insights from these two different but complementary approaches to map the embodiment of teachers' experiences and actions. We build our understanding of experience on the work of John Dewey (Experience and education, Simon & Schuster, New York, 1938) and also Jean Clandinin and Michael Connelly (Handbook of qualitative research, Sage Publications, California, 2000), leading us to propose three dimensions: settings, relations and continuity. This means that bodies and settings are mutually entailed in the present relation, and furthermore that the past as well as the present of these bodies and settings—their continuity—is also part of the present relation. We analyse the entanglement of lived experience and embodied teaching using these three proposed dimensions of experience. Analysing interviews and observations of three Danish

  3. Knowledge Representation in Patient Safety Reporting: An Ontological Approach

    Liang Chen; Yang Gong

    2016-01-01

    Purpose: The current development of patient safety reporting systems is criticized for loss of information and low data quality due to the lack of a unified domain knowledge base and text processing functionality. To improve patient safety reporting, the present paper suggests an ontological representation of patient safety knowledge. Design/methodology/approach: We propose a framework for constructing an ontological knowledge base of patient safety. The present paper describes our desig...

  4. An Ontology for State Analysis: Formalizing the Mapping to SysML

    Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel

    2012-01-01

    State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context it became necessary to provide ontological definitions of the concepts and relations in State Analysis and greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML and approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.

  5. Development of National Map ontologies for organization and orchestration of hydrologic observations

    Lieberman, J. E.

    2014-12-01

    Feature layers in the National Map program (TNM) are a fundamental context for much of the data collection and analysis conducted by the USGS and other governmental and nongovernmental organizations. Their computational usefulness, though, has been constrained by the lack of formal relationships besides superposition between TNM layers, as well as limited means of representing how TNM datasets relate to additional attributes, datasets, and activities. In the field of Geospatial Information Science, there has been a growing recognition of the value of semantic representation and technology for addressing these limitations, particularly in the face of burgeoning information volume and heterogeneity. Fundamental to this approach is the development of formal ontologies for concepts related to that information that can be processed computationally to enhance creation and discovery of new geospatial knowledge. They offer a means of making much of the presently innate knowledge about relationships in and between TNM features accessible for machine processing and distributed computation. A full and comprehensive ontology of all knowledge represented by TNM features is still impractical. The work reported here involves elaboration and integration of a number of small ontology design patterns (ODPs) that represent limited, discrete, but commonly accepted and broadly applicable physical theories for the behavior of TNM features representing surface water bodies and landscape surfaces and the connections between them. These ontology components are validated through use in applications for discovery and aggregation of water science observational data associated with National Hydrography Dataset features, features from the National Elevation Dataset (NED) and Watershed Boundary Dataset (WBD) that constrain water occurrence in the continental US. These applications emphasize workflows which are difficult or impossible to automate using existing data structures. Evaluation of the

  6. Improving software product line using an ontological approach

    An ontology-based approach is proposed following first-order logic (FOL) rules to identify defects, namely dead features and false optional features. The classification of cases for these defects in feature models (FMs) that represent the variability of a software product line (SPL) is defined. The presented approach has been explained with the help of an FM derived ...
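
    The article's FOL rules are not given in this snippet; as a sketch of what detecting a dead feature amounts to, the Python fragment below brute-forces all configurations of a tiny, invented feature model and reports features that occur in no valid configuration.

        # Hedged sketch: brute-force detection of dead features in a tiny feature
        # model. The features and constraints are invented for illustration.
        from itertools import product

        features = ["Root", "Payment", "Card", "Cash", "Fraud_Check"]

        def valid(cfg):
            f = dict(zip(features, cfg))
            return (
                f["Root"]                                  # root is always selected
                and f["Payment"] == f["Root"]              # Payment: mandatory child of Root
                and f["Cash"] == f["Payment"]              # Cash: mandatory child of Payment
                and (not f["Card"] or f["Payment"])        # Card: optional child of Payment
                and (not f["Fraud_Check"] or f["Card"])    # Fraud_Check requires Card
                and (not f["Card"] or not f["Cash"])       # cross-tree: Card excludes Cash
            )

        valid_configs = [cfg for cfg in product([True, False], repeat=len(features)) if valid(cfg)]

        dead = [feat for i, feat in enumerate(features)
                if not any(cfg[i] for cfg in valid_configs)]
        print("dead features:", dead)   # expected: ['Card', 'Fraud_Check']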

  7. Taxonomy-Based Approaches to Quality Assurance of Ontologies

    Michael Halper

    2017-01-01

    Ontologies are important components of health information management systems. As such, the quality of their content is of paramount importance. It has been proven to be practical to develop quality assurance (QA) methodologies based on automated identification of sets of concepts expected to have a higher likelihood of errors. Four kinds of such sets (called QA-sets), organized around the themes of complex and uncommonly modeled concepts, are introduced. A survey of different methodologies based on these QA-sets and the results of applying them to various ontologies are presented. Overall, following these approaches leads to higher QA yields and better utilization of QA personnel. The formulation of additional QA-set methodologies will further enhance the suite of available ontology QA tools.

  8. A topographic feature taxonomy for a U.S. national topographic mapping ontology

    Varanka, Dalia E.

    2013-01-01

    Using legacy feature lists from the U.S. National Topographic Mapping Program of the twentieth century, a taxonomy of features is presented for purposes of developing a national topographic feature ontology for geographic mapping and analysis. After reviewing published taxonomic classifications, six basic classes are suggested: terrain, surface water, ecological regimes, built-up areas, divisions, and events. Aspects of ontology development are suggested as the taxonomy is described.

  9. Method of Automatic Ontology Mapping through Machine Learning and Logic Mining

    王英林

    2004-01-01

    Ontology mapping is the bottleneck of handling conflicts among heterogeneous ontologies and of implementing reconfiguration or interoperability of legacy systems. We propose an ontology mapping method that uses machine learning, type constraints and logic mining techniques. This method is able to find concept correspondences through instances, and the result is optimized by using an error function; it is able to find attribute correspondences between two equivalent concepts, and the mapping accuracy is enhanced by combining instance learning, type constraints and the logic relations that are embedded in instances; moreover, it solves the most common kind of categorization conflicts. We then propose a merging algorithm to generate the shared ontology and a reconfigurable architecture for interoperation based on multiple agents. The legacy systems are encapsulated as information agents to participate in the integration system. Finally we give a simplified case study.
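
    As a sketch of the instance-based part of such a method (the error function, type constraints and logic mining are not reproduced), the Python fragment below scores candidate concept correspondences by the Jaccard overlap of their instance sets; the concepts and instances are invented.

        # Hedged sketch: instance-based concept correspondence scoring with Jaccard
        # overlap. Concepts and instance sets are invented examples.
        ontology_a = {
            "Car":  {"vw_golf", "ford_focus", "tesla_model3"},
            "Bike": {"trek_520", "giant_tcr"},
        }
        ontology_b = {
            "Automobile": {"vw_golf", "ford_focus", "audi_a4"},
            "Bicycle":    {"trek_520"},
        }

        def jaccard(s1, s2):
            return len(s1 & s2) / len(s1 | s2)

        for a_name, a_insts in ontology_a.items():
            best = max(ontology_b.items(), key=lambda kv: jaccard(a_insts, kv[1]))
            print(f"{a_name} -> {best[0]}  (score {jaccard(a_insts, best[1]):.2f})")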

  10. Defining Resilience and Vulnerability Based on Ontology Engineering Approach

    Kumazawa, T.; Matsui, T.; Endo, A.

    2014-12-01

    It is necessary to reflect the concepts of resilience and vulnerability in the assessment framework of "Human-Environmental Security", but it is also difficult to identify the linkage between the two concepts because of the differences between the academic communities that have discussed each of them. The authors have been developing an ontology which deals with the sustainability of social-ecological systems (SESs). Resilience and vulnerability are also concepts in the target world which this ontology covers. Based on this point, this paper aims at explicating the semantic relationship between the concepts of resilience and vulnerability based on an ontology engineering approach. For this purpose, we first examine the definitions of resilience and vulnerability proposed in the existing literature. Second, we incorporate the definitions into the ontology dealing with the sustainability of SESs. Finally, we focus on the "Water-Energy-Food Nexus Index" to assess Human-Environmental Security, and clarify how the concepts of resilience and vulnerability are linked semantically through the concepts included in these index items.

  11. An ontology approach to comparative phenomics in plants

    Oellrich, Anika

    2015-02-25

    Background: Plant phenotype datasets include many different types of data, formats, and terms from specialized vocabularies. Because these datasets were designed for different audiences, they frequently contain language and details tailored to investigators with different research objectives and backgrounds. Although phenotype comparisons across datasets have long been possible on a small scale, comprehensive queries and analyses that span a broad set of reference species, research disciplines, and knowledge domains continue to be severely limited by the absence of a common semantic framework. Results: We developed a workflow to curate and standardize existing phenotype datasets for six plant species, encompassing both model species and crop plants with established genetic resources. Our effort focused on mutant phenotypes associated with genes of known sequence in Arabidopsis thaliana (L.) Heynh. (Arabidopsis), Zea mays L. subsp. mays (maize), Medicago truncatula Gaertn. (barrel medic or Medicago), Oryza sativa L. (rice), Glycine max (L.) Merr. (soybean), and Solanum lycopersicum L. (tomato). We applied the same ontologies, annotation standards, formats, and best practices across all six species, thereby ensuring that the shared dataset could be used for cross-species querying and semantic similarity analyses. Curated phenotypes were first converted into a common format using taxonomically broad ontologies such as the Plant Ontology, Gene Ontology, and Phenotype and Trait Ontology. We then compared ontology-based phenotypic descriptions with an existing classification system for plant phenotypes and evaluated our semantic similarity dataset for its ability to enhance predictions of gene families, protein functions, and shared metabolic pathways that underlie informative plant phenotypes. Conclusions: The use of ontologies, annotation standards, shared formats, and best practices for cross-taxon phenotype data analyses represents a novel approach to plant phenomics

  12. An ontology approach to comparative phenomics in plants

    Oellrich, Anika; Walls, Ramona L; Cannon, Ethalinda KS; Cannon, Steven B; Cooper, Laurel; Gardiner, Jack; Gkoutos, Georgios V; Harper, Lisa; He, Mingze; Hoehndorf, Robert; Jaiswal, Pankaj; Kalberer, Scott R; Lloyd, John P; Meinke, David; Menda, Naama; Moore, Laura; Nelson, Rex T; Pujar, Anuradha; Lawrence, Carolyn J; Huala, Eva

    2015-01-01

    Background: Plant phenotype datasets include many different types of data, formats, and terms from specialized vocabularies. Because these datasets were designed for different audiences, they frequently contain language and details tailored to investigators with different research objectives and backgrounds. Although phenotype comparisons across datasets have long been possible on a small scale, comprehensive queries and analyses that span a broad set of reference species, research disciplines, and knowledge domains continue to be severely limited by the absence of a common semantic framework. Results: We developed a workflow to curate and standardize existing phenotype datasets for six plant species, encompassing both model species and crop plants with established genetic resources. Our effort focused on mutant phenotypes associated with genes of known sequence in Arabidopsis thaliana (L.) Heynh. (Arabidopsis), Zea mays L. subsp. mays (maize), Medicago truncatula Gaertn. (barrel medic or Medicago), Oryza sativa L. (rice), Glycine max (L.) Merr. (soybean), and Solanum lycopersicum L. (tomato). We applied the same ontologies, annotation standards, formats, and best practices across all six species, thereby ensuring that the shared dataset could be used for cross-species querying and semantic similarity analyses. Curated phenotypes were first converted into a common format using taxonomically broad ontologies such as the Plant Ontology, Gene Ontology, and Phenotype and Trait Ontology. We then compared ontology-based phenotypic descriptions with an existing classification system for plant phenotypes and evaluated our semantic similarity dataset for its ability to enhance predictions of gene families, protein functions, and shared metabolic pathways that underlie informative plant phenotypes. Conclusions: The use of ontologies, annotation standards, shared formats, and best practices for cross-taxon phenotype data analyses represents a novel approach to plant phenomics
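
    The study's semantic similarity analyses are far richer than what fits here; as a minimal sketch of the underlying idea (comparing two genes' phenotype annotations after expanding each term to its ontology ancestors), consider the following Python fragment with an invented toy ontology.

        # Hedged sketch: phenotype-based similarity of two genes, computed as Jaccard
        # overlap of their ontology annotations expanded to all ancestor terms.
        # The ontology and annotations are toy examples, not PO/GO/PATO content.
        parents = {                      # child term -> parent terms
            "dwarf plant": {"reduced stature"},
            "reduced stature": {"growth phenotype"},
            "late flowering": {"flowering time phenotype"},
            "flowering time phenotype": {"growth phenotype"},
        }

        def ancestors(term):
            seen = set()
            stack = [term]
            while stack:
                t = stack.pop()
                for p in parents.get(t, set()):
                    if p not in seen:
                        seen.add(p)
                        stack.append(p)
            return seen | {term}

        def similarity(terms1, terms2):
            e1 = set().union(*(ancestors(t) for t in terms1))
            e2 = set().union(*(ancestors(t) for t in terms2))
            return len(e1 & e2) / len(e1 | e2)

        gene_a = {"dwarf plant"}
        gene_b = {"reduced stature", "late flowering"}
        print(f"similarity: {similarity(gene_a, gene_b):.2f}")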

  13. A semantic medical multimedia retrieval approach using ontology information hiding.

    Guo, Kehua; Zhang, Shigeng

    2013-01-01

    Searching for useful information in unstructured medical multimedia data has been a difficult problem in information retrieval. This paper reports an effective semantic medical multimedia retrieval approach which can reflect users' query intent. Firstly, semantic annotations are given to the multimedia documents in the medical multimedia database. Secondly, the ontology that represents the semantic information is hidden in the head of the multimedia documents. The main innovations of this approach are cross-type retrieval support and semantic information preservation. Experimental results indicate a good precision and efficiency of our approach for medical multimedia retrieval in comparison with some traditional approaches.

  14. An ontology-based approach for modelling architectural styles

    Pahl, Claus; Giesecke, Simon; Hasselbring, Wilhelm

    2007-01-01

    The conceptual modelling of software architectures is of central importance for the quality of a software system. A rich modelling language is required to integrate the different aspects of architecture modelling, such as architectural styles, structural and behavioural modelling, into a coherent framework. We propose an ontological approach for architectural style modelling based on description logic as an abstract, meta-level modelling instrument. Architect...

  15. Ontology-based concept map learning path reasoning system using SWRL rules

    Chu, K.-K.; Lee, C.-I. [National Univ. of Tainan, Taiwan (China). Dept. of Computer Science and Information Learning Technology

    2010-08-13

    Concept maps are graphical representations of knowledge. Concept mapping may reduce students' cognitive load and extend simple memory function. The purpose of this study was to diagnose students' concept map learning abilities and to provide personally constructive advice dependent on their learning path and progress. Ontology is a useful method with which to represent and store concept map information. Semantic web rule language (SWRL) rules are easy to understand and to use as specific reasoning services. This paper discussed the selection of a grade 7 lakes and rivers curriculum for which to devise a concept map learning path reasoning service. The paper defined a concept map e-learning ontology and two SWRL semantic rules, and collected users' concept map learning path data to infer implicit knowledge and to recommend the next learning path for users. It was concluded that the designs devised in this study were feasible and advanced, and that the ontology kept the domain knowledge preserved. SWRL rules identified an abstraction model for inferred properties. Since the ontology and the SWRL rules were separate systems, they did not interfere with each other while either was maintained, ensuring persistent system extensibility and robustness. 15 refs., 1 tab., 8 figs.

  16. A Knowledge Engineering Approach to Develop Domain Ontology

    Yun, Hongyan; Xu, Jianliang; Xiong, Jing; Wei, Moji

    2011-01-01

    Ontologies are one of the most popular and widespread means of knowledge representation and reuse. A few research groups have proposed a series of methodologies for developing their own standard ontologies. However, because this ontological construction concerns special fields, there is no standard method to build domain ontology. In this paper,…

  17. Domain XML semantic integration based on extraction rules and ontology mapping

    Huayu LI

    2016-08-01

    A large number of XML documents exist in the petroleum engineering field, but traditional XML integration solutions cannot provide semantic query, which leads to low data use efficiency. In light of the semantic integration and query requirements of WeXML (oil & gas well XML data), this paper proposes a semantic integration method based on extraction rules and ontology mapping. The method first defines a series of extraction rules with which elements and properties of the WeXML Schema are mapped to classes and properties in the WeOWL ontology, respectively; secondly, an algorithm is used to transform WeXML documents into WeOWL instances. Because WeOWL provides limited semantics, ontology mappings between the two ontologies are then built to explain the classes and properties of the global ontology in terms of WeOWL, and semantic query based on a global domain concept model is provided. By constructing a WeXML data semantic integration prototype system, the proposed transformation rules, transformation algorithm and mapping rules are tested.
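
    The WeXML extraction rules themselves are not reproduced; the Python sketch below only illustrates the style of rule-driven transformation (elements to classes, child elements to properties), using a hypothetical well document and rdflib.

        # Hedged sketch: rule-driven transformation of a small XML document into
        # ontology instances. Element names, rules and URIs are invented examples.
        import xml.etree.ElementTree as ET
        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import RDF

        WE = Namespace("http://example.org/weowl#")

        xml_doc = """<wells>
          <well id="W-01"><depth>2350</depth><status>producing</status></well>
        </wells>"""

        rules = {   # extraction rules: element -> class, child element -> property
            "well": {"class": WE.Well, "properties": {"depth": WE.hasDepth, "status": WE.hasStatus}},
        }

        g = Graph()
        root = ET.fromstring(xml_doc)
        for element, rule in rules.items():
            for node in root.iter(element):
                instance = WE[node.get("id")]
                g.add((instance, RDF.type, rule["class"]))
                for child_tag, prop in rule["properties"].items():
                    child = node.find(child_tag)
                    if child is not None and child.text:
                        g.add((instance, prop, Literal(child.text)))

        print(g.serialize(format="turtle"))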

  18. A MAUT APPROACH FOR REUSING DOMAIN ONTOLOGIES ON THE BASIS OF THE NeOn METHODOLOGY

    A. JIMÉNEZ; M. C. SUÁREZ-FIGUEROA; A. MATEOS; A. GÓMEZ-PÉREZ; M. FERNÁNDEZ-LÓPEZ

    2013-01-01

    Knowledge resource reuse is becoming a widespread approach in the ontology engineering field because it can speed up the ontology development process. In this context, the NeOn Methodology specifies some guidelines for reusing different types of knowledge resources (ontologies, nonontological resources, and ontology design patterns). These guidelines prescribe how to perform the different activities involved in any of the diverse types of reuse processes. One such activity is to select the be...
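
    The NeOn guidelines and the paper's specific attributes are not reproduced here; as a sketch of how an additive MAUT model ranks candidate ontologies for reuse, consider the following Python fragment with invented attributes, weights and normalized ratings.

        # Hedged sketch: additive multi-attribute utility over candidate ontologies.
        # Attributes, weights and normalized ratings (0..1) are invented examples;
        # reuse_cost is already inverted so that higher means cheaper to reuse.
        weights = {"domain_coverage": 0.4, "quality": 0.3, "reuse_cost": 0.2, "community_support": 0.1}

        candidates = {
            "OntologyA": {"domain_coverage": 0.9, "quality": 0.6, "reuse_cost": 0.4, "community_support": 0.8},
            "OntologyB": {"domain_coverage": 0.7, "quality": 0.8, "reuse_cost": 0.9, "community_support": 0.5},
        }

        def utility(ratings):
            # additive utility: weighted sum of single-attribute utilities
            return sum(weights[a] * ratings[a] for a in weights)

        for name, ratings in sorted(candidates.items(), key=lambda kv: utility(kv[1]), reverse=True):
            print(f"{name}: {utility(ratings):.2f}")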

  19. An Approach for Composing Services Based on Environment Ontology

    Guangjun Cai

    2013-01-01

    Service-oriented computing is revolutionizing modern computing paradigms with its aim to boost software reuse and enable business agility. Under this paradigm, new services are fabricated by composing available services. The problem arises of how to effectively and efficiently compose heterogeneous services in the face of the high complexity of service composition. Based on environment ontology, this paper introduces a requirement-driven service composition approach. We propose algorithms to decompose the requirement, rules to deduce the relations between services, and an algorithm for composing services. The empirical results and the comparison with other service composition methodologies show that this approach is feasible and efficient.

  20. Buildings classification from airborne LiDAR point clouds through OBIA and ontology driven approach

    Tomljenovic, Ivan; Belgiu, Mariana; Lampoltshammer, Thomas J.

    2013-04-01

    In recent years, airborne Light Detection and Ranging (LiDAR) data proved to be a valuable information resource for a vast number of applications ranging from land cover mapping to individual surface feature extraction from complex urban environments. To extract information from LiDAR data, users apply prior knowledge. Unfortunately, there is no consistent initiative for structuring this knowledge into data models that can be shared and reused across different applications and domains. The absence of such models poses great challenges to data interpretation, data fusion and integration as well as information transferability. The intention of this work is to describe the design, development and deployment of an ontology-based system to classify buildings from airborne LiDAR data. The novelty of this approach consists in the development of a domain ontology that explicitly specifies the knowledge used to extract features from airborne LiDAR data. The overall goal of this approach is to investigate the possibility of classifying features of interest from LiDAR data by means of a domain ontology. The proposed workflow is applied to the building extraction process for the region of "Biberach an der Riss" in South Germany. Strip-adjusted and georeferenced airborne LiDAR data is processed based on geometrical and radiometric signatures stored within the point cloud. Region-growing segmentation algorithms are applied and segmented regions are exported to the GeoJSON format. Subsequently, the data is imported into the ontology-based reasoning process used to automatically classify exported features of interest. Based on the ontology it becomes possible to define domain concepts, associated properties and relations. As a consequence, the resulting specific body of knowledge restricts possible interpretation variants. Moreover, ontologies are machine-processable and thus it is possible to run reasoning on top of them. Available reasoners (FACT++, JESS, Pellet) are used to check

  1. ComTrustO: Composite Trust-Based Ontology Framework for Information and Decision Fusion

    Oltramari, Alessandro

    2015-07-06

    ...presents a methodological approach for ontology management allowing development of extensible ontologies and the mapping from ontologies to... ...ontology-based framework for information fusion, as a support system for human decision makers. In particular, we build upon the concept of composite

  2. OIntEd: online ontology instance editor enabling a new approach to ontology development

    Wibisono, A.; Koning, R.; Grosso, P.; Belloum, A.; Bubak, M.; de Laat, C.

    2013-01-01

    Ontology development involves people with different background knowledge and expertise. It is an elaborate process, where sophisticated tools for experienced knowledge engineers are available. However, domain experts need simple tools that they can use to focus on ontology instantiation. In this

  3. Ontology mapping specification in description logics for cooperative ...

    The rapid development of the Semantic Web is tied to the specification of an ever-increasing number of ontologies. These make it possible to model knowledge agreed upon by communities of people about specific domains or tasks. The same domain described by two distinct communities will ...

  4. Ontology mapping and data discovery for the translational investigator.

    Wynden, Rob; Weiner, Mark G; Sim, Ida; Gabriel, Davera; Casale, Marco; Carini, Simona; Hastings, Shannon; Ervin, David; Tu, Samson; Gennari, John H; Anderson, Nick; Mobed, Ketty; Lakshminarayanan, Prakash; Massary, Maggie; Cucina, Russ J

    2010-03-01

    An integrated data repository (IDR) containing aggregations of clinical, biomedical, economic, administrative, and public health data is a key component of an overall translational research infrastructure. But most available data repositories are designed using standard data warehouse architecture that employs arbitrary data encoding standards, making queries across disparate repositories difficult. In response to these shortcomings we have designed a Health Ontology Mapper (HOM) that translates terminologies into formal data encoding standards without altering the underlying source data. We believe the HOM system promotes inter-institutional data sharing and research collaboration, and will ultimately lower the barrier to developing and using an IDR.

  5. Feature-based Ontology Mapping from an Information Receivers’ Viewpoint

    Glückstad, Fumiko Kano; Mørup, Morten

    2012-01-01

    This paper compares four algorithms for computing feature-based similarities between concepts respectively possessing a distinctive set of features. The eventual purpose of comparing these feature-based similarity algorithms is to identify a candidate term in a Target Language (TL) that can optimally convey the original meaning of a culturally-specific Source Language (SL) concept to a TL audience by aligning two culturally-dependent domain-specific ontologies. The results indicate that the Bayesian Model of Generalization [1] performs best, not only for identifying candidate translation terms...
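
    The four algorithms compared in the paper, including the Bayesian Model of Generalization, are not reproduced here; as a generic illustration of feature-based similarity between concepts, the Python sketch below implements Tversky's ratio model over invented feature sets.

        # Hedged sketch: Tversky's ratio-model similarity between two concepts
        # described by feature sets. Features and parameters are invented examples.
        def tversky(features_a, features_b, alpha=0.5, beta=0.5):
            common = len(features_a & features_b)
            only_a = len(features_a - features_b)
            only_b = len(features_b - features_a)
            return common / (common + alpha * only_a + beta * only_b)

        source_concept = {"fermented", "dairy", "cold", "sour", "breakfast"}   # SL concept
        target_candidates = {
            "yoghurt":   {"fermented", "dairy", "cold", "sour"},
            "ice cream": {"dairy", "cold", "sweet", "dessert"},
        }

        for name, feats in target_candidates.items():
            print(f"{name}: {tversky(source_concept, feats):.2f}")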

  6. An Iterative and Incremental Approach for E-Learning Ontology Engineering

    Sudath Rohitha Heiyanthuduwage

    2009-03-01

    There is a boost in interest in ontology with the developments in Semantic Web technologies. Ontologies play a vital role in the semantic web. Even though a lot of work has been done on ontology, a standard framework for ontology engineering has not yet been defined, and current ontology engineering methodologies still need improvement. The effort of our work is to integrate various methods, techniques and tools into the different stages of a proposed ontology engineering life cycle to create a comprehensive framework for ontology engineering. Current methodologies discuss ontology engineering stages and collaborative environments with user collaboration. However, discussion on increasing effectiveness and correct inference has been given less attention. Moreover, these methodologies provide little discussion on the usability of domain ontologies. We consider these aspects as more important in our work. Also, ontology engineering has been done for various domains and for various purposes. Our effort is to propose an iterative and incremental approach for ontology engineering, especially for the e-learning domain, with the intention of achieving higher usability and effectiveness of e-learning systems. This paper introduces different aspects of the proposed ontology engineering framework and an evaluation of it.

  7. From Brain Maps to Cognitive Ontologies: Informatics and the Search for Mental Structure.

    Poldrack, Russell A; Yarkoni, Tal

    2016-01-01

    A major goal of cognitive neuroscience is to delineate how brain systems give rise to mental function. Here we review the increasingly large role informatics-driven approaches are playing in such efforts. We begin by reviewing a number of challenges conventional neuroimaging approaches face in trying to delineate brain-cognition mappings--for example, the difficulty in establishing the specificity of postulated associations. Next, we demonstrate how these limitations can potentially be overcome using complementary approaches that emphasize large-scale analysis--including meta-analytic methods that synthesize hundreds or thousands of studies at a time; latent-variable approaches that seek to extract structure from data in a bottom-up manner; and predictive modeling approaches capable of quantitatively inferring mental states from patterns of brain activity. We highlight the underappreciated but critical role for formal cognitive ontologies in helping to clarify, refine, and test theories of brain and cognitive function. Finally, we conclude with a speculative discussion of what future informatics developments may hold for cognitive neuroscience.

  8. Going Deeper or Flatter: Connecting Deep Mapping, Flat Ontologies and the Democratizing of Knowledge

    Selina Springett

    2015-10-01

    The concept of “deep mapping”, as an approach to place, has been deployed as both a descriptor of a specific suite of creative works and as a set of aesthetic practices. While its definition has been amorphous and adaptive, a number of distinct, yet related, manifestations identify as, or have been identified by, the term. In recent times, it has garnered attention beyond literary discourse, particularly within the “spatial” turn of representation in the humanities and as a result of expanded platforms of data presentation. This paper takes a brief look at the practice of “deep mapping”, considering it as a consciously performative act and tracing a number of its various manifestations. It explores how deep mapping reflects epistemological trends in ontological practices of connectivity and the “flattening” of knowledge systems, in particular those put forward by poststructural and cultural theorists, such as Bruno Latour, Gilles Deleuze, and Felix Guattari, as well as by theorists who associate with speculative realism. The concept of deep mapping as an aesthetic, methodological, and ideological tool enables an approach to place that democratizes knowledge by crossing temporal, spatial, and disciplinary boundaries.

  9. The Semantic Mapping of Archival Metadata to the CIDOC CRM Ontology

    Bountouri, Lina; Gergatsoulis, Manolis

    2011-01-01

    In this article we analyze the main semantics of archival description, expressed through Encoded Archival Description (EAD). Our main target is to map the semantics of EAD to the CIDOC Conceptual Reference Model (CIDOC CRM) ontology as part of a wider integration architecture of cultural heritage metadata. Through this analysis, it is concluded…

  10. Ontology-Based Approach to Social Data Sentiment Analysis: Detection of Adolescent Depression Signals.

    Jung, Hyesil; Park, Hyeoun-Ae; Song, Tae-Min

    2017-07-24

    Social networking services (SNSs) contain abundant information about the feelings, thoughts, interests, and patterns of behavior of adolescents that can be obtained by analyzing SNS postings. An ontology that expresses the shared concepts and their relationships in a specific field could be used as a semantic framework for social media data analytics. The aim of this study was to refine an adolescent depression ontology and terminology as a framework for analyzing social media data and to evaluate description logics between classes and the applicability of this ontology to sentiment analysis. The domain and scope of the ontology were defined using competency questions. The concepts constituting the ontology and terminology were collected from clinical practice guidelines, the literature, and social media postings on adolescent depression. Class concepts, their hierarchy, and the relationships among class concepts were defined. An internal structure of the ontology was designed using the entity-attribute-value (EAV) triplet data model, and superclasses of the ontology were aligned with the upper ontology. Description logics between classes were evaluated by mapping concepts extracted from the answers to frequently asked questions (FAQs) onto the ontology concepts derived from description logic queries. The applicability of the ontology was validated by examining the representability of 1358 sentiment phrases using the ontology EAV model and conducting sentiment analyses of social media data using ontology class concepts. We developed an adolescent depression ontology that comprised 443 classes and 60 relationships among the classes; the terminology comprised 1682 synonyms of the 443 classes. In the description logics test, no error in relationships between classes was found, and about 89% (55/62) of the concepts cited in the answers to FAQs mapped onto the ontology class. Regarding applicability, the EAV triplet models of the ontology class represented about 91

  11. Understanding semantic mapping evolution by observing changes in biomedical ontologies.

    dos Reis, Julio Cesar; Pruski, Cédric; Da Silveira, Marcos; Reynaud-Delaître, Chantal

    2014-02-01

    Knowledge Organization Systems (KOSs) are extensively used in the biomedical domain to support information sharing between software applications. KOSs are proposed covering different, but overlapping subjects, and mappings indicate the semantic relation between concepts from two KOSs. Over time, KOSs change as do the mappings between them. This can result from a new discovery or a revision of existing knowledge which includes corrections of concepts or mappings. Indeed, changes affecting KOS entities may force the underlying mappings to be updated in order to ensure their reliability over time. To tackle this open research problem, we study how mappings are affected by KOS evolution. This article presents a detailed descriptive analysis of the impact that changes in KOS have on mappings. As a case study, we use the official mappings established between SNOMED CT and ICD-9-CM from 2009 to 2011. Results highlight factors according to which KOS changes in varying degrees influence the evolution of mappings. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Knowledge management of eco-industrial park for efficient energy utilization through ontology-based approach

    Zhang, Chuan; Romagnoli, Alessandro; Zhou, Li; Kraft, Markus

    2017-01-01

    Highlights: •An intelligent energy management system for Eco-Industrial Park (EIP) is proposed. •An explicit domain ontology for EIP energy management is designed. •Ontology-based approach can increase knowledge interoperability within EIP. •Ontology-based approach can allow self-optimization without human intervention in EIP. •The proposed system harbours huge potential in the future scenario of Internet of Things. -- Abstract: An ontology-based approach for Eco-Industrial Park (EIP) knowledge management is proposed in this paper. The designed ontology in this study is a formalized conceptualization of EIP. Based on such an ontological representation, a Knowledge-Based System (KBS) for EIP energy management named J-Park Simulator (JPS) is developed. By applying JPS to the solution of the EIP waste heat utilization problem, the results of this study show that ontology is a powerful tool for knowledge management of complex systems such as EIP. The ontology-based approach can increase knowledge interoperability between different companies in EIP. The ontology-based approach can also allow intelligent decision making by using disparate data from remote databases, which implies the possibility of self-optimization without human intervention in the future scenario of the Internet of Things (IoT). It is shown through this study that the KBS can bridge the communication gaps between different companies in EIP; consequently, more potential Industrial Symbiosis (IS) links can be established to improve the overall energy efficiency of the whole EIP.

  13. A four stage approach for ontology-based health information system design.

    Kuziemsky, Craig E; Lau, Francis

    2010-11-01

    To describe and illustrate a four stage methodological approach to capture user knowledge in a biomedical domain area, use that knowledge to design an ontology, and then implement and evaluate the ontology as a health information system (HIS). A hybrid participatory design-grounded theory (GT-PD) method was used to obtain data and code them for ontology development. Prototyping was used to implement the ontology as a computer-based tool. Usability testing evaluated the computer-based tool. An empirically derived domain ontology and set of three problem-solving approaches were developed as a formalized model of the concepts and categories from the GT coding. The ontology and problem-solving approaches were used to design and implement a HIS that tested favorably in usability testing. The four stage approach illustrated in this paper is useful for designing and implementing an ontology as the basis for a HIS. The approach extends existing ontology development methodologies by providing an empirical basis for theory incorporated into ontology design. Copyright © 2010 Elsevier B.V. All rights reserved.

  14. Ontology Based Model Transformation Infrastructure

    Göknil, Arda; Topaloglu, N.Y.

    2005-01-01

    Using MDA in ontology development has been investigated in several works recently. The mappings and transformations between the UML constructs and the OWL elements to develop ontologies are the main concern of these research projects. We propose another approach in order to achieve the collaboration

  15. A methodological approach for designing a usable ontology-based GUI in healthcare.

    Lasierra, N; Kushniruk, A; Alesanco, A; Borycki, E; García, J

    2013-01-01

    This paper presents a methodological approach to the design and evaluation of an interface for an ontology-based system used for designing care plans for monitoring patients at home. In order to define the care plans, physicians need a tool for creating instances of the ontology and configuring some rules. Our purpose is to develop an interface to allow clinicians to interact with the ontology. Although ontology-driven applications do not necessarily present the ontology in the user interface, it is our hypothesis that showing selected parts of the ontology in a "usable" way could enhance clinicians' understanding and make the definition of the care plans easier. Based on prototyping and iterative testing, this methodology combines visualization techniques and usability methods. Preliminary results obtained after a formative evaluation indicate the effectiveness of the suggested combination.

  16. The use of concept maps during knowledge elicitation in ontology development processes – the nutrigenomics use case

    Taylor Chris

    2006-05-01

    Abstract Background Incorporation of ontologies into annotations has enabled 'semantic integration' of complex data, making explicit the knowledge within a certain field. One of the major bottlenecks in developing bio-ontologies is the lack of a unified methodology. Different methodologies have been proposed for different scenarios, but there is no agreed-upon standard methodology for building ontologies. The involvement of geographically distributed domain experts, the need for domain experts to lead the design process, the application of the ontologies and the life cycles of bio-ontologies are amongst the features not considered by previously proposed methodologies. Results Here, we present a methodology for developing ontologies within the biological domain. We describe our scenario, competency questions, results and milestones for each methodological stage. We introduce the use of concept maps during knowledge acquisition phases as a feasible transition between domain expert and knowledge engineer. Conclusion The contributions of this paper are the thorough description of the steps we suggest when building an ontology, example use of concept maps, consideration of applicability to the development of lower-level ontologies and application to decentralised environments. We have found that within our scenario conceptual maps played an important role in the development process.

  17. Design Ontology – Contrasting an empirical and a theoretical approach

    Ahmed, Saeema; Storga, Mario

    2007-01-01

    This paper presents the result of the research that compares two previous and separate efforts of the authors to develop engineering design ontologies with a longer-term aim to produce a useable and theoretical sound ontology. The research methodology adopted was to examine each of the concepts a...

  18. Web Approach for Ontology-Based Classification, Integration, and Interdisciplinary Usage of Geoscience Metadata

    B Ritschel

    2012-10-01

    The Semantic Web is a W3C approach that integrates the different sources of semantics within documents and services using ontology-based techniques. The main objective of this approach in the geoscience domain is the improvement of understanding, integration, and usage of Earth and space science related web content in terms of data, information, and knowledge for machines and people. The modeling and representation of semantic attributes and relations within and among documents can be realized by human-readable concept maps and machine-readable OWL documents. The objectives for the usage of the Semantic Web approach in the GFZ data center ISDC project are the design of an extended classification of metadata documents for product types related to instruments, platforms, and projects as well as the integration of different types of metadata related to data product providers, users, and data centers. Sources of content and semantics for the description of Earth and space science product types and related classes are standardized metadata documents (e.g., DIF documents), publications, grey literature, and Web pages. Other sources are information provided by users, such as tagging data and social navigation information. The integration of controlled vocabularies as well as folksonomies plays an important role in the design of well-formed ontologies.

  19. DServO: A Peer-to-Peer-based Approach to Biomedical Ontology Repositories.

    Mambone, Zakaria; Savadogo, Mahamadi; Some, Borlli Michel Jonas; Diallo, Gayo

    2015-01-01

    We present in this poster an extension of the ServO ontology server system, which adopts a decentralized Peer-To-Peer approach for managing multiple heterogeneous knowledge organization systems. It relies on the use of the JXTA protocol coupled with information retrieval techniques to provide a decentralized infrastructure for managing multiple instances of ontology repositories.

  20. An ontological approach to describing neurons and their relationships

    David J. Hamilton

    2012-04-01

    The advancement of neuroscience, perhaps the most information-rich discipline of all the life sciences, requires basic frameworks for organizing the vast amounts of data generated by the research community to promote novel insights and integrated understanding. Since Cajal, the neuron remains a fundamental unit of the nervous system, yet even with the explosion of information technology, we still have few comprehensive or systematic strategies for aggregating cell-level knowledge. Progress toward this goal is hampered by the multiplicity of names for cells and by lack of a consensus on the criteria for defining neuron types. However, through umbrella projects like the Neuroscience Information Framework and the International Neuroinformatics Coordinating Facility, we have the opportunity to propose and implement an informatics infrastructure for establishing common tools and approaches to describe neurons through a standard terminology for nerve cells and a database (a Neuron Registry) where these descriptions can be deposited and compared. This article provides an overview of the problem and outlines a solution approach utilizing ontological characterizations.

  1. The prediction of candidate genes for cervix related cancer through gene ontology and graph theoretical approach.

    Hindumathi, V; Kranthi, T; Rao, S B; Manimaran, P

    2014-06-01

    With rapidly changing technology, the prediction of candidate genes has become an indispensable task in recent years, mainly in the field of biological research. The empirical methods for candidate gene prioritization, which help to explore the potential pathways between genetic determinants and complex diseases, are highly cumbersome and labor intensive. In such a scenario, predicting potential targets for a disease state through in silico approaches is of great interest to researchers. The prodigious availability of protein interaction data coupled with gene annotation eases the accurate determination of disease-specific candidate genes. In our work we have prioritized the cervix-related cancer candidate genes by employing the approach of Csaba Ortutay and co-workers, identifying candidate genes through graph theoretical centrality measures and gene ontology. Using the human protein interaction data, cervical cancer gene sets and ontological terms, we were able to predict 15 novel candidates for cervical carcinogenesis. The disease relevance of the anticipated candidate genes was corroborated through a literature survey. The presence of drugs for these candidates was also detected through the Therapeutic Target Database (TTD) and DrugMap Central (DMC), which affirms that they may serve as potential drug targets for cervical cancer.
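
The following Python sketch illustrates the general idea of combining graph-theoretical centrality with Gene Ontology annotations to rank candidate genes; the interaction edges, GO terms, and filtering rule are toy assumptions, not the pipeline used in the study.

```python
# Illustrative sketch (not the authors' pipeline): rank candidate genes in a
# protein-protein interaction network by graph-theoretical centrality, then
# keep those annotated with disease-relevant Gene Ontology terms.
import networkx as nx

# Hypothetical interaction edges and GO annotations.
ppi_edges = [("TP53", "MDM2"), ("TP53", "BRCA1"), ("MDM2", "UBE2D1"), ("BRCA1", "RAD51")]
go_annotations = {
    "TP53": {"GO:0006915"},   # apoptotic process
    "BRCA1": {"GO:0006281"},  # DNA repair
    "RAD51": {"GO:0006281"},
    "MDM2": {"GO:0016567"},
    "UBE2D1": {"GO:0016567"},
}
disease_terms = {"GO:0006915", "GO:0006281"}  # terms assumed enriched in known disease genes

g = nx.Graph(ppi_edges)
centrality = nx.betweenness_centrality(g)

candidates = sorted(
    (gene for gene in g if go_annotations.get(gene, set()) & disease_terms),
    key=lambda gene: centrality[gene],
    reverse=True,
)
print(candidates)  # genes ordered by centrality among the GO-filtered set
```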

  2. ONTOLOGICAL STANDARDIZATION FOR HISTORICAL MAP COLLECTIONS: STUDYING THE GREEK BORDERLINES OF 1881

    E. Gkadolou

    2012-07-01

    Historical maps deliver valuable historical information which is applicable in several domains while they document the spatiotemporal evolution of the geographical entities that are depicted therein. In order to use the historical cartographic information effectively, the maps' semantic documentation becomes a necessity for restoring any semantic ambiguities and structuring the relationship between historical and current geographical space. This paper examines cartographic ontologies as a proposed methodology and presents the first outcomes of the methodology applied for the historical map series «Carte de la nouvelle frontière Turco-Grecque» that sets the borderlines between Greece and Ottoman Empire in 1881. The map entities were modelled and compared to the current ones so as to record the changes in their spatial and thematic attributes and an ontology was developed in Protégé OWL Editor 3.4.4 for the attributes that thoroughly define a historical map and the digitised spatial entities. Special focus was given on the Greek borderline and the changes that it caused to other geographic entities.

  3. Ontological Standardization for Historical Map Collections: Studying the Greek Borderlines of 1881

    Gkadolou, E.; Tomai, E.; Stefanakis, E.; Kritikos, G.

    2012-07-01

    Historical maps deliver valuable historical information which is applicable in several domains while they document the spatiotemporal evolution of the geographical entities that are depicted therein. In order to use the historical cartographic information effectively, the maps' semantic documentation becomes a necessity for restoring any semantic ambiguities and structuring the relationship between historical and current geographical space. This paper examines cartographic ontologies as a proposed methodology and presents the first outcomes of the methodology applied for the historical map series «Carte de la nouvelle frontière Turco-Grecque» that sets the borderlines between Greece and Ottoman Empire in 1881. The map entities were modelled and compared to the current ones so as to record the changes in their spatial and thematic attributes and an ontology was developed in Protégé OWL Editor 3.4.4 for the attributes that thoroughly define a historical map and the digitised spatial entities. Special focus was given on the Greek borderline and the changes that it caused to other geographic entities.

  4. An ontology approach to comparative phenomics in plants

    Plant phenotypes (observable characteristics) are described using many different formats and specialized vocabularies or "ontologies". Similar phenotypes in different species may be given different names. These differences in terms complicate phenotype comparisons across species. This research descr...

  5. Summarizing an Ontology: A "Big Knowledge" Coverage Approach.

    Zheng, Ling; Perl, Yehoshua; Elhanan, Gai; Ochs, Christopher; Geller, James; Halper, Michael

    2017-01-01

    Maintenance and use of a large ontology, consisting of thousands of knowledge assertions, are hampered by its scope and complexity. It is important to provide tools for summarization of ontology content in order to facilitate user "big picture" comprehension. We present a parameterized methodology for the semi-automatic summarization of major topics in an ontology, based on a compact summary of the ontology, called an "aggregate partial-area taxonomy", followed by manual enhancement. An experiment is presented to test the effectiveness of such summarization measured by coverage of a given list of major topics of the corresponding application domain. SNOMED CT's Specimen hierarchy is the test-bed. A domain-expert provided a list of topics that serves as a gold standard. The enhanced results show that the aggregate taxonomy covers most of the domain's main topics.

  6. Translation of overlay models of student knowledge for relative domains based on domain ontology mapping

    Sosnovsky, Sergey; Dolog, Peter; Henze, Nicola

    2007-01-01

    The effectiveness of an adaptive educational system in many respects depends on the precision of modeling assumptions it makes about a student. One of the well-known challenges in student modeling is to adequately assess the initial level of a student's knowledge when s/he starts working with a system. Sometimes potentially helpful data are available as part of a user model from a system used by the student before. The usage of external user modeling information is troublesome because of differences in system architecture, knowledge representation, modeling constraints, etc. In this paper, we argue that the implementation of underlying knowledge models in a sharable format, as domain ontologies - along with application of automatic ontology mapping techniques for model alignment - can help to overcome the "new-user" problem and will greatly widen opportunities for student model translation...

  7. Exploration and implementation of ontology-based cultural relic knowledge map integration platform

    Yang, Weiqiang; Dong, Yiqiang

    2018-05-01

    To help designers to better carry out creative design and improve the ability of searching traditional cultural relic information, the ontology-based knowledge map construction method was explored and an integrated platform for cultural relic knowledge map was developed. First of all, the construction method of the ontology of cultural relics was put forward, and the construction of the knowledge map of cultural relics was completed based on the constructed cultural relic ontology. Then, a personalized semantic retrieval framework for creative design was proposed. Finally, the integrated platform of the knowledge map of cultural relics was designed and realized. The platform was divided into two parts. One was the foreground display system, which was used for designers to search and browse cultural relics. The other was the background management system, which was for cultural experts to manage cultural relics' knowledge. The research results showed that the platform designed could improve the retrieval ability of cultural relic information. To sum up, the platform can provide good support for the designer's creative design.

  8. Assessing the practice of biomedical ontology evaluation: Gaps and opportunities.

    Amith, Muhammad; He, Zhe; Bian, Jiang; Lossio-Ventura, Juan Antonio; Tao, Cui

    2018-04-01

    With the proliferation of heterogeneous health care data in the last three decades, biomedical ontologies and controlled biomedical terminologies play a more and more important role in knowledge representation and management, data integration, natural language processing, as well as decision support for health information systems and biomedical research. Biomedical ontologies and controlled terminologies are intended to assure interoperability. Nevertheless, the quality of biomedical ontologies has hindered their applicability and subsequent adoption in real-world applications. Ontology evaluation is an integral part of ontology development and maintenance. In the biomedicine domain, ontology evaluation is often conducted by third parties as a quality assurance (or auditing) effort that focuses on identifying modeling errors and inconsistencies. In this work, we first organized four categorical schemes of ontology evaluation methods in the existing literature to create an integrated taxonomy. Further, to understand the ontology evaluation practice in the biomedicine domain, we reviewed a sample of 200 ontologies from the National Center for Biomedical Ontology (NCBO) BioPortal-the largest repository for biomedical ontologies-and observed that only 15 of these ontologies have documented evaluation in their corresponding inception papers. We then surveyed the recent quality assurance approaches for biomedical ontologies and their use. We also mapped these quality assurance approaches to the ontology evaluation criteria. It is our anticipation that ontology evaluation and quality assurance approaches will be more widely adopted in the development life cycle of biomedical ontologies. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. An Intelligent Information Retrieval Approach Based on Two Degrees of Uncertainty Fuzzy Ontology

    Maryam Hourali

    2011-01-01

    In spite of the voluminous studies in the field of intelligent retrieval systems, effective retrieval of information has remained an important unsolved problem. Incorporating conceptual knowledge such as ontologies into the information retrieval process has been considered a solution to enhance the quality of results. Furthermore, the conceptual formalism supported by a typical ontology may not be sufficient to represent uncertainty information due to the lack of clear-cut boundaries between concepts of the domains. To tackle this type of problem, one possible solution is to insert fuzzy logic into the ontology construction process. In this article, a novel approach for fuzzy ontology generation with two uncertainty degrees is proposed. Hence, by implementing linguistic variables, the uncertainty level in the domain's concepts (the Software Maintenance Engineering (SME) domain) has been modeled, and ontology relations have consequently been modeled by fuzzy theory. Then, we combined these uncertain models and proposed a new ontology with two degrees of uncertainty, both in concept expression and in relation expression. The generated fuzzy ontology was implemented for the expansion of initial user queries in the SME domain. Experimental results showed that the proposed model has better overall retrieval performance compared to keyword-based or crisp ontology-based retrieval systems.
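
A minimal Python sketch of the underlying idea of attaching two uncertainty degrees, one to concept membership and one to relations, and using the relation strengths for query expansion; the classes, terms, and degrees are invented for illustration and are not the paper's model.

```python
# Minimal sketch of a fuzzy ontology fragment with two kinds of uncertainty:
# a membership degree on concept assertions and a strength on relations.
# Concept names, terms, and degrees are illustrative only.
from dataclasses import dataclass

@dataclass
class FuzzyConceptAssertion:
    individual: str
    concept: str
    degree: float  # how strongly the individual belongs to the concept, in [0, 1]

@dataclass
class FuzzyRelation:
    subject: str
    relation: str
    obj: str
    degree: float  # how strongly the relation holds, in [0, 1]

assertions = [FuzzyConceptAssertion("regression_testing", "MaintenanceActivity", 0.9)]
relations = [FuzzyRelation("regression_testing", "reduces", "defect_risk", 0.7)]

def expand_query(term: str, rels, threshold: float = 0.5):
    """Add related terms whose fuzzy relation strength exceeds a threshold."""
    return [term] + [r.obj for r in rels if r.subject == term and r.degree >= threshold]

print(expand_query("regression_testing", relations))
```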

  10. A robust data-driven approach for gene ontology annotation.

    Li, Yanpeng; Yu, Hong

    2014-01-01

    Gene ontology (GO) and GO annotation are important resources for biological information management and knowledge discovery, but the speed of manual annotation became a major bottleneck of database curation. The BioCreative IV GO annotation task aims to evaluate the performance of systems that automatically assign GO terms to genes based on the narrative sentences in biomedical literature. This article presents our work in this task as well as the experimental results after the competition. For the evidence sentence extraction subtask, we built a binary classifier to identify evidence sentences using a reference distance estimator (RDE), a recently proposed semi-supervised learning method that learns new features from around 10 million unlabeled sentences, achieving an F1 of 19.3% in exact match and 32.5% in relaxed match. In the post-submission experiment, we obtained 22.1% and 35.7% F1 performance by incorporating bigram features in RDE learning. In both development and test sets, the RDE-based method achieved over 20% relative improvement on F1 and AUC performance against classical supervised learning methods, e.g. support vector machine and logistic regression. For the GO term prediction subtask, we developed an information retrieval-based method to retrieve the GO term most relevant to each evidence sentence using a ranking function that combined cosine similarity and the frequency of GO terms in documents, and a filtering method based on high-level GO classes. The best performance of our submitted runs was 7.8% F1 and 22.2% hierarchy F1. We found that the incorporation of frequency information and hierarchy filtering substantially improved the performance. In the post-submission evaluation, we obtained a 10.6% F1 using a simpler setting. Overall, the experimental analysis showed our approaches were robust in both tasks. © The Author(s) 2014. Published by Oxford University Press.
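
The GO term prediction step can be illustrated with a small Python sketch that ranks terms by a mix of cosine similarity and document frequency; the weighting scheme and the toy GO terms are assumptions, not the exact ranking function used in the submitted runs.

```python
# Illustrative ranking that combines cosine similarity between an evidence
# sentence and a GO term name with the term's document frequency.
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    common = set(a) & set(b)
    num = sum(a[t] * b[t] for t in common)
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def rank_go_terms(sentence: str, go_terms: dict, doc_freq: dict, alpha: float = 0.8):
    s_vec = Counter(sentence.lower().split())
    scores = {}
    for go_id, name in go_terms.items():
        sim = cosine(s_vec, Counter(name.lower().split()))
        freq = math.log1p(doc_freq.get(go_id, 0))  # frequency of the term in documents
        scores[go_id] = alpha * sim + (1 - alpha) * freq
    return sorted(scores, key=scores.get, reverse=True)

go_terms = {"GO:0006281": "DNA repair", "GO:0006915": "apoptotic process"}
doc_freq = {"GO:0006281": 120, "GO:0006915": 45}
print(rank_go_terms("the protein is involved in repair of damaged DNA", go_terms, doc_freq))
```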

  11. Ontological Encoding of GeoSciML and INSPIRE geological standard vocabularies and schemas: application to geological mapping

    Lombardo, Vincenzo; Piana, Fabrizio; Mimmo, Dario; Fubelli, Giandomenico; Giardino, Marco

    2016-04-01

    Encoding of geologic knowledge in formal languages is an ambitious task, aiming at the interoperability and organic representation of geological data, and semantic characterization of geologic maps. Initiatives such as GeoScience Markup Language (last version is GeoSciML 4, 2015[1]) and INSPIRE "Data Specification on Geology" (an operative simplification of GeoSciML, last version is 3.0 rc3, 2013[2]), as well as the recent terminological shepherding of the Geoscience Terminology Working Group (GTWG[3]) have been promoting information exchange of the geologic knowledge. There have also been limited attempts to encode the knowledge in a machine-readable format, especially in the lithology domain (see e.g. the CGI_Lithology ontology[4]), but a comprehensive ontological model that connects the several knowledge sources is still lacking. This presentation concerns the "OntoGeonous" initiative, which aims at encoding the geologic knowledge, as expressed through the standard vocabularies, schemas and data models mentioned above, through a number of interlinked computational ontologies, based on the languages of the Semantic Web and the paradigm of Linked Open Data. The initiative proceeds in parallel with a concrete case study, concerning the setting up of a synthetic digital geological map of the Piemonte region (NW Italy), named "GEOPiemonteMap" (developed by the CNR Institute of Geosciences and Earth Resources, CNR IGG, Torino), where the description and classification of GeologicUnits has been supported by the modeling and implementation of the ontologies. We have devised a tripartite ontological model called OntoGeonous that consists of: 1) an ontology of the geologic features (in particular, GeologicUnit, GeomorphologicFeature, and GeologicStructure[5], modeled from the definitions and UML schemata of CGI vocabularies[6], GeoScienceML and INSPIRE, and aligned with the Planetary realm of NASA SWEET ontology[7]), 2) an ontology of the Earth materials (as defined by the

  12. Adaptation of the MapMan ontology to biotic stress responses: application in solanaceous species

    Stitt Mark

    2007-09-01

    Abstract Background The results of transcriptome microarray analysis are usually presented as a list of differentially expressed genes. As these lists can be long, it is hard to interpret the desired experimental treatment effect on the physiology of analysed tissue, e.g. via selected metabolic or other pathways. For some organisms, gene ontologies and data visualization software have been implemented to overcome this problem, whereas for others, software adaptation is yet to be done. Results We present the classification of tentative potato contigs from the potato gene index (StGI) available from the Dana-Farber Cancer Institute (DFCI) into the MapMan ontology to enable the application of the MapMan family of tools to potato microarrays. Special attention has been focused on mapping genes that could not be annotated based on similarity to Arabidopsis genes alone, thus possibly representing genes unique for potato. 97 such genes were classified into functional BINs (i.e. functional classes) after manual annotation. A new pathway, focusing on biotic stress responses, has been added and can be used for all other organisms for which mappings have been done. The BIN representation on the potato 10 k cDNA microarray, in comparison with all putative potato gene sequences, has been tested. The functionality of the prepared potato mapping was validated with experimental data on plant response to viral infection. In total 43,408 unigenes were mapped into 35 corresponding BINs. Conclusion The potato mappings can be used to visualize up-to-date, publicly available, expressed sequence tags (ESTs) and other sequences from GenBank, in combination with metabolic pathways. Further expert work on potato annotations will be needed with the ongoing EST and genome sequencing of potato. The current MapMan application for potato is directly applicable for analysis of data obtained on potato 10 k cDNA microarray by TIGR (The Institute for Genomic Research) but can also be used

  13. THE PRINCIPLES AND METHODS OF INFORMATION AND EDUCATIONAL SPACE SEMANTIC STRUCTURING BASED ON ONTOLOGIC APPROACH REALIZATION

    Yurij F. Telnov

    2014-01-01

    This article presents principles for the semantic structuring of an information and educational space of knowledge objects and scientific and educational services using ontological engineering methods. The novelty of the proposed approach is the interfacing of a content ontology with an ontology of scientific and educational services, which allows effective composition of services and knowledge objects according to models of professional competences and the requirements of trainees. As a result of applying these methods of semantic structuring, educational institutions can make integrated use of diverse distributed scientific and educational content for carrying out scientific research, methodological development and training.

  14. Database Concepts in a Domain Ontology

    Gorskis Henrihs

    2017-12-01

    There are multiple approaches for mapping from a domain ontology to a database in the task of ontology-based data access. For that purpose, external mapping documents are most commonly used. These documents describe how the data necessary for the description of ontology individuals and other values are to be obtained from the database. The present paper investigates the use of special database concepts. These concepts are not separated from the domain ontology; they are mixed with domain concepts to form a combined application ontology. By creating natural relationships between database concepts and domain concepts, mapping can be implemented more easily and with a specific purpose. The paper also investigates how the use of such database concepts in addition to domain concepts impacts ontology building and data retrieval.
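
As a hedged sketch of the general ontology-based data access idea, the Python snippet below populates ontology individuals from database rows, tagging each individual with both a domain concept and a database concept; the table, vocabulary, and property names are hypothetical, and production systems typically rely on mapping languages such as R2RML rather than ad hoc scripts.

```python
# Populate ontology individuals from database rows; table, columns, and
# class/property names are hypothetical.
import sqlite3
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/onto#")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patient (id INTEGER, name TEXT)")
conn.execute("INSERT INTO patient VALUES (1, 'Alice'), (2, 'Bob')")

g = Graph()
g.bind("ex", EX)
for pid, name in conn.execute("SELECT id, name FROM patient"):
    individual = EX[f"patient_{pid}"]
    g.add((individual, RDF.type, EX.Patient))               # domain concept
    g.add((individual, EX.fromTable, Literal("patient")))   # database concept
    g.add((individual, EX.hasName, Literal(name)))

print(g.serialize(format="turtle"))
```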

  15. A novel ontology approach to support design for reliability considering environmental effects.

    Sun, Bo; Li, Yu; Ye, Tianyuan; Ren, Yi

    2015-01-01

    Environmental effects are not considered sufficiently in product design. Reliability problems caused by environmental effects are very prominent. This paper proposes a method to apply the ontology approach in product design. During product reliability design and analysis, the reuse of environmental effects knowledge is achieved. First, the relationship between environmental effects and product reliability is analyzed. Then an environmental effects ontology to describe environmental effects domain knowledge is designed. Related concepts of environmental effects are formally defined by using the ontology approach. This model can be applied to arrange environmental effects knowledge in different environments. Finally, rubber seals used in a subhumid acid rain environment are taken as an example to illustrate the application of the ontological model to reliability design and analysis.

  16. Leave-two-out stability of ontology learning algorithm

    Wu, Jianzhang; Yu, Xiao; Zhu, Linli; Gao, Wei

    2016-01-01

    Ontology is a semantic analysis and calculation model, which has been applied to many subjects. Ontology similarity calculation and ontology mapping are employed as machine learning approaches. The purpose of this paper is to study the leave-two-out stability of ontology learning algorithms. Several leave-two-out stabilities are defined in the ontology learning setting and the relationships among these stabilities are presented. Furthermore, the results reveal that leave-two-out stability is a necessary and sufficient condition for the ontology learning algorithm.
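
For orientation only, a generic uniform leave-two-out stability condition can be written as below; the notation is standard in algorithmic stability analysis, and the paper's own definitions may differ in detail.

```latex
% Hedged sketch of a uniform leave-two-out stability bound; generic notation,
% not the paper's exact definition.
% A_S : hypothesis learned from sample S = {z_1, ..., z_n}
% S^{\setminus i,j} : S with the i-th and j-th examples removed
% \ell : loss function, \beta_n : stability rate
\[
  \forall\, i \neq j,\ \forall\, z:\quad
  \bigl|\, \ell(A_S, z) - \ell(A_{S^{\setminus i,j}}, z) \,\bigr| \;\le\; \beta_n ,
  \qquad \beta_n \to 0 \ \text{as}\ n \to \infty .
\]
```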

  17. Building spatio-temporal database model based on ontological approach using relational database environment

    Mahmood, N.; Burney, S.M.A.

    2017-01-01

    Everything in this world is bounded by space and time. Our daily life activities are closely linked and related to other objects in our vicinity. Therefore, a strong relationship exists between our activities and our current location, time (past, present and future), and the events through which we move as objects. Ontology development and its integration with databases are vital for a true understanding of complex systems involving both spatial and temporal dimensions. In this paper we propose a conceptual framework for building a spatio-temporal database model based on an ontological approach. We have used the relational data model for modelling spatio-temporal data content, and we present our methodology with its spatio-temporal ontological aspects and their transformation into a spatio-temporal database model. We illustrate the implementation of our conceptual model through a case study of a cultivated land parcel used for agriculture, exhibiting the spatio-temporal behaviour of agricultural land and related entities. Moreover, it provides a generic approach for designing spatio-temporal databases based on ontology. The proposed model is capable of capturing the ontological and, to some extent, epistemological commitments, and of building a spatio-temporal ontology and transforming it into a spatio-temporal data model. Finally, we highlight existing and future research challenges. (author)

  18. An Approach to Formalizing Ontology Driven Semantic Integration: Concepts, Dimensions and Framework

    Gao, Wenlong

    2012-01-01

    The ontology approach has been accepted as a very promising approach to semantic integration today. However, because of the diversity of focuses and its various connections to other research domains, the core concepts, theoretical and technical approaches, and research areas of this domain still remain unclear. Such ambiguity makes it difficult to…

  19. A multi-ontology approach to annotate scientific documents based on a modularization technique.

    Gomes, Priscilla Corrêa E Castro; Moura, Ana Maria de Carvalho; Cavalcanti, Maria Cláudia

    2015-12-01

    Scientific text annotation has become an important task for biomedical scientists. Nowadays, there is an increasing need for the development of intelligent systems to support new scientific findings. Public databases available on the Web provide useful data, but much more useful information is only accessible in scientific texts. Text annotation may help as it relies on the use of ontologies to maintain annotations based on a uniform vocabulary. However, it is difficult to use an ontology, especially those that cover a large domain. In addition, since scientific texts explore multiple domains, which are covered by distinct ontologies, it becomes even more difficult to deal with such a task. Moreover, there are dozens of ontologies in the biomedical area, and they are usually big in terms of the number of concepts. It is in this context that ontology modularization can be useful. This work presents an approach to annotate scientific documents using modules of different ontologies, which are built according to a module extraction technique. The main idea is to analyze a set of single-ontology annotations on a text to find out the user interests. Based on these annotations a set of modules are extracted from a set of distinct ontologies, and are made available for the user, for complementary annotation. The reduced size and focus of the extracted modules tend to facilitate the annotation task. An experiment was conducted to evaluate this approach, with the participation of a bioinformatics specialist of the Laboratory of Peptides and Proteins of the IOC/Fiocruz, who was interested in discovering new drug targets aimed at combating tropical diseases. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. An approach to development of ontological knowledge base in the field of scientific and research activity in Russia

    Murtazina, M. Sh; Avdeenko, T. V.

    2018-05-01

    The state of the art and the progress in the application of semantic technologies in the field of scientific and research activity have been analyzed. Even an elementary empirical comparison has shown that semantic search engines are superior in all respects to conventional search technologies. However, semantic information technologies are insufficiently used in the field of scientific and research activity in Russia. In the present paper an approach to the construction of an ontological knowledge base model is proposed. The ontological model is based on an upper-level ontology and the RDF mechanism for linking several domain ontologies. The ontological model is implemented in the Protégé environment.

  1. Approaching the axiomatic enrichment of the Gene Ontology from a lexical perspective.

    Quesada-Martínez, Manuel; Mikroyannidi, Eleni; Fernández-Breis, Jesualdo Tomás; Stevens, Robert

    2015-09-01

    The main goal of this work is to measure how lexical regularities in biomedical ontology labels can be used for the automatic creation of formal relationships between classes, and to evaluate the results of applying our approach to the Gene Ontology (GO). In recent years, we have developed a method for the lexical analysis of regularities in biomedical ontology labels, and we showed that the labels can present a high degree of regularity. In this work, we extend our method with a cross-products extension (CPE) metric, which estimates the potential interest of a specific regularity for axiomatic enrichment in the lexical analysis, using information on exact matches in external ontologies. The GO consortium recently enriched the GO by using so-called cross-product extensions. Cross-products are generated by establishing axioms that relate a given GO class with classes from the GO or other biomedical ontologies. We apply our method to the GO and study how its lexical analysis can identify and reconstruct the cross-products that are defined by the GO consortium. The labels of the GO classes are highly regular in lexical terms, and the exact matches with labels of external ontologies affect 80% of the GO classes. The CPE metric reveals that 31.48% of the classes that exhibit regularities have fragments that are classes in the two external ontologies selected for our experiment, namely, the Cell Ontology and the Chemical Entities of Biological Interest ontology, and 18.90% of them are fully decomposable into smaller parts. Our results show that the CPE metric permits our method to detect GO cross-product extensions with a mean recall of 62% and a mean precision of 28%. The study is completed with an analysis of false positives to explain this precision value. We think that our results support the claim that our lexical approach can contribute to the axiomatic enrichment of biomedical ontologies and that it can provide new insights into the engineering of

  2. ONTOLOGY MAPPING IN THE RESILIENCE STUDY: THE ORGANIZATIONAL PERSPECTIVE FOR EUROPEAN UNION CASE

    Tiberiu-Tudor SALANŢIU

    2017-12-01

    Ontology mapping in resilience surveillance at the organizational level can be used in the analysis of the association between idiosyncrasies and structural adaptability. Starting from data on the economic trends of European Union members from 2014 to 2016, the aim of the research is to analyse European Union resilience through the interpretation of the link between members' behaviour and structural convergence. The positioning of members in the European Union was analysed after clustering the twenty-eight member states. Two different structures are included in the analysis for the studied period: a structure which incorporates just the member states, and another which also takes into account the eurozone blue chips. In order to analyse the members' relations within the structure, a gravity model has been developed; the results obtained for each pair of member states are contained in a skew matrix. The values are interpreted through a knowledge base to highlight the degree of European Union resilience.

  3. Ontology-aided annotation, visualization and generalization of geological time-scale information from online geological map services

    Ma, X.; Carranza, E.J.M.; Wu, C.; Meer, F.D. van der

    2012-01-01

    Geological maps are increasingly published and shared online, whereas tools and services supporting information retrieval and knowledge discovery are underdeveloped. In this study, we developed an ontology of geological time scale by using an RDF (Resource Description Framework) model to represent

  4. Ontology-aided annotation, visualization and generalization of geological time scale information from online geological map services

    Ma, Marshal; Ma, X.; Carranza, E.J.M; Wu, C.; van der Meer, F.D.

    2012-01-01

    Geological maps are increasingly published and shared online, whereas tools and services supporting information retrieval and knowledge discovery are underdeveloped. In this study, we developed an ontology of geological time scale by using a Resource Description Framework model to represent the
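
A minimal Python/rdflib sketch of representing geological time-scale intervals in RDF, in the spirit of the two studies above; the namespace, property names, and example ages are illustrative assumptions rather than the published ontology.

```python
# Represent geological time-scale intervals in RDF; vocabulary URIs and
# property names are illustrative, not the published ontology.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

GTS = Namespace("http://example.org/geotime#")

g = Graph()
g.bind("gts", GTS)

# A period and one of its epochs, with part-of structure and boundary ages (in Ma).
g.add((GTS.Jurassic, RDF.type, GTS.GeochronologicEra))
g.add((GTS.EarlyJurassic, RDF.type, GTS.GeochronologicEra))
g.add((GTS.EarlyJurassic, GTS.partOf, GTS.Jurassic))
g.add((GTS.EarlyJurassic, GTS.hasBeginningAge, Literal(201.3)))
g.add((GTS.EarlyJurassic, GTS.hasEndAge, Literal(174.1)))
g.add((GTS.EarlyJurassic, RDFS.label, Literal("Early Jurassic", lang="en")))

# Query: which eras are part of the Jurassic?
for era in g.subjects(GTS.partOf, GTS.Jurassic):
    print(era)
```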

  5. An Ontology-centered Approach for Designing an Interactive Competence Management System for IT Companies

    Stefan TRAUSAN-MATU

    2009-01-01

    The paper presents a generic framework for an intelligent information system of competence management based on ontologies for information technology companies. In a first step it will be applied in an information technology (IT) small enterprise and then its applicability will be verified for other organizations of the same type. The work presented in the paper is performed under the project "CONTO – Ontology-based Competencies Management in Information Technology" funded by the Romanian Ministry of Education and Research, involving two universities, a research institute and an IT private company. A competence management system (CMS), in our vision, has to achieve three functions: (a) to support the complete and systematic acquisition of knowledge about the competence of the members of an enterprise; (b) to provide the knowledge about competences and their owners; (c) to apply the available knowledge to serve a purpose. The core of the competence management information system is an ontology that plays the role of the declarative knowledge repository containing the basic concepts (such as company-job, competence, domain, group, person, etc.) and their relationships with other concepts, instances and properties. The Protégé environment was used for the development of this ontology. The structure of the ontology is conceived so that description logics can be used to represent the concept definitions of the application domain in a structured and formally well-understood way. Knowledge acquisition is performed in our approach by enriching the ontology, according to the requirements of the IT company. An advantage of using an ontology-based system is the possibility of the identification of new relations among concepts based on inferences starting from the existing knowledge. The user can choose to query instances of one type of concept. The paper also presents some use-cases.

  6. Facet Theory and the Mapping Sentence As Hermeneutically Consistent Structured Meta-Ontology and Structured Meta-Mereology

    Hackett, Paul M. W.

    2016-01-01

    When behavior is interpreted in a reliable manner (i.e., robustly across different situations and times) its explained meaning may be seen to possess hermeneutic consistency. In this essay I present an evaluation of the hermeneutic consistency that I propose may be present when the research tool known as the mapping sentence is used to create generic structural ontologies. I also claim that theoretical and empirical validity is a likely result of employing the mapping sentence in research design and interpretation. These claims are non-contentious within the realm of quantitative psychological and behavioral research. However, I extend the scope of both facet theory based research and claims for its structural utility, reliability and validity to philosophical and qualitative investigations. I assert that the hermeneutic consistency of a structural ontology is a product of a structural representation's ontological components and the mereological relationships between these ontological sub-units: the mapping sentence seminally allows for the depiction of such structure. PMID:27065932

  7. Axiomatic Ontology Learning Approaches for English Translation of the Meaning of Quranic Texts

    Saad Saidah

    2017-01-01

    Ontology learning (OL) is the computational task of generating a knowledge base in the form of an ontology, given an unstructured corpus in natural language (NL). While most works in the field of ontology learning have been primarily based on a statistical approach to extract lightweight OL, very few attempts have been made to extract axiomatic OL (called heavyweight OL) from NL text documents. Axiomatic OL supports more precise formal logic-based reasoning when compared to lightweight OL. Lexico-syntactic pattern matching and statistical approaches alone cannot lead to very accurate learning, mostly because of several linguistic nuances in the NL. Axiomatic OL is an alternative methodology that has not been explored much, where a deep linguistic analysis in computational linguistics is used to generate formal axioms and definitions instead of simply inducing a taxonomy. The ontology that is created not only stores information about the application domain as explicit knowledge, but can also deduce implicit knowledge from this ontology. This research explores the English translation of the meaning of Quranic texts.

  8. From Participatory Design and Ontological Ethics, Towards an Approach to Constructive Ethics

    Hansen, Sandra Burri Gram; Ryberg, Thomas

    2015-01-01

    This paper explores, analyses and discusses the potential of applying Danish theologian and philosopher K.E. Løgstrup’s ontological approach to ethics when planning and conducting participatory design activities. By doing so, ethical considerations will transform from being a summative evaluation...

  9. FIDELITY TOWARDS FORMS: AN ONTOLOGICAL APPROACH – PART II

    ANA BAZAC

    2015-05-01

    To a common attitude towards forms – as being something non-important, superficial, “formal” – the paper opposes Plato and Aristotle’s philosophy, according to which things exist because of forms. From the inquiry into their logic, which mixes the epistemological and the ontological standpoints, the analysis moves on to the problem of understanding forms as events: as mirrors of the manner in which we see the world/as mirrors of the way of thinking. I contrast the event to the situation – in Alain Badiou’s manner – and I show that there is a logic of continuity between Aristotle’s insistence on the concrete face of form (σύνολον) and Badiou’s concept of fidelity: because this concept always relates to the concrete, which deserves fidelity. The value of things we support gives their “forms”. If so, fidelity towards forms is something more complete and suggestive than following essences: forms are as important as essences; this is obvious when the forms change but the essence does not; in fact, it is not a real change. The real change is when the form changes, bringing also a change of the essence.

  10. FIDELITY TOWARDS FORMS: AN ONTOLOGICAL APPROACH – PART I

    ANA BAZAC

    2014-11-01

    To a common attitude towards forms – as being something non-important, superficial, “formal” – the paper opposes Plato and Aristotle’s philosophy, according to which things exist because of forms. From the inquiry into their logic, which mixes the epistemological and the ontological standpoints, the analysis moves on to the problem of understanding forms as events: as mirrors of the manner in which we see the world/as mirrors of the way of thinking. I contrast the event to the situation – in Alain Badiou’s manner – and I show that there is a logic of continuity between Aristotle’s insistence on the concrete face of form (σύνολον) and Badiou’s concept of fidelity: because this concept always relates to the concrete, which deserves fidelity. The value of things we support gives their “forms”. If so, fidelity towards forms is something more complete and suggestive than following essences: forms are as important as essences; this is obvious when the forms change but the essence does not; in fact, it is not a real change. The real change is when the form changes, bringing also a change of the essence.

  11. NeuroLOG: sharing neuroimaging data using an ontology-based federated approach.

    Gibaud, Bernard; Kassel, Gilles; Dojat, Michel; Batrancourt, Bénédicte; Michel, Franck; Gaignard, Alban; Montagnat, Johan

    2011-01-01

    This paper describes the design of the NeuroLOG middleware data management layer, which provides a platform to share heterogeneous and distributed neuroimaging data using a federated approach. The semantics of shared information is captured through a multi-layer application ontology and a derived Federated Schema used to align the heterogeneous database schemata from different legacy repositories. The system also provides a facility to translate the relational data into a semantic representation that can be queried using a semantic search engine, thus enabling the exploitation of knowledge embedded in the ontology. This work shows the relevance of the distributed approach for neuroscience data management. Although more complex than a centralized approach, it is also more realistic when considering the federation of large data sets, and it opens strong perspectives for implementing multi-centric neuroscience studies.

  12. An Ontology-supported Approach for Automatic Chaining of Web Services in Geospatial Knowledge Discovery

    di, L.; Yue, P.; Yang, W.; Yu, G.

    2006-12-01

    Recent developments in the geospatial Semantic Web have shown promise for automatic discovery, access, and use of geospatial Web services to quickly and efficiently solve particular application problems. With the semantic Web technology, it is highly feasible to construct intelligent geospatial knowledge systems that can provide answers to many geospatial application questions. A key challenge in constructing such an intelligent knowledge system is to automate the creation of a chain or process workflow that involves multiple services and highly diversified data and can generate the answer to a specific question of users. This presentation discusses an approach for automating composition of geospatial Web service chains by employing geospatial semantics described by geospatial ontologies. It shows how ontology-based geospatial semantics are used for enabling the automatic discovery, mediation, and chaining of geospatial Web services. OWL-S is used to represent the geospatial semantics of individual Web services, the type of service each belongs to, and the type of data it can handle. The hierarchy and classification of service types are described in the service ontology. The hierarchy and classification of data types are presented in the data ontology. For answering users' geospatial questions, an Artificial Intelligence (AI) planning algorithm is used to construct the service chain by using the service and data logics expressed in the ontologies. The chain can be expressed as a graph with nodes representing services and connection weights representing degrees of semantic matching between nodes. The graph is a visual representation of the logical geo-processing path for answering users' questions. The graph can be instantiated to a physical service workflow for execution to generate the answer to a user's question. A prototype system, which includes real world geospatial applications, is implemented to demonstrate the concept and approach.
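
The chaining idea can be sketched in Python as a weighted graph search over services and data types; the service names, data types, and the toy semantic matcher below are assumptions, not the prototype's OWL-S-based implementation.

```python
# Build a graph whose nodes are data types and services, weight edges by how
# well a service's input matches an available type, and search for a chain that
# yields the requested product. Names and the matcher are hypothetical.
import networkx as nx

services = {
    "ReprojectDEM":   {"in": "RawDEM",       "out": "ProjectedDEM"},
    "ComputeSlope":   {"in": "ProjectedDEM", "out": "SlopeMap"},
    "ClassifyHazard": {"in": "SlopeMap",     "out": "LandslideRisk"},
}
data_types = {"RawDEM", "ProjectedDEM", "SlopeMap", "LandslideRisk"}

def semantic_match(provided: str, required: str) -> float:
    """Toy matcher; a real system would reason over the data-type ontology."""
    return 1.0 if provided == required else 0.0

g = nx.DiGraph()
for name, sig in services.items():
    for dtype in data_types:
        score = semantic_match(dtype, sig["in"])
        if score > 0:
            # Lower weight means a stronger semantic match between data and service input.
            g.add_edge(dtype, name, weight=1.0 - score + 0.01)
    g.add_edge(name, sig["out"], weight=0.01)  # service produces its output type

chain = nx.shortest_path(g, source="RawDEM", target="LandslideRisk", weight="weight")
print(chain)  # e.g. ['RawDEM', 'ReprojectDEM', 'ProjectedDEM', 'ComputeSlope', ...]
```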

  13. THE CIDOC CRM GAME: A Serious Game Approach to Ontology Learning

    Guillem, A.; Bruseker, G.

    2017-08-01

    Formal ontologies such as CIDOC CRM (Conceptual Reference Model) form part of the central strategy for the medium- and long-term integration of cultural heritage data to allow for its greater valorization and dissemination. Despite this, uptake of CIDOC CRM at the ground level of Cultural Heritage (CH) practice is limited. Part of the reason behind this lack of uptake lies in the fact that ontologies are considered too complicated and abstract for application in real-life scenarios. This paper presents the rationale behind and the design of a CIDOC CRM game, the intent of which is to provide a learning mechanism to allow learners of wide backgrounds and interests to approach CIDOC CRM in a hands-on and interactive fashion. The CIDOC CRM game consists of decks of cards and game boards that allow players to engage with the concepts of a formal ontology in relation to real data in an entertaining and informative way. It is argued that the CIDOC CRM Game can form an important part of introducing the basic elements of formal ontology and this standard to a wider audience in order to aid wider understanding and adoption of the same.

  14. Formalized Conflicts Detection Based on the Analysis of Multiple Emails: An Approach Combining Statistics and Ontologies

    Zakaria, Chahnez; Curé, Olivier; Salzano, Gabriella; Smaïli, Kamel

    In Computer Supported Cooperative Work (CSCW), it is crucial for project leaders to detect conflicting situations as early as possible. Generally, this task is performed manually by studying a set of documents exchanged between team members. In this paper, we propose a full-fledged automatic solution that identifies documents, subjects and actors involved in relational conflicts. Our approach detects conflicts in emails, probably the most popular type of document in CSCW, but the methods used can handle other text-based documents. These methods rely on the combination of statistical and ontological operations. The proposed solution is decomposed into several steps: (i) we enrich a simple negative emotion ontology with terms occurring in the corpus of emails, (ii) we categorize each conflicting email according to the concepts of this ontology and (iii) we identify emails, subjects and team members involved in conflicting emails using possibilistic description logic and a set of proposed measures. Each of these steps is evaluated and validated on concrete examples. Moreover, this approach's framework is generic and can be easily adapted to domains other than conflicts, e.g. security issues, and extended with operations making use of our proposed set of measures.
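
A simplified Python sketch of steps (i)–(ii): match email text against concepts of a negative-emotion lexicon and aggregate hits per author and subject; the terms, concepts, and scoring are illustrative and omit the possibilistic description logic used in the paper.

```python
# Flag emails whose terms match concepts of a negative-emotion lexicon and
# aggregate the hits per (author, subject). Lexicon and scoring are illustrative.
from collections import defaultdict

negative_emotion_lexicon = {
    "frustrated": "Frustration",
    "unacceptable": "Anger",
    "blame": "Hostility",
}

def detect_conflicts(emails):
    """emails: list of dicts with 'author', 'subject', 'body' keys."""
    hits = defaultdict(int)
    for mail in emails:
        concepts = {c for w, c in negative_emotion_lexicon.items() if w in mail["body"].lower()}
        if concepts:
            hits[(mail["author"], mail["subject"])] += len(concepts)
    return dict(hits)

emails = [
    {"author": "alice", "subject": "deadline", "body": "This delay is unacceptable and I am frustrated."},
    {"author": "bob", "subject": "deadline", "body": "Agreed, let's replan the milestones."},
]
print(detect_conflicts(emails))  # {('alice', 'deadline'): 2}
```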

  15. Advancing data reuse in phyloinformatics using an ontology-driven Semantic Web approach.

    Panahiazar, Maryam; Sheth, Amit P; Ranabahu, Ajith; Vos, Rutger A; Leebens-Mack, Jim

    2013-01-01

    Phylogenetic analyses can resolve historical relationships among genes, organisms or higher taxa. Understanding such relationships can elucidate a wide range of biological phenomena, including, for example, the importance of gene and genome duplications in the evolution of gene function, the role of adaptation as a driver of diversification, or the evolutionary consequences of biogeographic shifts. Phyloinformaticists are developing data standards, databases and communication protocols (e.g. Application Programming Interfaces, APIs) to extend the accessibility of gene trees, species trees, and the metadata necessary to interpret these trees, thus enabling researchers across the life sciences to reuse phylogenetic knowledge. Specifically, Semantic Web technologies are being developed to make phylogenetic knowledge interpretable by web agents, thereby enabling intelligently automated, high-throughput reuse of results generated by phylogenetic research. This manuscript describes an ontology-driven, semantic problem-solving environment for phylogenetic analyses and introduces artefacts that can promote phyloinformatic efforts to promote accessibility of trees and underlying metadata. PhylOnt is an extensible ontology with concepts describing tree types and tree building methodologies including estimation methods, models and programs. In addition we present the PhylAnt platform for annotating scientific articles and NeXML files with PhylOnt concepts. The novelty of this work is the annotation of NeXML files and phylogenetic related documents with PhylOnt Ontology. This approach advances data reuse in phyloinformatics.

  16. A Method for Building Personalized Ontology Summaries

    Queiroz-Sousa, Paulo Orlando; Salgado, Ana Carolina; Pires, Carlos Eduardo

    2013-01-01

    In the context of ontology engineering, ontology understanding is the basis for further development and reuse. One intuitive, effective approach to support ontology understanding is the process of ontology summarization, which highlights the most important concepts of an ontology. Ontology summarization identifies an excerpt from an ontology that contains the most relevant concepts and produces an abridged ontology. In this article, we present a method for summarizing ontologies that represent ...

  17. An Approach to Folksonomy-Based Ontology Maintenance for Learning Environments

    Gasevic, D.; Zouaq, Amal; Torniai, Carlo; Jovanovic, J.; Hatala, Marek

    2011-01-01

    Recent research in learning technologies has demonstrated many promising contributions from the use of ontologies and semantic web technologies for the development of advanced learning environments. In spite of those benefits, ontology development and maintenance remain the key research challenges to be solved before ontology-enhanced learning…

  18. Evaluating The Global Inventory of Planetary Analog Environments on Earth: An Ontological Approach

    Conrad, P. G.

    2010-12-01

    Introduction: Field sites on Earth are routinely used to simulate planetary environments so that we can try to understand the evidence of processes such as sedimentary deposition, weathering, evolution of habitable environments, and behavior of spacecraft and instrumentation prior to selection of mission architectures, payload investigations and landing sites for in situ exploration of other planets. The rapid evolution of astrobiology science drivers for space exploration as well as increasing capability to explore planetary surfaces in situ has led to a proliferation of declarations that various Earth environments are analogs for less accessible planetary environments. We have not yet progressed to standardized measures of analog fidelity, and the analog value of field sites can be variable depending upon a variety of factors. Here we present a method of evaluating the fidelity and hence utility of analog environments by using an ontological approach to evaluating how well the analogs work. The use of ontologies as specification constructs is now quite common in artificial intelligence, systems engineering, business development and various informatics systems. We borrow from these developments just as they derive from the original use of ontology in philosophy, where it was meant as a systematic approach to describing the fundamental elements that define “being,” or existence [1]. An ontology is a framework for the specification of a concept or domain of interest. The knowledge regarding that domain, e.g., inventory of objects, hierarchical classes, relationships and functions, is what describes and defines the domain as a declarative formalism [2]. In the case of planetary environments, one can define a list of fundamental attributes without which the domain (environment) in question must be defined (classified) otherwise. In particular this is problematic when looking at ancient environments because of their alteration over time. In other words, their

  19. Feasibility of automated foundational ontology interchangeability

    Khan, ZC

    2014-11-01

    Full Text Available the Source Domain Ontology (sOd), with the domain knowledge component of the source ontology, the Source Foundational Ontology (sOf) that is the foundational ontology component of the source ontology that is to be interchanged, and any equivalence... or subsumption mappings between entities in sOd and sOf. – The Target Ontology (tO) which has been interchanged, which comprises the Target Domain Ontology (tOd), with the domain knowledge component of the target ontology, and the Target Foundational Ontology...

  20. Gene expression profiling in susceptible interaction of grapevine with its fungal pathogen Eutypa lata: Extending MapMan ontology for grapevine

    Usadel Björn

    2009-08-01

    Full Text Available Abstract Background: Whole genome transcriptomics analysis is a very powerful approach because it gives an overview of the activity of genes in certain cells or tissue types. However, biological interpretation of such results can be rather tedious. MapMan is a software tool that displays large datasets (e.g. gene expression data) onto diagrams of metabolic pathways or other processes and thus enables easier interpretation of results. The grapevine (Vitis vinifera) genome sequence has recently become available, bringing a new dimension into associated research. Two microarray platforms were designed based on the TIGR Gene Index database and used in several physiological studies. Results: To enable easy and effective visualization of those and further experiments, annotation of the Vitis vinifera Gene Index (VvGI) version 5 to MapMan ontology was set up. Due to specificities of grape physiology, we have created new pictorial representations focusing on three selected pathways: the carotenoid pathway, the terpenoid pathway and the phenylpropanoid pathway, the products of these pathways being important for wine aroma, flavour and colour, as well as plant defence against pathogens. This new tool was validated on Affymetrix microarray data obtained during berry ripening and it allowed the discovery of new aspects in process regulation. We also present results on transcriptional profiling of grape plantlets after exposure to the fungal pathogen Eutypa lata using Operon microarrays, including visualization of results with MapMan. The data show that the genes induced in infected plants encode pathogenesis-related proteins and enzymes of flavonoid metabolism, which are well known as being responsive to fungal infection. Conclusion: The extension of MapMan ontology to grapevine, together with the newly constructed pictorial representations for carotenoid, terpenoid and phenylpropanoid metabolism, provides an alternative approach to the analysis of grapevine gene expression.

  1. USE OF ONTOLOGIES FOR KNOWLEDGE BASES CREATION TUTORING COMPUTER SYSTEMS

    Cheremisina Lyubov

    2014-01-01

    This paper deals with the use of ontologies in the design and development of intelligent tutoring systems. We consider the shortcomings of educational software and distance learning systems and the advantages of using ontologies in their design, which motivates the creation of educational computer systems based on systematic knowledge. We consider the classification, properties, uses and benefits of ontologies, and characterize approaches to the problem of ontology mapping, the first of which – manual mapping, the s...

  2. Ontology-driven approach for describing industrial socio-cyberphysical systems’ components

    Teslya Nikolay

    2018-01-01

    Full Text Available Nowadays, the concept of the industrial Internet of Things is considered by researchers as the basis of Industry 4.0. Its use is aimed at creating a single information space that unites all the components of production, from the processed raw materials to the interaction with suppliers and users of the finished goods. Such a union makes it possible to change the established business processes of production, increasing the customization of end products for the consumer and reducing costs for producers. Each component is described using a digital twin that captures its main characteristics important for production. The heterogeneity of these characteristics across production levels makes it very difficult to exchange information between them. To solve the problem of interaction between individual components, this paper proposes to use an ontological approach to model the components of industrial socio-cyberphysical systems. The paper considers four scenarios of interaction in the industrial Internet of Things, based on which an upper-level ontology is formed that describes the main components of industrial socio-cyberphysical systems and the connections between them.

  3. An ontology-based semantic configuration approach to constructing Data as a Service for enterprises

    Cai, Hongming; Xie, Cheng; Jiang, Lihong; Fang, Lu; Huang, Chenxi

    2016-03-01

    To align business strategies with IT systems, enterprises should rapidly implement new applications based on existing information with complex associations to adapt to the continually changing external business environment. Thus, Data as a Service (DaaS) has become an enabling technology for enterprises through information integration and the configuration of existing distributed enterprise systems and heterogeneous data sources. However, business modelling, system configuration and model alignment face challenges at the design and execution stages. To provide a comprehensive solution to facilitate data-centric application design in a highly complex and large-scale situation, a configurable ontology-based service integrated platform (COSIP) is proposed to support business modelling, system configuration and execution management. First, a meta-resource model is constructed and used to describe and encapsulate information resources by way of multi-view business modelling. Then, based on ontologies, three semantic configuration patterns, namely composite resource configuration, business scene configuration and runtime environment configuration, are designed to systematically connect business goals with executable applications. Finally, a software architecture based on model-view-controller (MVC) is provided and used to assemble components for software implementation. The result of the case study demonstrates that the proposed approach provides a flexible method of implementing data-centric applications.

  4. Data mining for ontology development.

    Davidson, George S.; Strasburg, Jana (Pacific Northwest National Laboratory, Richland, WA); Stampf, David (Brookhaven National Laboratory, Upton, NY); Neymotin,Lev (Brookhaven National Laboratory, Upton, NY); Czajkowski, Carl (Brookhaven National Laboratory, Upton, NY); Shine, Eugene (Savannah River National Laboratory, Aiken, SC); Bollinger, James (Savannah River National Laboratory, Aiken, SC); Ghosh, Vinita (Brookhaven National Laboratory, Upton, NY); Sorokine, Alexandre (Oak Ridge National Laboratory, Oak Ridge, TN); Ferrell, Regina (Oak Ridge National Laboratory, Oak Ridge, TN); Ward, Richard (Oak Ridge National Laboratory, Oak Ridge, TN); Schoenwald, David Alan

    2010-06-01

    A multi-laboratory ontology construction effort during the summer and fall of 2009 prototyped an ontology for counterfeit semiconductor manufacturing. This effort included an ontology development team and an ontology validation methods team. Here the third team of the Ontology Project, the Data Analysis (DA) team, reports on its approaches, the tools it used, and results from mining the literature for terminology pertinent to counterfeit semiconductor manufacturing. A discussion of the value of ontology-based analysis is presented, with insights drawn from other ontology-based methods regularly used in the analysis of genomic experiments. Finally, suggestions for future work are offered.

  5. Ethnicity Recording in Primary Care Computerised Medical Record Systems: An Ontological Approach

    Zayd Tippu

    2017-03-01

    Full Text Available Background: Ethnicity recording within primary care computerised medical record (CMR) systems is suboptimal, exacerbated by tangled taxonomies within current coding systems. Objective: To develop a method for extending ethnicity identification using routinely collected data. Methods: We used an ontological method to maximise the reliability and prevalence of ethnicity information in the Royal College of General Practitioners' Research and Surveillance database. Clinical codes were either directly mapped to an ethnicity group or utilised as proxy markers (such as language spoken) from which ethnicity could be inferred. We compared the performance of our method with the recording rates identified by the code lists utilised by the UK pay-for-performance system, the Quality and Outcomes Framework (QOF). Results: Data from 2,059,453 patients across 110 practices were included. The overall categorisable ethnicity using QOF codes was 36.26% (95% confidence interval (CI): 36.20%–36.33%). This rose to 48.57% (CI: 48.50%–48.64%) using the described ethnicity mapping process. Mapping increased across all ethnic groups. The largest increase was seen in the white ethnicity category (30.61%; CI: 30.55%–30.67% to 40.24%; CI: 40.17%–40.30%). The highest relative increase was in the group categorised as 'other' (0.04%; CI: 0.03%–0.04% to 0.92%; CI: 0.91%–0.93%). Conclusions: This mapping method substantially increases the prevalence of known ethnicity in CMR data and may aid future epidemiological research based on routine data.
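
    A minimal sketch of the direct-plus-proxy mapping idea described above follows; every code and category in it is a hypothetical placeholder rather than the study's actual ontology or code lists.

```python
# Minimal sketch of direct-plus-proxy ethnicity mapping; all codes and
# categories below are invented placeholders, not the study's code lists.
DIRECT_MAP = {           # clinical code -> ethnicity category
    "ETH_WHITE_BRITISH": "white",
    "ETH_INDIAN": "asian",
}
PROXY_MAP = {            # proxy marker (e.g. language spoken) -> inferred category
    "LANG_PUNJABI": "asian",
    "LANG_POLISH": "white",
}

def infer_ethnicity(patient_codes):
    """Prefer directly coded ethnicity; fall back to proxy markers."""
    for code in patient_codes:
        if code in DIRECT_MAP:
            return DIRECT_MAP[code], "direct"
    for code in patient_codes:
        if code in PROXY_MAP:
            return PROXY_MAP[code], "proxy"
    return None, "unknown"

if __name__ == "__main__":
    print(infer_ethnicity(["BP_CHECK", "LANG_POLISH"]))  # ('white', 'proxy')
```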

  6. Security of Heterogeneous Content in Cloud Based Library Information Systems Using an Ontology Based Approach

    Mihai DOINEA

    2014-01-01

    Full Text Available As in any domain that involves the use of software, library information systems take advantage of cloud computing. The paper highlights the main aspects of cloud-based systems, describing some public solutions provided by the most important players on the market. Topics related to content security in cloud-based services are tackled in order to emphasize the requirements that must be met by these types of systems. A cloud-based implementation of a Library Information System is presented, and some adjacent tools that are used together with it to provide digital content and metadata links are described. In a cloud-based Library Information System, security is approached by means of ontologies. Aspects such as content security in terms of digital rights are presented, and a methodology for security optimization is proposed.

  7. A document-centric approach for developing the tolAPC ontology.

    Blfgeh, Aisha; Warrender, Jennifer; Hilkens, Catharien M U; Lord, Phillip

    2017-11-28

    There are many challenges associated with ontology building, as the process often touches on many different subject areas; it needs knowledge of the problem domain, an understanding of the ontology formalism, the software in use and, sometimes, an understanding of the philosophical background. In practice, it is very rare that an ontology can be completed by a single person, as they are unlikely to combine all of these skills. So people with these skills must collaborate. One solution to this is to use face-to-face meetings, but these can be expensive and time-consuming for teams that are not co-located. Remote collaboration is possible, of course, but one difficulty here is that domain specialists use a wide variety of different "formalisms" to represent and share their data - by far the most common, however, is the "office file", either in the form of a word-processor document or a spreadsheet. Here we describe the development of an ontology of immunological cell types; this was initially developed by domain specialists using an Excel spreadsheet for collaboration. We have transformed this spreadsheet into an ontology using highly-programmatic and pattern-driven ontology development. Critically, the spreadsheet remains part of the source for the ontology; the domain specialists are free to update it, and changes will percolate to the end ontology. We have developed a new ontology describing immunological cell lines built by instantiating ontology design patterns written programmatically, using values from a spreadsheet catalogue. This method employs a spreadsheet that was developed by domain experts. The spreadsheet is unconstrained in its usage and can be freely updated, resulting in a new ontology. This provides a general methodology for ontology development using data generated by domain specialists.
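
    The sketch below illustrates the general spreadsheet-to-ontology idea with a CSV export and rdflib; the file name, column names and IRIs are assumptions, and the tolAPC work itself relies on its own pattern-driven tooling rather than this code.

```python
# Generic sketch: turn rows of a cell-type spreadsheet (exported as CSV) into
# OWL classes with rdflib. File name, column names and IRIs are placeholders.
import csv
from rdflib import Graph, Literal, Namespace, OWL, RDF, RDFS

ONT = Namespace("http://example.org/tolapc#")

g = Graph()
g.bind("tolapc", ONT)

with open("cell_catalogue.csv", newline="") as fh:
    for row in csv.DictReader(fh):        # expected columns: id, label, parent
        cls = ONT[row["id"]]
        g.add((cls, RDF.type, OWL.Class))
        g.add((cls, RDFS.label, Literal(row["label"])))
        if row.get("parent"):
            g.add((cls, RDFS.subClassOf, ONT[row["parent"]]))

# Re-running this after the specialists update the spreadsheet regenerates the ontology.
g.serialize(destination="tolapc_sketch.owl", format="xml")
```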

  8. FOCIH: Form-Based Ontology Creation and Information Harvesting

    Tao, Cui; Embley, David W.; Liddle, Stephen W.

    Creating an ontology and populating it with data are both labor-intensive tasks requiring a high degree of expertise. Thus, scaling ontology creation and population to the size of the web in an effort to create a web of data—which some see as Web 3.0—is prohibitive. Can we find ways to streamline these tasks and lower the barrier enough to enable Web 3.0? Toward this end we offer a form-based approach to ontology creation that provides a way to create Web 3.0 ontologies without the need for specialized training. And we offer a way to semi-automatically harvest data from the current web of pages for a Web 3.0 ontology. In addition to harvesting information with respect to an ontology, the approach also annotates web pages and links facts in web pages to ontological concepts, resulting in a web of data superimposed over the web of pages. Experience with our prototype system shows that mappings between conceptual-model-based ontologies and forms are sufficient for creating the kind of ontologies needed for Web 3.0, and experiments with our prototype system show that automatic harvesting, automatic annotation, and automatic superimposition of a web of data over a web of pages work well.

  9. The MMI Device Ontology: Enabling Sensor Integration

    Rueda, C.; Galbraith, N.; Morris, R. A.; Bermudez, L. E.; Graybeal, J.; Arko, R. A.; Mmi Device Ontology Working Group

    2010-12-01

    The Marine Metadata Interoperability (MMI) project has developed an ontology for devices to describe sensors and sensor networks. This ontology is implemented in the W3C Web Ontology Language (OWL) and provides an extensible conceptual model and controlled vocabularies for describing heterogeneous instrument types, with different data characteristics, and their attributes. It can help users populate metadata records for sensors; associate devices with their platforms, deployments, measurement capabilities and restrictions; aid in discovery of sensor data, both historic and real-time; and improve the interoperability of observational oceanographic data sets. We developed the MMI Device Ontology following a community-based approach. By building on and integrating other models and ontologies from related disciplines, we sought to facilitate semantic interoperability while avoiding duplication. Key concepts and insights from various communities, including the Open Geospatial Consortium (e.g., SensorML and Observations and Measurements specifications), Semantic Web for Earth and Environmental Terminology (SWEET), and the W3C Semantic Sensor Network Incubator Group, have significantly enriched the development of the ontology. Individuals ranging from instrument designers, science data producers and consumers to ontology specialists and other technologists contributed to the work. Applications of the MMI Device Ontology are underway for several community use cases. These include vessel-mounted multibeam mapping sonars for the Rolling Deck to Repository (R2R) program and description of diverse instruments on deepwater Ocean Reference Stations for the OceanSITES program. These trials involve creation of records completely describing instruments, either by individual instances or by manufacturer and model. Individual terms in the MMI Device Ontology can be referenced with their corresponding Uniform Resource Identifiers (URIs) in sensor-related metadata specifications (e
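
    A small rdflib sketch of how an instrument instance might be described against such a device ontology is given below; the namespace, class and property names are placeholders, not the actual MMI Device Ontology IRIs.

```python
# Sketch of describing an instrument instance against a device ontology.
# The namespace and term names are placeholders, not the real MMI IRIs.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

DEV = Namespace("http://example.org/device#")    # stand-in for the device ontology
DATA = Namespace("http://example.org/mydata#")

g = Graph()
g.bind("dev", DEV)

sonar = DATA["multibeam_01"]
g.add((sonar, RDF.type, DEV.MultibeamSonar))     # instrument type (placeholder class)
g.add((sonar, DEV.manufacturer, Literal("Acme Marine")))
g.add((sonar, DEV.model, Literal("MB-3000")))
g.add((sonar, DEV.deployedOn, DATA["research_vessel_42"]))
g.add((sonar, RDFS.label, Literal("Vessel-mounted multibeam mapping sonar")))

print(g.serialize(format="turtle"))
```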

  10. Systematization of climate data in the virtual research environment on the basis of ontology approach

    Alipova, K. A.; Bart, A. A.; Fazliev, A. Z.; Gordov, E. P.; Okladnikov, I. G.; Privezentsev, A. I.; Titov, A. G.

    2017-11-01

    The first version of a primitive OWL ontology of the climate and meteorological data collections of the Institute of Monitoring of Climatic and Ecological Systems SB RAS is presented. The ontology is a component of expert and decision support systems intended for the quick search of climate and meteorological data required for the solution of a certain class of applied problems.

  11. A Semi-Automatic Approach to Construct Vietnamese Ontology from Online Text

    Nguyen, Bao-An; Yang, Don-Lin

    2012-01-01

    An ontology is an effective formal representation of knowledge used commonly in artificial intelligence, semantic web, software engineering, and information retrieval. In open and distance learning, ontologies are used as knowledge bases for e-learning supplements, educational recommenders, and question answering systems that support students with…

  12. GO-Bayes: Gene Ontology-based overrepresentation analysis using a Bayesian approach.

    Zhang, Song; Cao, Jing; Kong, Y Megan; Scheuermann, Richard H

    2010-04-01

    A typical approach for the interpretation of high-throughput experiments, such as gene expression microarrays, is to produce groups of genes based on certain criteria (e.g. genes that are differentially expressed). To gain more mechanistic insights into the underlying biology, overrepresentation analysis (ORA) is often conducted to investigate whether gene sets associated with particular biological functions, for example, as represented by Gene Ontology (GO) annotations, are statistically overrepresented in the identified gene groups. However, the standard ORA, which is based on the hypergeometric test, analyzes each GO term in isolation and does not take into account the dependence structure of the GO-term hierarchy. We have developed a Bayesian approach (GO-Bayes) to measure overrepresentation of GO terms that incorporates the GO dependence structure by taking into account evidence not only from individual GO terms, but also from their related terms (i.e. parents, children, siblings, etc.). The Bayesian framework borrows information across related GO terms to strengthen the detection of overrepresentation signals. As a result, this method tends to identify sets of closely related GO terms rather than individual isolated GO terms. The advantage of the GO-Bayes approach is demonstrated with a simulation study and an application example.
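
    For context, the sketch below implements the standard hypergeometric ORA that GO-Bayes is compared against; the Bayesian borrowing of information across related GO terms is not reproduced, and the gene counts in the example are illustrative.

```python
# Standard hypergeometric ORA baseline (not the GO-Bayes model itself).
from scipy.stats import hypergeom

def ora_pvalue(k_in_list, list_size, k_in_genome, genome_size):
    """P(X >= k) of seeing k_in_list annotated genes in a list of list_size
    genes drawn from a genome with k_in_genome annotated genes."""
    return hypergeom.sf(k_in_list - 1, genome_size, k_in_genome, list_size)

if __name__ == "__main__":
    # 12 of 200 differentially expressed genes carry a GO term annotated to
    # 300 of 20,000 genes genome-wide (illustrative numbers).
    print(ora_pvalue(12, 200, 300, 20000))
```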

  13. An Approach for Multi-Artifact Testing Through an Ontological Perspective for Behavior-Driven Development

    Thiago Rocha Silva

    2016-07-01

    Full Text Available In a user-centered development process, artifacts evolve in iterative cycles until they meet users’ requirements and then become the final product. Every cycle gives the opportunity to revise the design and to introduce new requirements which might affect the specification of artifacts that have been set in former development phases. Testing the consistency of multiple artifacts used to develop interactive systems every time that new requirements are introduced is a cumbersome activity, especially if it is done manually. This paper proposes an approach based on Behavior-Driven Development (BDD to support the automated assessment of artifacts along the development process of interactive systems. The paper uses an ontology for specifying tests that can run over multiple artifacts sharing similar concepts. A case study testing Task Models, Prototypes, and Final User Interfaces is presented to demonstrate the feasibility of this approach from the early phases of the design process, providing a continuous quality assurance of requirements, and helping clients and development teams to identify potential problems and inconsistencies before commitments with software implementation are made.

  14. Blockchain-Oriented Coalition Formation by CPS Resources: Ontological Approach and Case Study

    Alexey Kashevnik

    2018-05-01

    Full Text Available Cyber-physical systems (CPS), robotics, the Internet of Things, and information and communication technologies have become more and more popular over the last several years. These topics open new perspectives and scenarios that can automate processes in human life. CPS aim to support interaction in the information space for physical entities that communicate in physical space in real time. At the same time, blockchain technology, which has become popular in recent years, makes it possible to organize an immutable distributed database that stores all significant information and provides access for CPS participants. The paper proposes an approach that is based on ontology-based context management, publish/subscribe semantic interoperability support, and blockchain techniques. Utilization of these techniques provides the possibility to develop CPS that support dynamic, distributed, and stable coalition formation of resources. The case study presented has been implemented for the scenario of heterogeneous mobile robots’ collaboration in overcoming obstacles. There are two types of robots and an information service participating in the scenario. Evaluation shows that the proposed approach is applicable for the presented class of scenarios.

  15. Conceptual querying through ontologies

    Andreasen, Troels; Bulskov, Henrik

    2009-01-01

    We present here an approach to conceptual querying where the aim is, given a collection of textual database objects or documents, to target an abstraction of the entire database content in terms of the concepts appearing in documents, rather than the documents in the collection. The approach is motivated by an obvious need for users to survey huge volumes of objects in query answers. An ontology formalism and a special notion of "instantiated ontology" are introduced. The latter is a structure reflecting the content in the document collection in that it is a restriction of a general world knowledge ontology to the concepts instantiated in the collection. The notion of ontology-based similarity is briefly described, language constructs for direct navigation and retrieval of concepts in the ontology are discussed, and approaches to conceptual summarization are presented.

  16. An Ontological Approach to Developing Information Operations Applications for use on the Semantic Web

    Clarke, Timothy L

    2008-01-01

    .... By expressing IO capabilities in a formal ontology suitable for use on the Semantic Web, conditions are set such that computational power can more efficiently be leveraged to better define required...

  17. An Ontological Approach to Developing Information Operations Applications for Use on the Semantic Web

    Clarke, Timothy L

    2008-01-01

    .... By expressing IO capabilities in a formal ontology suitable for use on the Semantic Web, conditions are set such that computational power can more efficiently be leveraged to better define required...

  18. An Intelligent Information Retrieval Approach Based on Two Degrees of Uncertainty Fuzzy Ontology

    Maryam Hourali; Gholam Ali Montazer

    2011-01-01

    In spite of the voluminous studies in the field of intelligent retrieval systems, effective retrieval of information has remained an important unsolved problem. The incorporation of different kinds of conceptual knowledge, such as ontologies, into the information retrieval process has been considered a solution to enhance the quality of results. Furthermore, the conceptual formalism supported by a typical ontology may not be sufficient to represent uncertainty information due to the lack of clear-cut ...

  19. Ontological Surprises

    Leahu, Lucian

    2016-01-01

    This paper investigates how we might rethink design as the technological crafting of human-machine relations in the context of a machine learning technique called neural networks. It analyzes Google’s Inceptionism project, which uses neural networks for image recognition. The surprising output ... a hybrid approach where machine learning algorithms are used to identify objects as well as connections between them; finally, it argues for remaining open to ontological surprises in machine learning as they may enable the crafting of different relations with and through technologies.

  20. Applying of an Ontology based Modeling Approach to Cultural Heritage Systems

    POPOVICI, D.-M.

    2011-08-01

    Full Text Available Any virtual environment (VE) built in a classical way is dedicated to a very specific domain. Its modification or even adaptation to another domain requires an expensive human intervention measured in time and money. In this way, the product, that is, the VE, returns to the first phases of the development process. In a previous work we proposed an approach that combines domain ontologies and conceptual modeling to construct more accurate VEs. Our method is based on the description of the domain knowledge in a standard format and the assisted creation (using these pieces of knowledge) of the VE. This permits the explanation within the virtual reality (VR) simulation of the semantics of the whole context and of each object. This knowledge may then be transferred to the public users. In this paper we prove the effectiveness of our method on the construction process of a VE that simulates the organization of a Greek-Roman colony situated on the Black Sea coast and the economic and social activities of its people.

  1. An Ontological-Fuzzy Approach to Advance Reservation in Multi-Cluster Grids

    Ferreira, D J; Dantas, M A R; Bauer, Michael A

    2010-01-01

    Advance reservation is an important mechanism for the successful utilization of available resources in distributed multi-cluster environments. This mechanism allows, for example, a user to provide parameters aiming to satisfy requirements related to applications' execution time and temporal dependence. This predictability can lead the system to reach higher levels of QoS. However, support for advance reservation has been restricted due to the complexity of large-scale configurations and also the dynamic changes verified in these systems. In this research work, an advance reservation method based on an ontology-fuzzy approach is proposed. It allows a user to reserve a wide variety of resources and enables large jobs to be reserved among different nodes. In addition, it dynamically verifies the possibility of reservation with the local RMS, avoiding future allocation conflicts. Experimental results of the proposal, through simulation, indicate that the proposed mechanism reached a successful level of flexibility for large jobs and a more appropriate distribution of resources in a distributed multi-cluster configuration.

  2. An Ontological-Fuzzy Approach to Advance Reservation in Multi-Cluster Grids

    Ferreira, D J; Dantas, M A R; Bauer, Michael A, E-mail: ded@inf.ufsc.br, E-mail: mario@inf.ufsc.br, E-mail: bauer@csd.uwo.ca

    2010-11-01

    Advance reservation is an important mechanism for the successful utilization of available resources in distributed multi-cluster environments. This mechanism allows, for example, a user to provide parameters aiming to satisfy requirements related to applications' execution time and temporal dependence. This predictability can lead the system to reach higher levels of QoS. However, support for advance reservation has been restricted due to the complexity of large-scale configurations and also the dynamic changes verified in these systems. In this research work, an advance reservation method based on an ontology-fuzzy approach is proposed. It allows a user to reserve a wide variety of resources and enables large jobs to be reserved among different nodes. In addition, it dynamically verifies the possibility of reservation with the local RMS, avoiding future allocation conflicts. Experimental results of the proposal, through simulation, indicate that the proposed mechanism reached a successful level of flexibility for large jobs and a more appropriate distribution of resources in a distributed multi-cluster configuration.

  3. Monument Damage Information System (mondis): AN Ontological Approach to Cultural Heritage Documentation

    Cacciotti, R.; Valach, J.; Kuneš, P.; Čerňanský, M.; Blaško, M.; Křemen, P.

    2013-07-01

    Deriving from the complex nature of cultural heritage conservation is the need to enhance a systematic but flexible organization of expert knowledge in the field. Such organization should address comprehensively the interrelations and complementarities among the different factors that come into play in the understanding of diagnostic and intervention problems. The purpose of MONDIS is to endorse this kind of organization. The approach consists in applying an ontological representation to the field of heritage conservation in order to establish an appropriate processing of data. The system allows replicating in a computer-readable form the basic dependences among factors influencing the description, diagnosis and intervention of damages to immovable objects. More specifically, MONDIS allows users to input and search entries concerning object description, structural evolution, location characteristics and risk, components, material properties, surveys and measurements, damage typology, damage-triggering events and possible interventions. The system supports search features typical of standard databases, as it allows for the digitalization of a wide range of information including professional reports, books, articles and scientific papers. It also allows for computer-aided retrieval of information tailored to the user's requirements. The foreseen outputs will include a web user interface and a mobile application for visual inspection purposes.

  4. Ontology alignment architecture for semantic sensor Web integration.

    Fernandez, Susel; Marsa-Maestre, Ivan; Velasco, Juan R; Alarcos, Bernardo

    2013-09-18

    Sensor networks are a concept that has become very popular in data acquisition and processing for multiple applications in different fields such as industrial, medicine, home automation, environmental detection, etc. Today, with the proliferation of small communication devices with sensors that collect environmental data, semantic Web technologies are becoming closely related with sensor networks. The linking of elements from Semantic Web technologies with sensor networks has been called Semantic Sensor Web and has among its main features the use of ontologies. One of the key challenges of using ontologies in sensor networks is to provide mechanisms to integrate and exchange knowledge from heterogeneous sources (that is, dealing with semantic heterogeneity). Ontology alignment is the process of bringing ontologies into mutual agreement by the automatic discovery of mappings between related concepts. This paper presents a system for ontology alignment in the Semantic Sensor Web which uses fuzzy logic techniques to combine similarity measures between entities of different ontologies. The proposed approach focuses on two key elements: the terminological similarity, which takes into account the linguistic and semantic information of the context of the entity's names, and the structural similarity, based on both the internal and relational structure of the concepts. This work has been validated using sensor network ontologies and the Ontology Alignment Evaluation Initiative (OAEI) tests. The results show that the proposed techniques outperform previous approaches in terms of precision and recall.
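
    A deliberately simplified sketch of combining a terminological and a structural similarity is shown below; string similarity and a Jaccard overlap of neighbour labels stand in for the paper's richer measures, and a fixed weighting replaces its fuzzy inference.

```python
# Naive combination of terminological and structural similarity for candidate
# ontology mappings; weights and measures are simplifications of the approach.
from difflib import SequenceMatcher

def terminological_sim(label_a: str, label_b: str) -> float:
    return SequenceMatcher(None, label_a.lower(), label_b.lower()).ratio()

def structural_sim(neigh_a: set, neigh_b: set) -> float:
    """Jaccard overlap of the labels of neighbouring concepts."""
    if not neigh_a and not neigh_b:
        return 0.0
    return len(neigh_a & neigh_b) / len(neigh_a | neigh_b)

def combined_sim(label_a, label_b, neigh_a, neigh_b, w_term=0.6, w_struct=0.4):
    return (w_term * terminological_sim(label_a, label_b)
            + w_struct * structural_sim(neigh_a, neigh_b))

if __name__ == "__main__":
    score = combined_sim("TemperatureSensor", "Temperature_Sensor",
                         {"sensor", "observation"}, {"sensor", "measurement"})
    print(round(score, 3))  # a candidate mapping if the score exceeds a threshold
```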

  5. Ontology Alignment Architecture for Semantic Sensor Web Integration

    Bernardo Alarcos

    2013-09-01

    Full Text Available Sensor networks are a concept that has become very popular in data acquisition and processing for multiple applications in different fields such as industrial, medicine, home automation, environmental detection, etc. Today, with the proliferation of small communication devices with sensors that collect environmental data, semantic Web technologies are becoming closely related with sensor networks. The linking of elements from Semantic Web technologies with sensor networks has been called Semantic Sensor Web and has among its main features the use of ontologies. One of the key challenges of using ontologies in sensor networks is to provide mechanisms to integrate and exchange knowledge from heterogeneous sources (that is, dealing with semantic heterogeneity). Ontology alignment is the process of bringing ontologies into mutual agreement by the automatic discovery of mappings between related concepts. This paper presents a system for ontology alignment in the Semantic Sensor Web which uses fuzzy logic techniques to combine similarity measures between entities of different ontologies. The proposed approach focuses on two key elements: the terminological similarity, which takes into account the linguistic and semantic information of the context of the entity’s names, and the structural similarity, based on both the internal and relational structure of the concepts. This work has been validated using sensor network ontologies and the Ontology Alignment Evaluation Initiative (OAEI) tests. The results show that the proposed techniques outperform previous approaches in terms of precision and recall.

  6. Integrating phenotype ontologies with PhenomeNET

    Rodriguez-Garcia, Miguel Angel

    2017-12-19

    Background: Integration and analysis of phenotype data from humans and model organisms is a key challenge in building our understanding of normal biology and pathophysiology. However, the range of phenotypes and anatomical details being captured in clinical and model organism databases presents complex problems when attempting to match classes across species and across phenotypes as diverse as behaviour and neoplasia. We have previously developed PhenomeNET, a system for disease gene prioritization that includes as one of its components an ontology designed to integrate phenotype ontologies. While not applicable to matching arbitrary ontologies, PhenomeNET can be used to identify related phenotypes in different species, including human, mouse, zebrafish, nematode worm, fruit fly, and yeast. Results: Here, we apply PhenomeNET to identify related classes from two phenotype and two disease ontologies using automated reasoning. We demonstrate that we can identify a large number of mappings, some of which require automated reasoning and cannot easily be identified through lexical approaches alone. Combining automated reasoning with lexical matching further improves results in aligning ontologies. Conclusions: PhenomeNET can be used to align and integrate phenotype ontologies. The results can be utilized for biomedical analyses in which phenomena observed in model organisms are used to identify causative genes and mutations underlying human disease.

  7. Summarization by domain ontology navigation

    Andreasen, Troels; Bulskov, Henrik

    2013-01-01

    ... of the subject. In between these two extremes, conceptual summaries encompass selected concepts derived using background knowledge. We address in this paper an approach where conceptual summaries are provided through a conceptualization as given by an ontology. The ontology guiding the summarization can be a simple taxonomy or a generative domain ontology. A domain ontology can be provided by a preanalysis of a domain corpus and can be used to condense improved summaries that better reflect the conceptualization of a given domain.

  8. Quantum ontologies

    Stapp, H.P.

    1988-12-01

    Quantum ontologies are conceptions of the constitution of the universe that are compatible with quantum theory. The ontological orientation is contrasted to the pragmatic orientation of science, and reasons are given for considering quantum ontologies both within science, and in broader contexts. The principal quantum ontologies are described and evaluated. Invited paper at conference: Bell's Theorem, Quantum Theory, and Conceptions of the Universe, George Mason University, October 20-21, 1988. 16 refs

  9. Ontology-based Information Retrieval

    Styltsvig, Henrik Bulskov

    In this thesis, we will present methods for introducing ontologies in information retrieval. The main hypothesis is that the inclusion of conceptual knowledge such as ontologies in the information retrieval process can contribute to the solution of major problems currently found in information retrieval. This utilization of ontologies has a number of challenges. Our focus is on the use of similarity measures derived from the knowledge about relations between concepts in ontologies, the recognition of semantic information in texts and the mapping of this knowledge into the ontologies in use, as well as how to fuse together the ideas of ontological similarity and ontological indexing into a realistic information retrieval scenario. To achieve the recognition of semantic knowledge in a text, shallow natural language processing is used during indexing that reveals knowledge to the level of noun...

  10. Automating data acquisition into ontologies from pharmacogenetics relational data sources using declarative object definitions and XML.

    Rubin, Daniel L; Hewett, Micheal; Oliver, Diane E; Klein, Teri E; Altman, Russ B

    2002-01-01

    Ontologies are useful for organizing large numbers of concepts having complex relationships, such as the breadth of genetic and clinical knowledge in pharmacogenomics. But because ontologies change and knowledge evolves, it is time consuming to maintain stable mappings to external data sources that are in relational format. We propose a method for interfacing ontology models with data acquisition from external relational data sources. This method uses a declarative interface between the ontology and the data source, and this interface is modeled in the ontology and implemented using XML schema. Data is imported from the relational source into the ontology using XML, and data integrity is checked by validating the XML submission with an XML schema. We have implemented this approach in PharmGKB (http://www.pharmgkb.org/), a pharmacogenetics knowledge base. Our goals were to (1) import genetic sequence data, collected in relational format, into the pharmacogenetics ontology, and (2) automate the process of updating the links between the ontology and data acquisition when the ontology changes. We tested our approach by linking PharmGKB with data acquisition from a relational model of genetic sequence information. The ontology subsequently evolved, and we were able to rapidly update our interface with the external data and continue acquiring the data. Similar approaches may be helpful for integrating other heterogeneous information sources in order to make the diversity of pharmacogenetics data amenable to computational analysis.
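
    The validation step described above can be illustrated with lxml as in the sketch below; the schema and document file names are placeholders for the declarative interface artefacts the paper generates.

```python
# Sketch of the described validation step: check an XML data submission
# against an XML Schema before importing it into the ontology.
from lxml import etree

schema = etree.XMLSchema(etree.parse("sequence_submission.xsd"))  # placeholder files
doc = etree.parse("sequence_submission.xml")

if schema.validate(doc):
    print("Submission is valid; safe to import into the knowledge base.")
else:
    for error in schema.error_log:
        print(f"line {error.line}: {error.message}")
```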

  11. ANALYSIS, THEMATIC MAPS AND DATA MINING FROM POINT CLOUD TO ONTOLOGY FOR SOFTWARE DEVELOPMENT

    R. Nespeca

    2016-06-01

    Full Text Available The primary purpose of the survey for the restoration of Cultural Heritage is the interpretation of the state of building preservation. For this, the advantages of the remote sensing systems that generate dense point clouds (range-based or image-based) are not limited only to the acquired data. The paper shows that it is possible to extrapolate very useful information in diagnostics using spatial annotation, with the use of algorithms already implemented in open-source software. Generally, the drawing of degradation maps is the result of manual work, so dependent on the subjectivity of the operator. This paper describes a method of extraction and visualization of information, obtained by mathematical procedures that are quantitative, repeatable and verifiable. The case study is a part of the east facade of the Eglise collégiale Saint-Maurice, also called Notre Dame des Grâces, in Caromb, in southern France. The work was conducted on the matrix of information contained in the point cloud in ASCII format. The first result is the extrapolation of new geometric descriptors. First, we create the digital maps with the calculated quantities. Subsequently, we have moved to semi-quantitative analyses that transform the new data into useful information. We have written the algorithms for accurate selection, for the segmentation of the point cloud, and for automatic calculation of the real surface and the volume. Furthermore, we have created the graph of the spatial distribution of the descriptors. This work shows that by working during the data processing stage we can transform the point cloud into an enriched database: its use, management and mining are easy, fast and effective for everyone involved in the restoration process.
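
    As a loose illustration of deriving a per-point geometric descriptor from an ASCII point cloud, the sketch below computes a common planarity-based roughness measure; the file name and the choice of descriptor are assumptions, not necessarily those used in the paper.

```python
# Generic per-point descriptor from an ASCII point cloud (x y z per line).
# The "roughness" below (smallest share of the local PCA eigenvalues) is a
# common choice, not necessarily the descriptor used in the paper.
import numpy as np
from scipy.spatial import cKDTree

points = np.loadtxt("facade.xyz")[:, :3]          # placeholder file name
tree = cKDTree(points)

def local_roughness(k: int = 20) -> np.ndarray:
    _, idx = tree.query(points, k=k)
    rough = np.empty(len(points))
    for i, neighbours in enumerate(idx):
        nbrs = points[neighbours]
        cov = np.cov(nbrs - nbrs.mean(axis=0), rowvar=False)
        eigvals = np.linalg.eigvalsh(cov)          # ascending order
        rough[i] = eigvals[0] / eigvals.sum()      # close to 0 for planar patches
    return rough

np.savetxt("facade_roughness.txt", local_roughness())  # e.g. input for a thematic map
```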

  12. A Semantic Social Recommender System Using Ontologies Based Approach For Tunisian Tourism

    Mohamed FRIKHA

    2015-12-01

    Full Text Available Tunisia is well placed in terms of medical tourism and has highly qualified and specialized medical and surgical teams. Integrating social networks into Tunisian medical tourism recommender systems can result in much more accurate recommendations. That is to say, information, interests, and recommendations retrieved from social networks can improve prediction accuracy. This paper aims to improve traditional recommender systems by incorporating information from social networks, including user preferences and influences from social friends. Accordingly, a user interest ontology is developed to make personalized recommendations out of such information. In this paper, we present a semantic social recommender system employing a user interest ontology and a Tunisian Medical Tourism ontology. Our system can improve the quality of recommendations for the Tunisian tourism domain. Finally, our social recommendation algorithm is implemented in order to be used in a Tunisian tourism website to assist users interested in visiting Tunisia for medical purposes.

  13. MetaGO: Predicting Gene Ontology of Non-homologous Proteins Through Low-Resolution Protein Structure Prediction and Protein-Protein Network Mapping.

    Zhang, Chengxin; Zheng, Wei; Freddolino, Peter L; Zhang, Yang

    2018-03-10

    Homology-based transferal remains the major approach to computational protein function annotations, but it becomes increasingly unreliable when the sequence identity between query and template decreases below 30%. We propose a novel pipeline, MetaGO, to deduce Gene Ontology attributes of proteins by combining sequence homology-based annotation with low-resolution structure prediction and comparison, and partner's homology-based protein-protein network mapping. The pipeline was tested on a large-scale set of 1000 non-redundant proteins from the CAFA3 experiment. Under the stringent benchmark conditions where templates with >30% sequence identity to the query are excluded, MetaGO achieves average F-measures of 0.487, 0.408, and 0.598, for Molecular Function, Biological Process, and Cellular Component, respectively, which are significantly higher than those achieved by other state-of-the-art function annotations methods. Detailed data analysis shows that the major advantage of the MetaGO lies in the new functional homolog detections from partner's homology-based network mapping and structure-based local and global structure alignments, the confidence scores of which can be optimally combined through logistic regression. These data demonstrate the power of using a hybrid model incorporating protein structure and interaction networks to deduce new functional insights beyond traditional sequence homology-based referrals, especially for proteins that lack homologous function templates. The MetaGO pipeline is available at http://zhanglab.ccmb.med.umich.edu/MetaGO/. Copyright © 2018. Published by Elsevier Ltd.
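
    The logistic-regression combination step can be pictured with the sketch below, which merges several per-term confidence scores into a single probability on synthetic data; the feature names and numbers are placeholders, not MetaGO's actual component scores.

```python
# Merge several per-term confidence scores into one probability with logistic
# regression; features and labels below are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Each row: [sequence_score, structure_score, network_score]; label: term is correct.
X = rng.random((200, 3))
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2]
     + 0.1 * rng.normal(size=200) > 0.55).astype(int)

model = LogisticRegression().fit(X, y)
combined = model.predict_proba([[0.8, 0.4, 0.6]])[0, 1]
print(f"combined confidence: {combined:.2f}")
```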

  14. Biomedical ontologies: toward scientific debate.

    Maojo, V; Crespo, J; García-Remesal, M; de la Iglesia, D; Perez-Rey, D; Kulikowski, C

    2011-01-01

    Biomedical ontologies have been very successful in structuring knowledge for many different applications, receiving widespread praise for their utility and potential. Yet, the role of computational ontologies in scientific research, as opposed to knowledge management applications, has not been extensively discussed. We aim to stimulate further discussion on the advantages and challenges presented by biomedical ontologies from a scientific perspective. We review various aspects of biomedical ontologies going beyond their practical successes, and focus on some key scientific questions in two ways. First, we analyze and discuss current approaches to improve biomedical ontologies that are based largely on classical, Aristotelian ontological models of reality. Second, we raise various open questions about biomedical ontologies that require further research, analyzing in more detail those related to visual reasoning and spatial ontologies. We outline significant scientific issues that biomedical ontologies should consider, beyond current efforts of building practical consensus between them. For spatial ontologies, we suggest an approach for building "morphospatial" taxonomies, as an example that could stimulate research on fundamental open issues for biomedical ontologies. Analysis of a large number of problems with biomedical ontologies suggests that the field is very much open to alternative interpretations of current work, and in need of scientific debate and discussion that can lead to new ideas and research directions.

  15. An ontological approach to identifying cases of chronic kidney disease from routine primary care data: a cross-sectional study.

    Cole, Nicholas I; Liyanage, Harshana; Suckling, Rebecca J; Swift, Pauline A; Gallagher, Hugh; Byford, Rachel; Williams, John; Kumar, Shankar; de Lusignan, Simon

    2018-04-10

    Accurately identifying cases of chronic kidney disease (CKD) from primary care data facilitates the management of patients, and is vital for surveillance and research purposes. Ontologies provide a systematic and transparent basis for clinical case definition and can be used to identify clinical codes relevant to all aspects of CKD care and its diagnosis. We used routinely collected primary care data from the Royal College of General Practitioners Research and Surveillance Centre. A domain ontology was created and presented in the Web Ontology Language (OWL). The identification and staging of CKD was then carried out using two parallel approaches: (1) clinical coding consistent with a diagnosis of CKD; (2) laboratory-confirmed CKD, based on estimated glomerular filtration rate (eGFR) or the presence of proteinuria. The study cohort comprised 1.2 million individuals aged 18 years and over. 78,153 (6.4%) of the population had CKD on the basis of an eGFR of <60 mL/min/1.73 m², and a further 7366 (0.6%) individuals were identified as having CKD due to proteinuria. 19,504 (1.6%) individuals without laboratory-confirmed CKD had a clinical code consistent with the diagnosis. In addition, a subset of codes allowed for 1348 (0.1%) individuals receiving renal replacement therapy to be identified. Finding cases of CKD from primary care data using an ontological approach may have greater sensitivity than less comprehensive methods, particularly for identifying those receiving renal replacement therapy or with CKD stages 1 or 2. However, the possibility of inaccurate coding may limit the specificity of this method.
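
    The two parallel case-finding routes can be summarised in a few lines of code, as in the sketch below; the eGFR and proteinuria criteria follow the abstract, while the clinical code list is a hypothetical placeholder.

```python
# Two parallel case-finding routes: laboratory-confirmed CKD versus a clinical
# code. Thresholds follow the abstract; the code list is a placeholder.
CKD_CODES = {"CKD_STAGE3", "CKD_STAGE4", "RRT_DIALYSIS"}   # hypothetical codes

def identify_ckd(egfr=None, has_proteinuria=False, clinical_codes=()):
    """Return the route by which a patient is identified as a CKD case, if any."""
    if egfr is not None and egfr < 60:
        return "laboratory (eGFR < 60 mL/min/1.73 m2)"
    if has_proteinuria:
        return "laboratory (proteinuria)"
    if CKD_CODES & set(clinical_codes):
        return "clinical code"
    return None

if __name__ == "__main__":
    print(identify_ckd(egfr=48))                           # laboratory route
    print(identify_ckd(clinical_codes=["RRT_DIALYSIS"]))   # coded route
```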

  16. Towards Agile Ontology Maintenance

    Luczak-Rösch, Markus

    Ontologies are an appropriate means to represent knowledge on the Web. Research on ontology engineering has reached practices for integrative lifecycle support. However, a broader success of ontologies in Web-based information systems remains unreached, while more lightweight semantic approaches are rather successful. We assume that, paired with the emerging trend of services and microservices on the Web, new dynamic scenarios gain momentum in which a shared knowledge base is made available to several dynamically changing services with disparate requirements. Our work envisions a step towards such a dynamic scenario, in which an ontology adapts to the requirements of the accessing services and applications as well as the user's needs in an agile way and reduces the experts' involvement in ontology maintenance processes.

  17. A tiered approach for ecosystem services mapping

    Grêt-Regamey, Adrienne; Weibel, Bettina; Rabe, Sven-Erik; Burkhard, Benjamin

    2017-01-01

    Mapping ecosystem services delivers essential insights into the spatial characteristics of various goods' and services' flows from nature to human society. It has become a central topic of science, policy, business and society – all depending on functioning ecosystems. This textbook summarises the current state of the art of ecosystem services mapping, related theory and methods, different ecosystem service quantification and modelling approaches, as well as practical applications. The book...

  18. Benchmarking ontologies: bigger or better?

    Lixia Yao

    2011-01-01

    Full Text Available A scientific ontology is a formal representation of knowledge within a domain, typically including central concepts, their properties, and relations. With the rise of computers and high-throughput data collection, ontologies have become essential to data mining and sharing across communities in the biomedical sciences. Powerful approaches exist for testing the internal consistency of an ontology, but not for assessing the fidelity of its domain representation. We introduce a family of metrics that describe the breadth and depth with which an ontology represents its knowledge domain. We then test these metrics using (1) four of the most common medical ontologies with respect to a corpus of medical documents and (2) seven of the most popular English thesauri with respect to three corpora that sample language from medicine, news, and novels. Here we show that our approach captures the quality of ontological representation and guides efforts to narrow the breach between ontology and collective discourse within a domain. Our results also demonstrate key features of medical ontologies, English thesauri, and discourse from different domains. Medical ontologies have a small intersection, as do English thesauri. Moreover, dialects characteristic of distinct domains vary strikingly as many of the same words are used quite differently in medicine, news, and novels. As ontologies are intended to mirror the state of knowledge, our methods to tighten the fit between ontology and domain will increase their relevance for new areas of biomedical science and improve the accuracy and power of inferences computed across them.
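
    A toy version of a breadth-style coverage metric, the share of frequent corpus terms that appear as ontology labels, is sketched below; the paper's actual metrics are richer, and the corpus and label lists here are placeholders.

```python
# Toy breadth-style metric: share of frequent corpus terms covered by ontology labels.
import re
from collections import Counter

def coverage(corpus_texts, ontology_labels, top_n=1000):
    tokens = Counter()
    for text in corpus_texts:
        tokens.update(re.findall(r"[a-z]+", text.lower()))
    frequent = [term for term, _ in tokens.most_common(top_n)]
    labels = {label.lower() for label in ontology_labels}
    covered = sum(1 for term in frequent if term in labels)
    return covered / len(frequent) if frequent else 0.0

if __name__ == "__main__":
    corpus = ["Patients with diabetes and hypertension were treated with insulin."]
    labels = ["Diabetes", "Hypertension", "Insulin", "Neoplasm"]
    print(coverage(corpus, labels, top_n=10))
```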

  19. GOClonto: an ontological clustering approach for conceptualizing PubMed abstracts.

    Zheng, Hai-Tao; Borchert, Charles; Kim, Hong-Gee

    2010-02-01

    Concurrent with progress in biomedical sciences, an overwhelming amount of textual knowledge is accumulating in the biomedical literature. PubMed is the most comprehensive database collecting and managing biomedical literature. To help researchers easily understand collections of PubMed abstracts, numerous clustering methods have been proposed to group similar abstracts based on their shared features. However, most of these methods do not explore the semantic relationships among groupings of documents, which could help better illuminate the groupings of PubMed abstracts. To address this issue, we proposed an ontological clustering method called GOClonto for conceptualizing PubMed abstracts. GOClonto uses latent semantic analysis (LSA) and the gene ontology (GO) to identify key gene-related concepts and their relationships as well as allocate PubMed abstracts based on these key gene-related concepts. Based on two PubMed abstract collections, the experimental results show that GOClonto is able to identify key gene-related concepts and outperforms the STC (suffix tree clustering) algorithm, the Lingo algorithm, the Fuzzy Ants algorithm, and the clustering-based TRS (tolerance rough set) algorithm. Moreover, the two ontologies generated by GOClonto show significant informative conceptual structures.
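
    The LSA step used by approaches of this kind can be sketched as below, reducing abstracts to a latent semantic space before clustering; the mapping of clusters to Gene Ontology concepts, which is GOClonto's distinctive contribution, is not reproduced, and the abstracts are invented examples.

```python
# LSA-style grouping of abstracts: TF-IDF, truncated SVD, then k-means.
# The GO-concept mapping of GOClonto itself is not reproduced here.
from sklearn.cluster import KMeans
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer

abstracts = [
    "BRCA1 mutations increase breast cancer risk.",
    "TP53 is a tumour suppressor gene involved in apoptosis.",
    "Insulin signalling regulates glucose metabolism.",
    "Glucose uptake is impaired in insulin-resistant tissue.",
]

tfidf = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(lsa)

for text, label in zip(abstracts, labels):
    print(label, text)
```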

  20. Manufacturing ontology through templates

    Diciuc Vlad

    2017-01-01

    Full Text Available The manufacturing industry contains a high volume of valuable know-how, much of it held by key persons in the company. The passing on of this know-how is the basis of manufacturing ontology. Among other methods, such as advanced filtering and algorithm-based decision making, one way of handling the manufacturing ontology is via templates. The current paper tackles this approach and highlights its advantages, concluding with some recommendations.

  1. Constructing a Geology Ontology Using a Relational Database

    Hou, W.; Yang, L.; Yin, S.; Ye, J.; Clarke, K.

    2013-12-01

    In the geology community, the creation of a common geology ontology has become a useful means to solve problems of data integration, knowledge transformation and the interoperation of multi-source, heterogeneous and multiple-scale geological data. Currently, human-computer interaction methods and relational database-based methods are the primary ontology construction methods. Some human-computer interaction methods such as the Geo-rule based method, the ontology life cycle method and the module design method have been proposed for applied geological ontologies. Essentially, the relational database-based method is a reverse-engineering process that abstracts semantic information from an existing database. The key is to construct rules for the transformation of database entities into the ontology. Relative to the human-computer interaction method, relational database-based methods can use existing resources and the stated semantic relationships among geological entities. However, two problems challenge their development and application. One is the transformation of multiple inheritances and nested relationships and their representation in an ontology. The other is that most of these methods do not measure the semantic retention of the transformation process. In this study, we focused on constructing a rule set to convert the semantics in a geological database into a geological ontology. According to the relational schema of a geological database, a conversion approach is presented to convert a geological spatial database to an OWL-based geological ontology, which is based on identifying semantics such as entities, relationships, inheritance relationships, nested relationships and cluster relationships. The semantic integrity of the transformation was verified using an inverse mapping process. In a geological ontology, inheritance and union operations between superclass and subclass were used to represent the nested relationships in a geochronology and the multiple inheritances
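
    Typical transformation rules of this kind (table to class, row to individual, plain column to datatype property, foreign key to object property) can be sketched with sqlite3 and rdflib as below; the schema and IRIs are invented, and a real rule set covers many more relationship types.

```python
# Sketch of common relational-to-ontology rules; schema and IRIs are invented.
import sqlite3
from rdflib import Graph, Literal, Namespace, OWL, RDF

GEO = Namespace("http://example.org/geology#")
g = Graph()
g.bind("geo", GEO)

g.add((GEO.RockUnit, RDF.type, OWL.Class))            # Rule 1: table -> class

conn = sqlite3.connect("geology.db")                  # placeholder database
for uid, name, era_id in conn.execute("SELECT id, name, era_id FROM rock_unit"):
    ind = GEO[f"rock_unit_{uid}"]
    g.add((ind, RDF.type, GEO.RockUnit))              # Rule 2: row -> individual
    g.add((ind, GEO.name, Literal(name)))             # Rule 3: column -> datatype property
    if era_id is not None:                            # Rule 4: foreign key -> object property
        g.add((ind, GEO.inEra, GEO[f"era_{era_id}"]))

print(g.serialize(format="turtle"))
```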

  2. Gradient Learning Algorithms for Ontology Computing

    Gao, Wei; Zhu, Linli

    2014-01-01

    The gradient learning model has been attracting great attention in view of its promising perspectives for applications in statistics, data dimensionality reduction, and other specific fields. In this paper, we propose a new gradient learning model for ontology similarity measuring and ontology mapping in the multi-dividing setting. The sample error in this setting is given by virtue of the hypothesis space and the ontology dividing operator. Finally, two experiments on the plant and humanoid robotics fields verify the efficiency of the new computation model for ontology similarity measurement and ontology mapping applications in the multi-dividing setting. PMID:25530752

  3. Gradient Learning Algorithms for Ontology Computing

    Wei Gao

    2014-01-01

    Full Text Available The gradient learning model has been attracting great attention in view of its promising perspectives for applications in statistics, data dimensionality reduction, and other specific fields. In this paper, we propose a new gradient learning model for ontology similarity measuring and ontology mapping in the multi-dividing setting. The sample error in this setting is given by virtue of the hypothesis space and the ontology dividing operator. Finally, two experiments on the plant and humanoid robotics fields verify the efficiency of the new computation model for ontology similarity measurement and ontology mapping applications in the multi-dividing setting.

  4. Community Based Informatics: Geographical Information Systems, Remote Sensing and Ontology collaboration - A technical hands-on approach

    Branch, B. D.; Raskin, R. G.; Rock, B.; Gagnon, M.; Lecompte, M. A.; Hayden, L. B.

    2009-12-01

    With the nation challenged to comply with Executive Order 12906 and to augment the Science, Technology, Engineering and Mathematics (STEM) pipeline, the applied focus on the geosciences pipeline may be at risk. The geosciences pipeline may require intentional K-12 standard course of study consideration in the form of project-based, science-based and evidence-based learning. Thus, the K-12 to geosciences to informatics pipeline may benefit from an earth science experience that uses a community-based “learning by doing” approach. Terms such as Community GIS, Community Remote Sensing, and Community Based Ontology development are collectively termed Community Informatics. Here, interdisciplinary approaches to promoting earth science literacy are affordable, relying on low-cost equipment and imparting the GIS/remote sensing data processing skills needed in the workforce. Hence, informal community ontology development may evolve or mature from a local community towards formal scientific community collaboration. Such consideration may become a means to engage educational policy towards earth science paradigms and needs, specifically linking synergy among the Math, Computer Science, and Earth Science disciplines.

  5. Derivation of Event-B Models from OWL Ontologies

    Alkhammash Eman H.

    2016-01-01

    Full Text Available The derivation of formal specifications from large and complex requirements is a key challenge in systems engineering. In this paper we present an approach that aims to address this challenge by building formal models from OWL ontologies. An ontology is used in the field of knowledge representation to capture a clear view of the domain and to produce a concise and unambiguous set of domain requirements. We harness the power of ontologies to handle inconsistency of domain requirements and produce a clear, concise and unambiguous set of domain requirements for Event-B modelling. The proposed approach works by generating Attempto Controlled English (ACE) from the OWL ontology and then mapping the ACE requirements to Event-B models. ACE is a subset of English that can be unambiguously translated into first-order logic. There is an injective mapping between an OWL ontology and a subset of ACE. ACE is a suitable interlingua for producing the mapping between OWL and Event-B models for many reasons. Firstly, ACE is easy to learn and understand; it hides the mathematics of OWL and is natural for everybody to use. Secondly, ACE has a parser that converts ACE texts into Discourse Representation Structures (DRS). Finally, ACE can be extended to target a richer syntactic subset of Event-B, which would ultimately facilitate the translation of ACE requirements to Event-B.

  6. USE OF ONTOLOGIES FOR KNOWLEDGE BASES CREATION TUTORING COMPUTER SYSTEMS

    Cheremisina Lyubov

    2014-11-01

    Full Text Available This paper deals with the use of ontologies in the development of intelligent tutoring systems. We consider the shortcomings of educational software and distance learning systems, the advantages of using ontologies in their design, and the relevance of building educational computer systems on systematic knowledge. We also consider the classification, properties, uses and benefits of ontologies. Two approaches to the problem of ontology mapping are characterized: the first is manual mapping; the second is a comparison of concept names based on their lexical similarity, using special dictionaries. Languages available for the formal description of ontologies are analyzed. A formal mathematical model of ontologies is considered, together with the ontology consistency problem: different developers may create syntactically or semantically heterogeneous ontologies for the same domain, and using them requires compatible translation or mapping. An algorithm for combining ontologies is given. The practical value of developing an ontology for electronic educational resources is characterized, and recommendations for further research and development are given, such as the implementation of other components of system integration, the formalization of integration processes, and the development of universal algorithms for extending ontologies.
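    A minimal sketch of the second mapping approach mentioned above (comparing concept names by lexical similarity with the help of a dictionary); the concept names and synonym table are invented for illustration:

```python
# Toy lexical-similarity matcher: normalize concept names through a small
# synonym dictionary, then compare them with a string similarity ratio.
from difflib import SequenceMatcher

synonyms = {"learner": "student", "tutor": "teacher"}  # toy special dictionary

def normalize(name: str) -> str:
    name = name.lower().replace("_", " ")
    return synonyms.get(name, name)

def lexical_similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

onto_a = ["Student", "Course", "Teacher"]
onto_b = ["Learner", "CourseModule", "Tutor"]

for a in onto_a:
    best = max(onto_b, key=lambda b: lexical_similarity(a, b))
    print(f"{a} -> {best}  (score {lexical_similarity(a, best):.2f})")
```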

  7. The Problem of Being in Latin America: Approaching the Latin American Ontological sentipensar

    JUAN CEPEDA H.

    2017-06-01

    Full Text Available In the following, I endeavor to subvert the classical notion of being found in Western philosophy by following the logic of negation found in the work of Rodolfo Kusch. In order to develop a better understanding of cultural feelings as well as appreciate the natural, the rhythmic and the vital in the Latin American context, I propose that we follow the ontological sentipensar. By using this methodological framework, I seek to reveal a sense of being germane to Latin American intercultural philosophy.

  8. Using Assertion Capabilities of an OWL-Based Ontology for Query Formulation

    Munir, Kamran; Bloodsworth, Peter; McClatchey, Richard

    2008-01-01

    This paper reports on the development of a framework to assist users in formulating relational queries without requiring a complete knowledge of the information structure and access mechanisms of the underlying data sources. The emphasis here is on exploiting the semantic relationships and assertion capabilities of OWL ontologies to assist clinicians in writing complex queries. This has been achieved using both bottom-up and top-down approaches to build an ontology model that serves as the repository for complex end-user queries. Relational database schemas are mapped into the newly generated ontology schema to reinforce the domain ontology being developed. One of the key merits of this approach is that it does not require storing data interpretations, with the added advantage of not storing database instances as part of the domain ontology, which is especially valuable for systems with huge volumes of data.

  9. OntoTrader: An Ontological Web Trading Agent Approach for Environmental Information Retrieval

    Luis Iribarne

    2014-01-01

    Full Text Available Modern Web-based Information Systems (WIS) are becoming increasingly necessary to provide support for users who are in different places and work with different types of information, by facilitating their access to the information, decision making, workgroups, and so forth. The design of these systems requires the use of standardized methods and techniques that enable a common vocabulary to be defined to represent the underlying knowledge. Thus, mediation elements such as traders enrich the interoperability of web components in open distributed systems. These traders must operate with other third-party traders and/or agents in the system, which must also use a common vocabulary for communication between them. This paper presents the OntoTrader architecture, an Ontological Web Trading agent based on the OMG ODP trading standard. It also presents the ontology needed by some system agents to communicate with the trading agent, and the behavioral framework for the SOLERES OntoTrader agent, an Environmental Management Information System (EMIS). This framework implements a “Query-Searching/Recovering-Response” information retrieval model using a trading service, SPARQL notation, and the JADE platform. The paper also presents reflection, delegation, and federation mediation models and describes the formalization, an experimental testing environment in three scenarios, and a tool which allows our proposal to be evaluated and validated.

  10. Defining datasets and creating data dictionaries for quality improvement and research in chronic disease using routinely collected data: an ontology-driven approach

    Simon de Lusignan

    2011-06-01

    Conclusion Adopting an ontology-driven approach to case finding could improve the quality of disease registers and of research based on routine data. It would offer considerable advantages over using limited datasets to define cases. This approach should be considered by those involved in research and quality improvement projects which utilise routine data.

  11. Ontological Planning

    Ahmet Alkan

    2017-12-01

    • Is it possible to redefine ontology within the hierarchical structure of planning? We seek answers to some of these questions within the limited scope of this paper and offer the rest for discussion simply by asking them. In light of these assessments, one of the limited objectives of the paper is to draw attention, based on ontological knowledge grounded in the wholeness of the universe, to the question of whether the ontological realities of man, energy and movements of thought, as important factors affecting mankind, can provide macro-level data for planning on a universal scale.

  12. Empirical Phenomenon, Subjective Construction And Ontological Truth (An Analysis of Problems of Scientific Explanation and the Critical Realism Approach)

    Faramarz Taghilou

    2014-12-01

    Full Text Available Both the positivist and negativist frameworks of explanation share the naturalist proposition that, unlike in metaphysical philosophy, reality is embedded only at the experimental level. Therefore, the scientific explanation of natural and social phenomena should refer to this experimental level in order to be meaningful, verifiable and scientific. But the problem has always been that the principle of causality, a necessary condition for every kind of scientific explanation, is not logically deducible by induction from the experimental level and remains a metaphysical principle. The principle of experimental objectivity as a condition for the verifiability of scientific explanations could not be defended, because experimentation is always embedded in subjectivity and theory. The Kantian idealists, in contrast, considering scientific explanation a mere representation of reality in subjective categories, could not justify experimental knowledge of reality or the rationality of comparison among theories and paradigms. Critical Realism, an important approach in the philosophy of science associated with the works and thought of Roy Bhaskar, tries to solve these problems by resorting to its principles of ontological realism, epistemological relativism, and judgmental rationality. Considering and analyzing these issues of scientific explanation, we focus here on the answers Critical Realism provides. We argue that, under the Critical Realist interpretation, scientific explanation, empirical phenomena, subjective construction and ontological reality reach logical coherence with one another.

  13. Addressing issues in foundational ontology mediation

    Khan, ZC

    2013-09-01

    Full Text Available An approach in achieving semantic interoperability among heterogeneous systems is to offer infrastructure to assist with linking and integration using a foundational ontology. Due to the creation of multiple foundational ontologies, this also means...

  14. Semi-automated ontology generation and evolution

    Stirtzinger, Anthony P.; Anken, Craig S.

    2009-05-01

    Extending the notion of data models or object models, an ontology can provide rich semantic definitions not only for the meta-data but also for the instance data of domain knowledge, making these semantic definitions available in machine-readable form. However, the generation of an effective ontology is a difficult task involving considerable labor and skill. This paper discusses an Ontology Generation and Evolution Processor (OGEP) aimed at automating this process, requesting user input only when unresolvable ambiguous situations occur. OGEP directly attacks the main barrier which prevents automated (or self-learning) ontology generation: the ability to understand the meaning of artifacts and the relationships the artifacts have to the domain space. OGEP leverages existing lexical-to-ontological mappings in the form of WordNet and the Suggested Upper Merged Ontology (SUMO), integrated with a semantic pattern-based structure referred to as the Semantic Grounding Mechanism (SGM) and implemented as a Corpus Reasoner. OGEP processing is initiated by a Corpus Parser performing a lexical analysis of the corpus, reading in a document (or corpus) and preparing it for processing by annotating words and phrases. After the Corpus Parser is done, the Corpus Reasoner uses the parts-of-speech output to determine the semantic meaning of a word or phrase. The Corpus Reasoner is the crux of the OGEP system, analyzing, extrapolating, and evolving data from free text into cohesive semantic relationships. The Semantic Grounding Mechanism provides a basis for identifying and mapping semantic relationships. By blending together the WordNet lexicon and the SUMO ontological layout, the SGM is given breadth and depth in its ability to extrapolate semantic relationships between domain entities. The combination of all these components results in an innovative approach to user-assisted semantic-based ontology generation. This paper will describe the OGEP technology in the context of the architectural
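    In the spirit of the lexical-to-ontological grounding described above (this is not OGEP code; it assumes NLTK with the WordNet corpus installed and uses invented candidate terms), a sketch of grounding parser output in WordNet:

```python
# Illustrative sketch: look up candidate terms from a parsed corpus in WordNet
# and surface hypernym chains as candidate is-a relations.
# Assumes NLTK and its WordNet corpus are installed (nltk.download("wordnet")).
from nltk.corpus import wordnet as wn

candidate_terms = ["aircraft", "helicopter", "runway"]  # hypothetical parser output

for term in candidate_terms:
    synsets = wn.synsets(term, pos=wn.NOUN)
    if not synsets:
        continue  # an unknown or ambiguous term would be handed back to the user
    sense = synsets[0]                       # naive choice: most frequent sense
    hypernyms = [h.name() for h in sense.hypernyms()]
    print(f"{term}: sense={sense.name()}, is-a candidates={hypernyms}")
```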

  15. A two-staged approach to developing and evaluating an ontology for delivering personalized education to diabetic patients.

    Quinn, Susan; Bond, Raymond; Nugent, Chris

    2018-09-01

    Ontologies are often used in biomedical and health domains to provide a concise and consistent means of attributing meaning to medical terminology. While domain specialists are typically novices in terms of ontology engineering, their evaluation of an ontology provides an opportunity to enhance its objectivity, accuracy, and coverage of the domain itself. This paper evaluates the viability of using ontology engineering novices to evaluate and enrich an ontology that can be used for personalized diabetic patient education. We describe a methodology for engaging healthcare and information technology specialists with a range of ontology engineering tasks. We used 87.8% of the data collected to validate the accuracy of our ontological model. The contributions also enabled a 16% increase in the class size and an 18% increase in object properties. Furthermore, we propose that ontology engineering novices can make valuable contributions to ontology development. Application-specific evaluation of the ontology using a semantic-web-based architecture is also discussed.

  16. Ontology through a Mindfulness Process

    Bearance, Deborah; Holmes, Kimberley

    2015-01-01

    Traditionally, when ontology is taught in a graduate studies course on social research, there is a tendency for this concept to be examined through the process of lectures and readings. Such an approach often leaves graduate students to grapple with a personal embodiment of this concept and to comprehend how ontology can ground their research.…

  17. Ontology: ambiguity and accuracy

    Marcelo Schiessl

    2012-08-01

    Full Text Available Ambiguity is a major obstacle to information retrieval and the subject of several lines of research in Information Science. Ontologies have been studied as a way to solve problems related to ambiguity. Paradoxically, the term “ontology” is itself ambiguous and is understood according to the community using it. Philosophy and Computer Science seem to have the most pronounced differences over the term's sense. The former holds undisputed tradition and authority; the latter, despite being quite recent, holds an informal but pragmatic sense. Information Science ranges from philosophical to computational approaches so as to obtain organized collections based on a balance between users' needs and the available information. The semantic web requires automation of the information cycle and demands studies related to ontologies. Consequently, revisiting relevant approaches to the study of ontologies plays an important role in providing useful ideas to researchers while maintaining philosophical rigor and the convenience provided by computers.

  18. A meta-ontological framework for multi-agent systems design

    Sokolova, Marina; Fernández Caballero, Antonio

    2007-01-01

    The paper introduces an approach to using a meta-ontology framework for complex multi-agent systems design, and illustrates it in an application related to ecological-medical issues. The described shared ontology is pooled from private sub-ontologies, which represent a problem area ontology, an agent ontology, a task ontology, an ontology of interactions, and the multi-agent system architecture ontology.

  19. SUGOI: automated ontology interchangeability

    Khan, ZC

    2015-04-01

    Full Text Available A foundational ontology can solve interoperability issues among the domain ontologies aligned to it. However, several foundational ontologies have been developed, hence such interoperability issues exist among domain ontologies. The novel SUGOI tool...

  20. Ontology Based Resolution of Semantic Conflicts in Information Integration

    LU Han; LI Qing-zhong

    2004-01-01

    A semantic conflict is a conflict caused by heterogeneous systems using different ways to express the same real-world entity. This prevents information integration from achieving semantic coherence. Since ontologies help to solve semantic problems, this area has become a hot topic in information integration. In this paper, we introduce semantic conflict into the information integration of heterogeneous applications. We discuss the origins and categories of the conflict, and present an ontology-based schema mapping approach to eliminate semantic conflicts.

  1. Learning expressive ontologies

    Völker, J

    2009-01-01

    This publication advances the state-of-the-art in ontology learning by presenting a set of novel approaches to the semi-automatic acquisition, refinement and evaluation of logically complex axiomatizations. It has been motivated by the fact that the realization of the semantic web envisioned by Tim Berners-Lee is still hampered by the lack of ontological resources, while at the same time more and more applications of semantic technologies emerge from fast-growing areas such as e-business or life sciences. Such knowledge-intensive applications, requiring large scale reasoning over complex domai

  2. Ontological engineering versus metaphysics

    Tataj, Emanuel; Tomanek, Roman; Mulawka, Jan

    2011-10-01

    It has been recognized that ontologies are a semantic version of the World Wide Web and can be found in knowledge-based systems. A recent survey of this field also suggests that practical artificial intelligence systems may be motivated by this research. Strong artificial intelligence in particular, as well as the concept of the homo computer, can also benefit from their use. The main objective of this contribution is to present and review already created ontologies and identify the main advantages that such an approach brings to knowledge management systems. We would like to present what ontological engineering borrows from metaphysics and what feedback it can provide to natural language processing, simulation and modelling. Potential topics of further development from a philosophical point of view are also outlined.

  3. The Knowledge Sharing Based on PLIB Ontology and XML for Collaborative Product Commerce

    Ma, Jun; Luo, Guofu; Li, Hao; Xiao, Yanqiu

    Collaborative Product Commerce (CPC) has become a brand-new commerce mode for manufacturing. To promote more efficient information exchange in CPC, a knowledge-sharing framework based on the PLIB (ISO 13584) ontology and XML was presented, and its implementation method was studied. First, according to the methodology of PLIB (ISO 13584), a common ontology, the PLIB ontology, was put forward, which provides a coherent conceptual meaning within the context of the CPC domain. Meanwhile, to enable knowledge interchange via the Internet, the PLIB ontology formalized in EXPRESS was converted into XML Schema, and two mapping methods were presented: a direct mapping approach and a meta-level mapping approach, of which the latter was adopted. Based on the above work, a parts resource knowledge-sharing framework (CPC-KSF) was put forward and realized, and it has been applied in collaborative product commerce for automotive component manufacturing.

  4. Brain maps 4.0-Structure of the rat brain: An open access atlas with global nervous system nomenclature ontology and flatmaps.

    Swanson, Larry W

    2018-04-15

    The fourth edition (following editions in 1992, 1998, 2004) of Brain maps: structure of the rat brain is presented here as an open access internet resource for the neuroscience community. One new feature is a set of 10 hierarchical nomenclature tables that define and describe all parts of the rat nervous system within the framework of a strictly topographic system devised previously for the human nervous system. These tables constitute a global ontology for knowledge management systems dealing with neural circuitry. A second new feature is an aligned atlas of bilateral flatmaps illustrating rat nervous system development from the neural plate stage to the adult stage, where most gray matter regions, white matter tracts, ganglia, and nerves listed in the nomenclature tables are illustrated schematically. These flatmaps are convenient for future development of online applications analogous to "Google Maps" for systems neuroscience. The third new feature is a completely revised Atlas of the rat brain in spatially aligned transverse sections that can serve as a framework for 3-D modeling. Atlas parcellation is little changed from the preceding edition, but the nomenclature for rat is now aligned with an emerging panmammalian neuroanatomical nomenclature. All figures are presented in Adobe Illustrator vector graphics format that can be manipulated, modified, and resized as desired, and freely used with a Creative Commons license. © 2018 The Authors The Journal of Comparative Neurology Published by Wiley Periodicals, Inc.

  5. BioPortal: An Open-Source Community-Based Ontology Repository

    Noy, N.; NCBO Team

    2011-12-01

    Advances in computing power and new computational techniques have changed the way researchers approach science. In many fields, one of the most fruitful approaches has been to use semantically aware software to break down the barriers among disparate domains, systems, data sources, and technologies. Such software facilitates data aggregation, improves search, and ultimately allows the detection of new associations that were previously not detectable. Achieving these analyses requires software systems that take advantage of the semantics and that can intelligently negotiate domains and knowledge sources, identifying commonality across systems that use different and conflicting vocabularies, while understanding apparent differences that may be concealed by the use of superficially similar terms. An ontology, a semantically rich vocabulary for a domain of interest, is the cornerstone of software for bridging systems, domains, and resources. However, as ontologies become the foundation of all semantic technologies in e-science, we must develop an infrastructure for sharing ontologies, finding and evaluating them, integrating and mapping among them, and using ontologies in applications that help scientists process their data. BioPortal [1] is an open-source on-line community-based ontology repository that has been used as a critical component of semantic infrastructure in several domains, including biomedicine and bio-geochemical data. BioPortal uses the social approaches in the Web 2.0 style to bring structure and order to the collection of biomedical ontologies. It enables users to provide and discuss a wide array of knowledge components, from submitting the ontologies themselves, to commenting on and discussing classes in the ontologies, to reviewing ontologies in the context of their own ontology-based projects, to creating mappings between overlapping ontologies and discussing and critiquing the mappings. Critically, it provides web-service access to all its

  6. Annotating the human genome with Disease Ontology

    Osborne, John D; Flatow, Jared; Holko, Michelle; Lin, Simon M; Kibbe, Warren A; Zhu, Lihua (Julie); Danila, Maria I; Feng, Gang; Chisholm, Rex L

    2009-01-01

    Background The human genome has been extensively annotated with Gene Ontology for biological functions, but minimally computationally annotated for diseases. Results We used the Unified Medical Language System (UMLS) MetaMap Transfer tool (MMTx) to discover gene-disease relationships from the GeneRIF database. We utilized a comprehensive subset of UMLS, which is disease-focused and structured as a directed acyclic graph (the Disease Ontology), to filter and interpret results from MMTx. The results were validated against the Homayouni gene collection using recall and precision measurements. We compared our results with the widely used Online Mendelian Inheritance in Man (OMIM) annotations. Conclusion The validation data set suggests a 91% recall rate and 97% precision rate of disease annotation using GeneRIF, in contrast with a 22% recall and 98% precision using OMIM. Our thesaurus-based approach allows comparisons to be made between disease-containing databases and allows for increased accuracy in disease identification through synonym matching. The much higher recall rate of our approach demonstrates that annotating the human genome with the Disease Ontology and GeneRIF dramatically increases the coverage of disease annotation of the human genome. PMID:19594883
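    A toy sketch of the validation arithmetic used above, computing recall and precision of predicted gene-disease annotations against a reference set; the annotation pairs are made up:

```python
# Toy recall/precision computation over gene-disease annotation pairs.
reference = {("BRCA1", "breast cancer"), ("TP53", "Li-Fraumeni syndrome"),
             ("CFTR", "cystic fibrosis")}
predicted = {("BRCA1", "breast cancer"), ("CFTR", "cystic fibrosis"),
             ("CFTR", "asthma")}

true_positives = reference & predicted
recall = len(true_positives) / len(reference)      # coverage of the reference set
precision = len(true_positives) / len(predicted)   # correctness of the predictions
print(f"recall={recall:.2f}, precision={precision:.2f}")  # recall=0.67, precision=0.67
```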

  7. PAV ontology: provenance, authoring and versioning.

    Ciccarese, Paolo; Soiland-Reyes, Stian; Belhajjame, Khalid; Gray, Alasdair Jg; Goble, Carole; Clark, Tim

    2013-11-22

    Provenance is a critical ingredient for establishing trust of published scientific content. This is true whether we are considering a data set, a computational workflow, a peer-reviewed publication or a simple scientific claim with supportive evidence. Existing vocabularies such as Dublin Core Terms (DC Terms) and the W3C Provenance Ontology (PROV-O) are domain-independent and general-purpose and they allow and encourage extensions to cover more specific needs. In particular, to track authoring and versioning information of web resources, PROV-O provides a basic methodology but not any specific classes and properties for identifying or distinguishing between the various roles assumed by agents manipulating digital artifacts, such as author, contributor and curator. We present the Provenance, Authoring and Versioning ontology (PAV, namespace http://purl.org/pav/): a lightweight ontology for capturing "just enough" descriptions essential for tracking the provenance, authoring and versioning of web resources. We argue that such descriptions are essential for digital scientific content. PAV distinguishes between contributors, authors and curators of content and creators of representations in addition to the provenance of originating resources that have been accessed, transformed and consumed. We explore five projects (and communities) that have adopted PAV illustrating their usage through concrete examples. Moreover, we present mappings that show how PAV extends the W3C PROV-O ontology to support broader interoperability. The initial design of the PAV ontology was driven by requirements from the AlzSWAN project with further requirements incorporated later from other projects detailed in this paper. The authors strived to keep PAV lightweight and compact by including only those terms that have demonstrated to be pragmatically useful in existing applications, and by recommending terms from existing ontologies when plausible. We analyze and compare PAV with related
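    A minimal sketch, assuming rdflib, of annotating a web resource with PAV-style provenance; the property names used (pav:authoredBy, pav:curatedBy, pav:version, pav:retrievedFrom) reflect the roles described above and should be checked against the published ontology at http://purl.org/pav/:

```python
# Minimal sketch (assumption: rdflib) of describing authoring, curation and
# versioning of a web resource with PAV-style terms; resources are invented.
from rdflib import Graph, Namespace, URIRef, Literal

PAV = Namespace("http://purl.org/pav/")
EX = Namespace("http://example.org/")

g = Graph()
g.bind("pav", PAV)

doc = EX["dataset/v2"]
g.add((doc, PAV.authoredBy, EX["people/alice"]))      # who created the content
g.add((doc, PAV.curatedBy, EX["people/bob"]))         # who maintained/curated it
g.add((doc, PAV.version, Literal("2.0")))
g.add((doc, PAV.retrievedFrom, URIRef("http://example.org/source.csv")))

print(g.serialize(format="turtle"))
```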

  8. Building a biomedical ontology recommender web service

    Jonquet Clement

    2010-06-01

    Full Text Available Abstract Background Researchers in biomedical informatics use ontologies and terminologies to annotate their data in order to facilitate data integration and translational discoveries. As the use of ontologies for annotation of biomedical datasets has risen, a common challenge is to identify ontologies that are best suited to annotating specific datasets. The number and variety of biomedical ontologies is large, and it is cumbersome for a researcher to figure out which ontology to use. Methods We present the Biomedical Ontology Recommender web service. The system uses textual metadata or a set of keywords describing a domain of interest and suggests appropriate ontologies for annotating or representing the data. The service makes a decision based on three criteria. The first one is coverage, or the ontologies that provide most terms covering the input text. The second is connectivity, or the ontologies that are most often mapped to by other ontologies. The final criterion is size, or the number of concepts in the ontologies. The service scores the ontologies as a function of scores of the annotations created using the National Center for Biomedical Ontology (NCBO) Annotator web service. We used all the ontologies from the UMLS Metathesaurus and the NCBO BioPortal. Results We compare and contrast our Recommender by an exhaustive functional comparison to previously published efforts. We evaluate and discuss the results of several recommendation heuristics in the context of three real world use cases. The best recommendation heuristics, rated ‘very relevant’ by expert evaluators, are the ones based on coverage and connectivity criteria. The Recommender service (alpha version) is available to the community and is embedded into BioPortal.
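    An illustrative sketch of combining the three criteria named above (coverage, connectivity, size) into a single recommendation score; the weights, numbers and normalization are invented and do not reproduce the NCBO Recommender's actual scoring:

```python
# Toy ranking of candidate ontologies by a weighted, normalized combination of
# coverage, connectivity and size. All figures below are made up.
candidates = {
    # ontology: (terms covering the input text, ontologies mapping to it, #concepts)
    "SNOMEDCT": (42, 35, 300_000),
    "MESH":     (37, 50, 250_000),
    "DOID":     (25, 20, 10_000),
}

def score(coverage, connectivity, size, w=(0.6, 0.3, 0.1)):
    max_cov = max(c for c, _, _ in candidates.values())
    max_con = max(c for _, c, _ in candidates.values())
    max_size = max(s for _, _, s in candidates.values())
    return (w[0] * coverage / max_cov
            + w[1] * connectivity / max_con
            + w[2] * size / max_size)

ranked = sorted(candidates, key=lambda o: score(*candidates[o]), reverse=True)
print(ranked)  # ontology acronyms ordered by combined recommendation score
```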

  9. Developing an Ontology for Ocean Biogeochemistry Data

    Chandler, C. L.; Allison, M. D.; Groman, R. C.; West, P.; Zednik, S.; Maffei, A. R.

    2010-12-01

    Semantic Web technologies offer great promise for enabling new and better scientific research. However, significant challenges must be met before the promise of the Semantic Web can be realized for a discipline as diverse as oceanography. Evolving expectations for open access to research data combined with the complexity of global ecosystem science research themes present a significant challenge, and one that is best met through an informatics approach. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) is funded by the National Science Foundation Division of Ocean Sciences to work with ocean biogeochemistry researchers to improve access to data resulting from their respective programs. In an effort to improve data access, BCO-DMO staff members are collaborating with researchers from the Tetherless World Constellation (Rensselaer Polytechnic Institute) to develop an ontology that formally describes the concepts and relationships in the data managed by the BCO-DMO. The project required transforming a legacy system of human-readable, flat metadata files first into well-ordered controlled vocabularies and then into a fully developed ontology. To improve semantic interoperability, terms from the BCO-DMO controlled vocabularies are being mapped to controlled vocabulary terms adopted by other oceanographic data management organizations. While the entire process has proven to be difficult, time-consuming and labor-intensive, the work has been rewarding and is a necessary prerequisite for the eventual incorporation of Semantic Web tools. From the beginning of the project, development of the ontology has been guided by a use case based approach. The use cases were derived from data access related requests received from members of the research community served by the BCO-DMO. The resultant ontology satisfies the requirements of the use cases and reflects the information stored in the metadata database. The BCO-DMO metadata database currently contains information that

  10. A Statistical Ontology-Based Approach to Ranking for Multiword Search

    Kim, Jinwoo

    2013-01-01

    Keyword search is a prominent data retrieval method for the Web, largely because the simple and efficient nature of keyword processing allows a large amount of information to be searched with fast response. However, keyword search approaches do not formally capture the clear meaning of a keyword query and fail to address the semantic relationships…

  11. Automatic Generation of Analogy Questions for Student Assessment: An Ontology-Based Approach

    Alsubait, Tahani; Parsia, Bijan; Sattler, Uli

    2012-01-01

    Different computational models for generating analogies of the form "A is to B as C is to D" have been proposed over the past 35 years. However, analogy generation is a challenging problem that requires further research. In this article, we present a new approach for generating analogies in Multiple Choice Question (MCQ) format that can be used…

  12. Building ontologies with basic formal ontology

    Arp, Robert; Spear, Andrew D.

    2015-01-01

    In the era of "big data," science is increasingly information driven, and the potential for computers to store, manage, and integrate massive amounts of data has given rise to such new disciplinary fields as biomedical informatics. Applied ontology offers a strategy for the organization of scientific information in computer-tractable form, drawing on concepts not only from computer and information science but also from linguistics, logic, and philosophy. This book provides an introduction to the field of applied ontology that is of particular relevance to biomedicine, covering theoretical components of ontologies, best practices for ontology design, and examples of biomedical ontologies in use. After defining an ontology as a representation of the types of entities in a given domain, the book distinguishes between different kinds of ontologies and taxonomies, and shows how applied ontology draws on more traditional ideas from metaphysics. It presents the core features of the Basic Formal Ontology (BFO), now u...

  13. Ontology authoring with Forza

    Keet, CM

    2014-11-01

    Full Text Available Generic, reusable ontology elements, such as a foundational ontology's categories and part-whole relations, are essential for good and interoperable knowledge representation. Ontology developers, who include domain experts and novices, face...

  14. Building a developmental toxicity ontology.

    Baker, Nancy; Boobis, Alan; Burgoon, Lyle; Carney, Edward; Currie, Richard; Fritsche, Ellen; Knudsen, Thomas; Laffont, Madeleine; Piersma, Aldert H; Poole, Alan; Schneider, Steffen; Daston, George

    2018-04-03

    As more information is generated about modes of action for developmental toxicity and more data are generated using high-throughput and high-content technologies, it is becoming necessary to organize that information. This report discusses the need for a systematic representation of knowledge about developmental toxicity (i.e., an ontology) and proposes a method to build one based on knowledge of developmental biology and mode of action/adverse outcome pathways in developmental toxicity. This report is the result of a consensus working group developing a plan to create an ontology for developmental toxicity that spans multiple levels of biological organization. This report provides a description of some of the challenges in building a developmental toxicity ontology and outlines a proposed methodology to meet those challenges. As the ontology is built on currently available web-based resources, a review of these resources is provided. Case studies on one of the most well-understood morphogens and developmental toxicants, retinoic acid, are presented as examples of how such an ontology might be developed. This report outlines an approach to constructing a developmental toxicity ontology. Such an ontology will facilitate computer-based prediction of substances likely to induce human developmental toxicity. © 2018 Wiley Periodicals, Inc.

  15. Matching disease and phenotype ontologies in the ontology alignment evaluation initiative.

    Harrow, Ian; Jiménez-Ruiz, Ernesto; Splendiani, Andrea; Romacker, Martin; Woollard, Peter; Markel, Scott; Alam-Faruque, Yasmin; Koch, Martin; Malone, James; Waaler, Arild

    2017-12-02

    The disease and phenotype track was designed to evaluate the relative performance of ontology matching systems that generate mappings between source ontologies. Disease and phenotype ontologies are important for applications such as data mining, data integration and knowledge management to support translational science in drug discovery and understanding the genetics of disease. Eleven systems (out of 21 OAEI participating systems) were able to cope with at least one of the tasks in the Disease and Phenotype track. AML, FCA-Map, LogMap(Bio) and PhenoMF systems produced the top results for ontology matching in comparison to consensus alignments. The results against manually curated mappings proved to be more difficult most likely because these mapping sets comprised mostly subsumption relationships rather than equivalence. Manual assessment of unique equivalence mappings showed that AML, LogMap(Bio) and PhenoMF systems have the highest precision results. Four systems gave the highest performance for matching disease and phenotype ontologies. These systems coped well with the detection of equivalence matches, but struggled to detect semantic similarity. This deserves more attention in the future development of ontology matching systems. The findings of this evaluation show that such systems could help to automate equivalence matching in the workflow of curators, who maintain ontology mapping services in numerous domains such as disease and phenotype.

  16. Brain maps 4.0—Structure of the rat brain: An open access atlas with global nervous system nomenclature ontology and flatmaps

    2018-01-01

    Abstract The fourth edition (following editions in 1992, 1998, 2004) of Brain maps: structure of the rat brain is presented here as an open access internet resource for the neuroscience community. One new feature is a set of 10 hierarchical nomenclature tables that define and describe all parts of the rat nervous system within the framework of a strictly topographic system devised previously for the human nervous system. These tables constitute a global ontology for knowledge management systems dealing with neural circuitry. A second new feature is an aligned atlas of bilateral flatmaps illustrating rat nervous system development from the neural plate stage to the adult stage, where most gray matter regions, white matter tracts, ganglia, and nerves listed in the nomenclature tables are illustrated schematically. These flatmaps are convenient for future development of online applications analogous to “Google Maps” for systems neuroscience. The third new feature is a completely revised Atlas of the rat brain in spatially aligned transverse sections that can serve as a framework for 3‐D modeling. Atlas parcellation is little changed from the preceding edition, but the nomenclature for rat is now aligned with an emerging panmammalian neuroanatomical nomenclature. All figures are presented in Adobe Illustrator vector graphics format that can be manipulated, modified, and resized as desired, and freely used with a Creative Commons license. PMID:29277900

  17. An ontology-based approach to patient follow-up assessment for continuous and personalized chronic disease management.

    Zhang, Yi-Fan; Gou, Ling; Zhou, Tian-Shu; Lin, De-Nan; Zheng, Jing; Li, Ye; Li, Jing-Song

    2017-08-01

    Chronic diseases are complex and persistent clinical conditions that require close collaboration among patients and health care providers in the implementation of long-term and integrated care programs. However, current solutions focus partially on intensive interventions at hospitals rather than on continuous and personalized chronic disease management. This study aims to fill this gap by providing computerized clinical decision support during follow-up assessments of chronically ill patients at home. We proposed an ontology-based framework to integrate patient data, medical domain knowledge, and patient assessment criteria for chronic disease patient follow-up assessments. A clinical decision support system was developed to implement this framework for automatic selection and adaptation of standard assessment protocols to suit patient personal conditions. We evaluated our method in the case study of type 2 diabetic patient follow-up assessments. The proposed framework was instantiated using real data from 115,477 follow-up assessment records of 36,162 type 2 diabetic patients. Standard evaluation criteria were automatically selected and adapted to the particularities of each patient. Assessment results were generated as a general typing of patient overall condition and detailed scoring for each criterion, providing important indicators to the case manager about possible inappropriate judgments, in addition to raising patient awareness of their disease control outcomes. Using historical data as the gold standard, our system achieved a rate of accuracy of 99.93% and completeness of 95.00%. This study contributes to improving the accessibility, efficiency and quality of current patient follow-up services. It also provides a generic approach to knowledge sharing and reuse for patient-centered chronic disease management. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Gene Ontology

    Gaston K. Mazandu

    2012-01-01

    Full Text Available The wide coverage and biological relevance of the Gene Ontology (GO), confirmed through its successful use in protein function prediction, have led to the growth in its popularity. In order to exploit the extent of biological knowledge that GO offers in describing genes or groups of genes, there is a need for an efficient, scalable similarity measure for GO terms and GO-annotated proteins. While several GO similarity measures exist, none adequately addresses all issues surrounding the design and usage of the ontology. We introduce a new metric for measuring the distance between two GO terms using the intrinsic topology of the GO-DAG, thus enabling the measurement of functional similarities between proteins based on their GO annotations. We assess the performance of this metric using a ROC analysis on human protein-protein interaction datasets and correlation coefficient analysis on the selected set of protein pairs from the CESSM online tool. This metric achieves good performance compared to the existing annotation-based GO measures. We used this new metric to assess functional similarity between orthologues, and show that it is effective at determining whether orthologues are annotated with similar functions and identifying cases where annotation is inconsistent between orthologues.
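    A toy sketch of a topology-based distance on a miniature GO-like DAG (shortest path through a shared ancestor); this illustrates the general idea, not the specific metric proposed in the paper:

```python
# Toy GO-DAG distance: the shortest path between two terms that passes through
# a common ancestor, computed on a tiny hand-made DAG (child -> parents).
parents = {
    "GO:B": {"GO:A"}, "GO:C": {"GO:A"},
    "GO:D": {"GO:B"}, "GO:E": {"GO:B", "GO:C"},
}

def ancestors_with_depth(term):
    """All ancestors of a term, each with the number of edges needed to reach it."""
    result, frontier = {term: 0}, [term]
    while frontier:
        node = frontier.pop()
        for p in parents.get(node, ()):
            if p not in result or result[node] + 1 < result[p]:
                result[p] = result[node] + 1
                frontier.append(p)
    return result

def dag_distance(t1, t2):
    a1, a2 = ancestors_with_depth(t1), ancestors_with_depth(t2)
    common = set(a1) & set(a2)
    return min(a1[c] + a2[c] for c in common)  # shortest path via a shared ancestor

print(dag_distance("GO:D", "GO:E"))  # 2: one edge each up to their shared parent GO:B
```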

  19. Learning and Sharing Technology in Informal Contexts. A Multiagent-Based Ontological Approach

    Dino Borri

    2014-05-01

    Full Text Available A growing debate is underway today, in both academic and research-in-action contexts, about the roles of new and traditional technologies in raising the knowledge of the agents involved, as well as in boosting the effective development of communities. The last century was largely dominated by capital-intensive technologies impacting large and populated areas. From the late 1990s up to the present day, due to social, financial and environmental concerns, new low-impact, locally born, small- to medium-scale experiences have been challenging large technologies, with interesting results. The importance of such experiences seems to lie in the abilities and knowledge of local populations, which rarely emerge as formal methodologies or attain recognizable levels of generalization and sharing. Yet the effectiveness of local-based technologies is being increasingly documented, often succeeding in cases where more formal technologies had previously failed. The EU-funded ANTINOMOS project has largely dealt with enhancing and managing local-community knowledge in the water sector, aiming at creating a real learning environment for the sharing and the active generation of knowledge through mutual synergies. In this paper, the above subject is discussed and carried out with a cross-disciplinary, cross-scale, multi-agent approach, considering the different forms of local knowledge and language involved.

  20. A multiscale approach to mapping seabed sediments.

    Benjamin Misiuk

    Full Text Available Benthic habitat maps, including maps of seabed sediments, have become critical spatial-decision support tools for marine ecological management and conservation. Despite the increasing recognition that environmental variables should be considered at multiple spatial scales, variables used in habitat mapping are often implemented at a single scale. The objective of this study was to evaluate the potential for using environmental variables at multiple scales for modelling and mapping seabed sediments. Sixteen environmental variables were derived from multibeam echosounder data collected near Qikiqtarjuaq, Nunavut, Canada at eight spatial scales ranging from 5 to 275 m, and were tested as predictor variables for modelling seabed sediment distributions. Using grain size data obtained from grab samples, we tested which scales of each predictor variable contributed most to sediment models. Results showed that the default scale was often not the best. Out of 129 potential scale-dependent variables, 11 were selected to model the additive log-ratio of mud and sand at five different scales, and 15 were selected to model the additive log-ratio of gravel and sand, also at five different scales. Boosted Regression Tree models that explained between 46.4 and 56.3% of statistical deviance produced multiscale predictions of mud, sand, and gravel that were correlated with cross-validated test data (Spearman's ρ for mud = 0.77, sand = 0.71, gravel = 0.58). Predictions of individual size fractions were classified to produce a map of seabed sediments that is useful for marine spatial planning. Based on the scale-dependence of variables in this study, we concluded that spatial scale consideration is at least as important as variable selection in seabed mapping.
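    A minimal sketch of the modelling step described above, under stated assumptions: synthetic predictors, an additive log-ratio (alr) transform of mud/sand/gravel fractions, and scikit-learn's GradientBoostingRegressor standing in for the Boosted Regression Trees used in the study:

```python
# Minimal sketch with synthetic data: alr-transform compositional sediment
# fractions, fit boosted regressors, and back-transform predictions so the
# mud/sand/gravel proportions sum to one. Not the study's data or model code.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 5))                           # stand-ins for multiscale terrain variables
fractions = rng.dirichlet([2.0, 5.0, 1.0], size=n)    # columns: mud, sand, gravel

alr_mud = np.log(fractions[:, 0] / fractions[:, 1])     # log(mud / sand)
alr_gravel = np.log(fractions[:, 2] / fractions[:, 1])  # log(gravel / sand)

model_mud = GradientBoostingRegressor(random_state=0).fit(X, alr_mud)
model_gravel = GradientBoostingRegressor(random_state=0).fit(X, alr_gravel)

# Back-transform predictions to proportions that sum to one.
m, g = np.exp(model_mud.predict(X)), np.exp(model_gravel.predict(X))
sand = 1.0 / (1.0 + m + g)
print(np.column_stack([m * sand, sand, g * sand])[:3])  # predicted mud, sand, gravel
```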

  1. A Mobile Army of Ontologies

    Juul, Jesper

    2015-01-01

    Presentation at the Ludo-ontologies panel. Do we need ludo-ontologies, and what are they? In this event several scholars of games and videogames discuss these questions from a variety of perspectives. What different game and videogame ontologies exist and could exist, and why are they important for game and videogame research? The round table is designed to promote ludo-ontological dialogue in order to make these questions visible and debated. A series of short presentations (approximately 10 minutes each) will be followed by an intense debate through freeform dialogue. After the industrial commercialization of games and videogames their study has shifted between approaches focused on players (ludic processes) and artifacts (ludic objects). Some attempts to analyze the relationship between the process and the object have occasionally been done in terms of ‘ontology’ (Zagal 2005; Leino 2010; Gualeni

  2. Ontological Engineering for the Cadastral Domain

    Stubkjær, Erik; Stuckenschmidt, Heiner

    2000-01-01

    conceptualization of the world is that much information remains implicit. Ontologies have set out to overcome the problem of implicit and hidden knowledge by making the conceptualization of a domain (e.g. mathematics) explicit. Ontological engineering is thus an approach to achieve a conceptual rigor that characterizes established academic disciplines, like geodesy. Many university courses address more application-oriented fields, like cadastral law and spatial planning, and they may benefit from the ontological engineering approach. The paper provides an introduction to the field of ontological engineering...

  3. How Ontologies are Made: Studying the Hidden Social Dynamics Behind Collaborative Ontology Engineering Projects.

    Strohmaier, Markus; Walk, Simon; Pöschko, Jan; Lamprecht, Daniel; Tudorache, Tania; Nyulas, Csongor; Musen, Mark A; Noy, Natalya F

    2013-05-01

    Traditionally, evaluation methods in the field of semantic technologies have focused on the end result of ontology engineering efforts, mainly, on evaluating ontologies and their corresponding qualities and characteristics. This focus has led to the development of a whole arsenal of ontology-evaluation techniques that investigate the quality of ontologies as a product. In this paper, we aim to shed light on the process of ontology engineering construction by introducing and applying a set of measures to analyze hidden social dynamics. We argue that especially for ontologies which are constructed collaboratively, understanding the social processes that have led to its construction is critical not only in understanding but consequently also in evaluating the ontology. With the work presented in this paper, we aim to expose the texture of collaborative ontology engineering processes that is otherwise left invisible. Using historical change-log data, we unveil qualitative differences and commonalities between different collaborative ontology engineering projects. Explaining and understanding these differences will help us to better comprehend the role and importance of social factors in collaborative ontology engineering projects. We hope that our analysis will spur a new line of evaluation techniques that view ontologies not as the static result of deliberations among domain experts, but as a dynamic, collaborative and iterative process that needs to be understood, evaluated and managed in itself. We believe that advances in this direction would help our community to expand the existing arsenal of ontology evaluation techniques towards more holistic approaches.

  4. How Ontologies are Made: Studying the Hidden Social Dynamics Behind Collaborative Ontology Engineering Projects

    Strohmaier, Markus; Walk, Simon; Pöschko, Jan; Lamprecht, Daniel; Tudorache, Tania; Nyulas, Csongor; Musen, Mark A.; Noy, Natalya F.

    2013-01-01

    Traditionally, evaluation methods in the field of semantic technologies have focused on the end result of ontology engineering efforts, mainly, on evaluating ontologies and their corresponding qualities and characteristics. This focus has led to the development of a whole arsenal of ontology-evaluation techniques that investigate the quality of ontologies as a product. In this paper, we aim to shed light on the process of ontology engineering construction by introducing and applying a set of measures to analyze hidden social dynamics. We argue that especially for ontologies which are constructed collaboratively, understanding the social processes that have led to its construction is critical not only in understanding but consequently also in evaluating the ontology. With the work presented in this paper, we aim to expose the texture of collaborative ontology engineering processes that is otherwise left invisible. Using historical change-log data, we unveil qualitative differences and commonalities between different collaborative ontology engineering projects. Explaining and understanding these differences will help us to better comprehend the role and importance of social factors in collaborative ontology engineering projects. We hope that our analysis will spur a new line of evaluation techniques that view ontologies not as the static result of deliberations among domain experts, but as a dynamic, collaborative and iterative process that needs to be understood, evaluated and managed in itself. We believe that advances in this direction would help our community to expand the existing arsenal of ontology evaluation techniques towards more holistic approaches. PMID:24311994

  5. New GIS approaches to wild land mapping in Europe

    Steffen Fritz; Steve Carver; Linda See

    2000-01-01

    This paper outlines modifications and new approaches to wild land mapping developed specifically for the United Kingdom and European areas. In particular, national level reconnaissance and local level mapping of wild land in the UK and Scotland are presented. A national level study for the UK is undertaken, and a local study focuses on the Cairngorm Mountains in...

  6. Fuzzy knowledge bases integration based on ontology

    Ternovoy, Maksym; Shtogrina, Olena

    2012-01-01

    The paper describes an approach for integrating fuzzy knowledge bases with the use of an ontology. The approach is based on using a metadata base to integrate different knowledge bases through a common ontology. The design process of the metadata base is described.

  7. Didactical Ontologies

    Steffen Mencke, Reiner Dumke

    2008-03-01

    Full Text Available Ontologies are a fundamental concept of the Semantic Web envisioned by Tim Berners-Lee [1]. Together with explicit representation of the semantics of data for machine-accessibility, such domain theories are the basis for intelligent next-generation applications for the web and other areas of interest [2]. Their application for special aspects within the domain of e-learning is often proposed to support the increasing complexity ([3], [4], [5], [6]). So they can provide better support for course generation or learning scenario description [7]. By the modeling of didactics-related expertise and their provision for the creators of courses, many improvements like reuse, rapid development and of course increased learning performance become possible due to the separation from other aspects of e-learning platforms, as already proposed in [8].

  8. Matching biomedical ontologies based on formal concept analysis.

    Zhao, Mengyi; Zhang, Songmao; Li, Weizhuo; Chen, Guowei

    2018-03-19

    The goal of ontology matching is to identify correspondences between entities from different yet overlapping ontologies so as to facilitate semantic integration, reuse and interoperability. As a well-developed mathematical model for analyzing individuals and structuring concepts, Formal Concept Analysis (FCA) has been applied to ontology matching (OM) tasks since the beginning of OM research, whereas ontological knowledge exploited in FCA-based methods is limited. This motivates the study in this paper, i.e., to empower FCA with as much ontological knowledge as possible for identifying mappings across ontologies. We propose a method based on Formal Concept Analysis to identify and validate mappings across ontologies, including one-to-one mappings, complex mappings and correspondences between object properties. Our method, called FCA-Map, incrementally generates a total of five types of formal contexts and extracts mappings from the lattices derived. First, the token-based formal context describes how class names, labels and synonyms share lexical tokens, leading to lexical mappings (anchors) across ontologies. Second, the relation-based formal context describes how classes are in taxonomic, partonomic and disjoint relationships with the anchors, leading to positive and negative structural evidence for validating the lexical matching. Third, the positive relation-based context can be used to discover structural mappings. Afterwards, the property-based formal context describes how object properties are used in axioms to connect anchor classes across ontologies, leading to property mappings. Last, the restriction-based formal context describes co-occurrence of classes across ontologies in anonymous ancestors of anchors, from which extended structural mappings and complex mappings can be identified. Evaluation on the Anatomy, the Large Biomedical Ontologies, and the Disease and Phenotype track of the 2016 Ontology Alignment Evaluation Initiative campaign
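    An illustrative sketch (not the FCA-Map implementation) of the first of the five contexts described above, the token-based formal context, with invented class labels; classes from different ontologies whose token sets coincide become candidate lexical anchors:

```python
# Toy token-based formal context: objects are class labels from two ontologies,
# attributes are lexical tokens, and cross-ontology classes sharing the same
# token set are reported as candidate anchors.
import re

onto1 = {"O1:HeartValve": "heart valve", "O1:Aorta": "aorta"}
onto2 = {"O2:ValveOfHeart": "valve of heart", "O2:AorticVessel": "aortic vessel"}

STOP = {"of", "the"}

def tokens(label):
    return frozenset(re.findall(r"[a-z]+", label.lower())) - STOP

# Incidence relation of the formal context: class -> set of tokens.
context = {cls: tokens(lbl) for cls, lbl in {**onto1, **onto2}.items()}

# Lexical anchors: cross-ontology pairs whose token sets coincide.
anchors = [(a, b) for a in onto1 for b in onto2 if context[a] == context[b]]
print(anchors)  # [('O1:HeartValve', 'O2:ValveOfHeart')]
```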

  9. Towards Technological Approaches for Concept Maps Mining from Text

    Camila Zacche Aguiar; Davidson Cury; Amal Zouaq

    2018-01-01

    Concept maps are resources for the representation and construction of knowledge. They allow showing, through concepts and relationships, how knowledge about a subject is organized. Technological advances have boosted the development of approaches for the automatic construction of a concept map, to facilitate and provide the benefits of that resource more broadly. Due to the need to better identify and analyze the functionalities and characteristics of those approaches, we conducted a detailed...

  10. Iterated-map approach to die tossing

    Feldberg, Rasmus; Szymkat, Maciej; Knudsen, Carsten

    1990-01-01

    Nonlinear dissipative mapping is applied to determine the trajectory of a two-dimensional die thrown onto an elastic table. The basins of attraction for different outcomes are obtained and their distribution in the space of initial conditions discussed. The system has certain properties in common with chaotic systems. However, a die falls to rest after a finite number of impacts, and therefore the system has a finite sensitivity to the initial conditions. Quantitative measures of this sensitivity are proposed and their variations with the initial momentum and orientation of the die investigated.

  11. Model Validation in Ontology Based Transformations

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE) is an emerging approach of software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling permits giving a syntactic structure to source and target models. However, semantic requirements have to be imposed on source and target models. A given transformation will be sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  12. Clustering of color map pixels: an interactive approach

    Moon, Yiu Sang; Luk, Franklin T.; Yuen, K. N.; Yeung, Hoi Wo

    2003-12-01

    The demand for digital maps continues to rise as mobile electronic devices become more popular. Instead of creating the entire map from scratch, we may convert a scanned paper map into a digital one. Color clustering is the very first step of the conversion process. Currently, most existing clustering algorithms are fully automatic. They are fast and efficient but may not work well in map conversion because of the numerous ambiguities associated with printed maps. Here we introduce two interactive approaches for color clustering on the map: color clustering with pre-calculated index colors (PCIC) and color clustering with pre-calculated color ranges (PCCR). We also introduce a memory model that can enhance and integrate different image processing techniques for fine-tuning the clustering results. Problems and examples of the algorithms are discussed in the paper.
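
    A minimal sketch of the pre-calculated index colour idea (PCIC) follows: each scanned pixel is snapped to the nearest colour in an operator-supplied palette. The palette and pixel values are hypothetical, and the interactive steps and memory model described in the paper are not reproduced.

```python
# Snap every pixel to the nearest colour of a user-supplied palette, a rough
# stand-in for clustering with pre-calculated index colours (PCIC).

PALETTE = {                       # hypothetical index colours chosen by the operator
    "road_red":    (200, 40, 40),
    "water_blue":  (40, 90, 200),
    "paper_white": (245, 245, 235),
}

def nearest_index_colour(pixel):
    """Return the palette entry with the smallest squared RGB distance."""
    def dist2(colour):
        return sum((p - c) ** 2 for p, c in zip(pixel, colour))
    return min(PALETTE, key=lambda name: dist2(PALETTE[name]))

scanned_pixels = [(210, 60, 55), (38, 88, 190), (250, 240, 230)]
print([nearest_index_colour(p) for p in scanned_pixels])
# ['road_red', 'water_blue', 'paper_white']
```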

  13. Approach of simultaneous localization and mapping based on local maps for robot

    CHEN Bai-fan; CAI Zi-xing; HU De-wen

    2006-01-01

    An extended Kalman filter approach to simultaneous localization and mapping (SLAM) was proposed based on local maps. A local frame of reference was established periodically at the position of the robot, and the observations of the robot and landmarks were then fused into the global frame of reference. Because of the independence of the local map, the approach does not accumulate the estimation and calculation errors produced when SLAM uses the Kalman filter directly. At the same time, it reduces the computational complexity. The method is shown to be correct and feasible in simulation experiments.
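
    The core of the local-map idea, folding landmarks estimated in a periodically opened local frame back into the global frame, can be sketched with a plain 2D rigid transform. This is only an illustration of the frame composition; the covariance propagation that the extended Kalman filter would also carry out is omitted, and the poses are invented.

```python
import math

def local_to_global(frame_pose, landmark_local):
    """Transform a landmark estimated in a local frame (x, y) into the global
    frame, given the local frame's global pose (x, y, heading). Covariance
    propagation, which the EKF would also perform, is omitted for brevity."""
    fx, fy, ftheta = frame_pose
    lx, ly = landmark_local
    gx = fx + math.cos(ftheta) * lx - math.sin(ftheta) * ly
    gy = fy + math.sin(ftheta) * lx + math.cos(ftheta) * ly
    return gx, gy

# Hypothetical example: a local frame opened at (10, 5) with heading 90 degrees.
frame = (10.0, 5.0, math.pi / 2)
print(local_to_global(frame, (2.0, 0.0)))   # approximately (10.0, 7.0): 2 m ahead of the frame
```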

  14. Anatomy Ontology Matching Using Markov Logic Networks

    Chunhua Li

    2016-01-01

    Full Text Available The anatomy of model species is described in ontologies, which are used to standardize the annotations of experimental data, such as gene expression patterns. To compare such data between species, we need to establish relationships between ontologies describing different species. Ontology matching is a kind of solution for finding semantic correspondences between entities of different ontologies. Markov logic networks, which unify probabilistic graphical models and first-order logic, provide an excellent framework for ontology matching. We combine several different matching strategies through first-order logic formulas according to the structure of anatomy ontologies. Experiments on the adult mouse anatomy and the human anatomy have demonstrated the effectiveness of the proposed approach in terms of the quality of the resulting alignment.

  15. Optogenetic Approaches for Mesoscopic Brain Mapping.

    Kyweriga, Michael; Mohajerani, Majid H

    2016-01-01

    Recent advances in identifying genetically unique neuronal proteins have revolutionized the study of brain circuitry. Researchers are now able to insert specific light-sensitive proteins (opsins) into a wide range of specific cell types via viral injections or by breeding transgenic mice. These opsins enable the activation, inhibition, or modulation of neuronal activity with millisecond control within distinct brain regions defined by genetic markers. Here we present a useful guide for implementing this technique in any lab. We first review the materials needed and practical considerations and provide in-depth instructions for acute surgeries in mice. We conclude with all-optical mapping techniques for simultaneous recording and manipulation of population activity of many neurons in vivo by combining arbitrary point optogenetic stimulation and regional voltage-sensitive dye imaging. It is our intent to make these methods available to anyone wishing to use them.

  16. “Positive Regulation of RNA Metabolic Process” Ontology Group Highly Regulated in Porcine Oocytes Matured In Vitro: A Microarray Approach

    Piotr Celichowski

    2018-01-01

    Full Text Available The growth and development of cumulus-oocyte complexes (COCs) during folliculogenesis and oogenesis are accompanied by changes involving the synthesis and accumulation of a large amount of RNA and proteins. In this study, the transcriptomic profile of genes involved in “oocytes RNA synthesis” in relation to in vitro maturation in pigs was investigated for the first time. The RNA was isolated from oocytes before and after in vitro maturation (IVM). Interactions between differentially expressed genes/proteins belonging to the “positive regulation of RNA metabolic process” ontology group were investigated by STRING10 software. Using microarray assays, we found expression of 12258 porcine transcripts. Genes with a fold change higher than 2 and with a corrected p value lower than 0.05 were considered differentially expressed. The ontology group “positive regulation of RNA metabolic process” involved differential expression of AR, INHBA, WWTR1, FOS, MEF2C, VEGFA, IKZF2, IHH, RORA, MAP3K1, NFAT5, SMARCA1, EGR1, EGR2, MITF, SMAD4, APP, and NR5A1 transcripts. Since all of the presented genes were downregulated after IVM, we suggested that they might be significantly involved in the regulation of RNA synthesis before the oocyte reaches the MII stage. Higher expression of “RNA metabolic process” related genes before IVM indicated that they might be recognized as important markers and a specific “transcriptomic fingerprint” of RNA template accumulation and storage for further porcine embryo growth and development.

  17. Learning Resources Organization Using Ontological Framework

    Gavrilova, Tatiana; Gorovoy, Vladimir; Petrashen, Elena

    The paper describes the ontological approach to knowledge structuring for e-learning portal design, as it turns out to be efficient and relevant to current domain conditions. It is primarily based on the visual ontology-based description of the content of the learning materials, and this helps to provide productive and personalized access to these materials. The experience of developing an ontology for a Knowledge Engineering course at St. Petersburg State University is discussed, and the “OntolingeWiki” tool for creating ontology-based e-learning portals is described.

  18. Product line based ontology development for semantic web service

    Zhang, Weishan; Kunz, Thomas

    2006-01-01

    Ontology is recognized as a key technology for the success of the Semantic Web. Building reusable and evolvable ontologies in order to cope with ontology evolution and requirement changes is increasingly important. But the existing methodologies and tools fail to support effective ontology reuse...... will lead to the initial implementation of the meta-ontologies using design by reuse and with the objective of design for reuse. After that step new ontologies can be generated by reusing these meta-ontologies. We demonstrate our approach with a Semantic Web Service application to show how to build...

  19. The ontology-based answers (OBA) service: a connector for embedded usage of ontologies in applications.

    Dönitz, Jürgen; Wingender, Edgar

    2012-01-01

    The semantic web depends on the use of ontologies to let electronic systems interpret contextual information. Optimally, the handling and access of ontologies should be completely transparent to the user. As a means to this end, we have developed a service that attempts to bridge the gap between experts in a certain knowledge domain, ontologists, and application developers. The ontology-based answers (OBA) service introduced here can be embedded into custom applications to grant access to the classes of ontologies and their relations as the most important structural features, as well as to information encoded in the relations between ontology classes. Thus computational biologists can benefit from ontologies without detailed knowledge about the respective ontology. The content of ontologies is mapped to a graph of connected objects which is compatible with the object-oriented programming style in Java. Semantic functions implement knowledge about the complex semantics of an ontology beyond the class hierarchy and "partOf" relations. By using these OBA functions an application can, for example, provide a semantic search function, or (in the examples outlined) map an anatomical structure to the organs it belongs to. The semantic functions relieve the application developer from the necessity of acquiring in-depth knowledge about the semantics and curation guidelines of the used ontologies by implementing the required knowledge. The architecture of the OBA service encapsulates the logic to process ontologies in order to achieve a separation from the application logic. A public server with the current plugins is available and can be used with the provided connector in a custom application in scenarios analogous to the presented use cases. The server and the client are freely available if a project requires the use of custom plugins or non-public ontologies. The OBA service and further documentation is available at http://www.bioinf.med.uni-goettingen.de/projects/oba.
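
    The kind of semantic function described, mapping an anatomical structure to the organs it belongs to, can be sketched as a transitive partOf traversal over an in-memory graph. The fragment below is not the OBA Java API; the anatomy edges and organ set are invented for illustration.

```python
# Hypothetical partOf edges: child -> parents (invented anatomy fragment).
PART_OF = {
    "left ventricle": ["heart"],
    "mitral valve": ["left ventricle"],
    "heart": ["cardiovascular system"],
}
ORGANS = {"heart"}   # classes annotated as organs in this toy example

def organs_of(structure):
    """Follow partOf edges transitively and return the organs reached,
    mimicking an OBA-style semantic function over the class graph."""
    seen, stack, found = set(), [structure], set()
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        if node in ORGANS:
            found.add(node)
        stack.extend(PART_OF.get(node, []))
    return found

print(organs_of("mitral valve"))   # {'heart'}
```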

  20. Critical Ontology for an Enactive Music Pedagogy

    van der Schyff, Dylan; Schiavio, Andrea; Elliott, David J.

    2016-01-01

    An enactive approach to music education is explored through the lens of critical ontology. Assumptions central to Western academic music culture are critically discussed; and the concept of "ontological education" is introduced as an alternative framework. We argue that this orientation embraces more primordial ways of knowing and being,…

  1. Concept maps and nursing theory: a pedagogical approach.

    Hunter Revell, Susan M

    2012-01-01

    Faculty seek to teach nursing students how to link clinical and theoretical knowledge with the intent of improving patient outcomes. The author discusses an innovative 9-week concept mapping activity as a pedagogical approach to teach nursing theory in a graduate theory course. Weekly concept map building increased student engagement and fostered theoretical thinking. Unexpectedly, this activity also benefited students through group work and its ability to enhance theory-practice knowledge.

  2. Ontological Annotation with WordNet

    Sanfilippo, Antonio P.; Tratz, Stephen C.; Gregory, Michelle L.; Chappell, Alan R.; Whitney, Paul D.; Posse, Christian; Paulson, Patrick R.; Baddeley, Bob; Hohimer, Ryan E.; White, Amanda M.

    2006-06-06

    Semantic Web applications require robust and accurate annotation tools that are capable of automating the assignment of ontological classes to words in naturally occurring text (ontological annotation). Most current ontologies do not include rich lexical databases and are therefore not easily integrated with word sense disambiguation algorithms that are needed to automate ontological annotation. WordNet provides a potentially ideal solution to this problem as it offers a highly structured lexical conceptual representation that has been extensively used to develop word sense disambiguation algorithms. However, WordNet has not been designed as an ontology, and while it can be easily turned into one, the result of doing this would present users with serious practical limitations due to the great number of concepts (synonym sets) it contains. Moreover, mapping WordNet to an existing ontology may be difficult and requires substantial labor. We propose to overcome these limitations by developing an analytical platform that (1) provides a WordNet-based ontology offering a manageable and yet comprehensive set of concept classes, (2) leverages the lexical richness of WordNet to give an extensive characterization of concept class in terms of lexical instances, and (3) integrates a class recognition algorithm that automates the assignment of concept classes to words in naturally occurring text. The ensuing framework makes available an ontological annotation platform that can be effectively integrated with intelligence analysis systems to facilitate evidence marshaling and sustain the creation and validation of inference models.
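
    The third ingredient, assigning a manageable set of concept classes to words via WordNet, can be sketched with the nltk WordNet interface by walking hypernym paths to a small, hand-picked class inventory. The inventory below is hypothetical and far smaller than the platform described; running it requires nltk with the WordNet corpus downloaded.

```python
# A minimal sketch: collapse WordNet's many synsets onto a small set of concept
# classes by walking hypernym paths. Requires nltk and its WordNet corpus
# (nltk.download('wordnet')). The class inventory is invented for illustration.
from nltk.corpus import wordnet as wn

CONCEPT_CLASSES = {          # hypothetical top-level classes
    "PERSON": wn.synset("person.n.01"),
    "ARTIFACT": wn.synset("artifact.n.01"),
    "LOCATION": wn.synset("location.n.01"),
}

def annotate(word):
    """Assign the first concept class whose synset appears among the
    hypernyms of the word's most frequent noun sense."""
    senses = wn.synsets(word, pos=wn.NOUN)
    if not senses:
        return None
    hypernyms = {h for path in senses[0].hypernym_paths() for h in path}
    for label, cls in CONCEPT_CLASSES.items():
        if cls in hypernyms:
            return label
    return None

for w in ["analyst", "truck", "harbor"]:
    print(w, "->", annotate(w))
```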

  3. Automating Ontological Annotation with WordNet

    Sanfilippo, Antonio P.; Tratz, Stephen C.; Gregory, Michelle L.; Chappell, Alan R.; Whitney, Paul D.; Posse, Christian; Paulson, Patrick R.; Baddeley, Bob L.; Hohimer, Ryan E.; White, Amanda M.

    2006-01-22

    Semantic Web applications require robust and accurate annotation tools that are capable of automating the assignment of ontological classes to words in naturally occurring text (ontological annotation). Most current ontologies do not include rich lexical databases and are therefore not easily integrated with word sense disambiguation algorithms that are needed to automate ontological annotation. WordNet provides a potentially ideal solution to this problem as it offers a highly structured lexical conceptual representation that has been extensively used to develop word sense disambiguation algorithms. However, WordNet has not been designed as an ontology, and while it can be easily turned into one, the result of doing this would present users with serious practical limitations due to the great number of concepts (synonym sets) it contains. Moreover, mapping WordNet to an existing ontology may be difficult and requires substantial labor. We propose to overcome these limitations by developing an analytical platform that (1) provides a WordNet-based ontology offering a manageable and yet comprehensive set of concept classes, (2) leverages the lexical richness of WordNet to give an extensive characterization of concept class in terms of lexical instances, and (3) integrates a class recognition algorithm that automates the assignment of concept classes to words in naturally occurring text. The ensuing framework makes available an ontological annotation platform that can be effectively integrated with intelligence analysis systems to facilitate evidence marshaling and sustain the creation and validation of inference models.

  4. Geospatial Information Categories Mapping in a Cross-lingual Environment: A Case Study of “Surface Water” Categories in Chinese and American Topographic Maps

    Xi Kuai

    2016-06-01

    Full Text Available The need for integrating geospatial information (GI) data from various heterogeneous sources has seen increased importance for geographic information system (GIS) interoperability. Using domain ontologies to clarify and integrate the semantics of data is considered a crucial step for successful semantic integration in the GI domain. Nevertheless, mechanisms are still needed to facilitate semantic mapping between GI ontologies described in different natural languages. This research establishes a formal ontology model for cross-lingual geospatial information ontology mapping. By first extracting semantic primitives from the free-text definitions of categories in two GI classification standards with different natural languages, an ontology-driven approach is used, and a formal ontology model is established to represent these semantic primitives as semantic statements, in which spatial-related properties and relations are considered crucial statements for the representation and identification of the semantics of the GI categories. Then, an algorithm is proposed to compare these semantic statements in a cross-lingual environment. We further design a similarity calculation algorithm based on the proposed formal ontology model to measure the semantic similarities and identify the mapping relationships between categories. In particular, we work with two GI classification standards for Chinese and American topographic maps. The experimental results demonstrate the feasibility and reliability of the proposed model for cross-lingual geospatial information ontology mapping.
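
    Once category definitions in both languages have been reduced to semantic statements over shared, language-independent primitives, comparing them becomes a set-overlap problem. The sketch below uses invented statements and a simple Jaccard score as a stand-in for the paper's similarity calculation algorithm.

```python
# Toy semantic statements (subject, relation, value) expressed over shared,
# language-independent primitives after extraction from Chinese and English
# category definitions. Data and scoring are illustrative assumptions only.
zh_categories = {
    "河流": {("flow", "located_on", "land_surface"),
             ("water", "state", "liquid"),
             ("channel", "shape", "linear")},
}
en_categories = {
    "river": {("flow", "located_on", "land_surface"),
              ("water", "state", "liquid"),
              ("channel", "shape", "linear"),
              ("flow", "discharges_to", "sea")},
    "lake": {("water", "state", "liquid"),
             ("body", "shape", "areal")},
}

def statement_similarity(s1, s2):
    """Jaccard overlap of two sets of semantic statements."""
    return len(s1 & s2) / len(s1 | s2) if s1 | s2 else 0.0

for zh, zh_stmts in zh_categories.items():
    best = max(en_categories, key=lambda en: statement_similarity(zh_stmts, en_categories[en]))
    print(zh, "->", best, round(statement_similarity(zh_stmts, en_categories[best]), 2))
```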

  5. An Ontology-Based Approach to Enable Knowledge Representation and Reasoning in Worker–Cobot Agile Manufacturing

    Ahmed R. Sadik

    2017-11-01

    accomplish the cooperative manufacturing concept, a proper approach is required to describe the shared environment between the worker and the cobot. The cooperative manufacturing shared environment includes the cobot, the co-worker, and other production components such as the product itself. Furthermore, all the cooperative manufacturing system components need to communicate and share their knowledge, and to reason over and process the shared information, which eventually gives the control solution the capability of obtaining collective manufacturing decisions. It should also be taken into consideration that the control solution should provide a natural language which is human-readable and at the same time can be understood by the machine (i.e., the cobot). Accordingly, a distributed control solution combining an ontology-based Multi-Agent System (MAS) and a Business Rule Management System (BRMS) is proposed, in order to solve the mentioned challenges in cooperative manufacturing, which are: manufacturing knowledge representation, sharing, and reasoning.

  6. Mapping Smart Regions. An Exploratory Approach

    Sylvie Occelli

    2014-05-01

    Full Text Available The paper presents the results of an exploratory approach aimed at extending the ranking procedures normally used in studying the socioeconomic determinants of smart growth at the regional level. Most of these studies adopt a methodological procedure which essentially consists of the following steps: (a) identification of the pertinent elementary indicators according to the study objectives; (b) data selection and processing; (c) combination of the elementary indicators by multivariate statistical techniques aimed at obtaining a robust synthetic index to rank the observation units. In this procedure a relational dimension is mainly subsumed in the system-oriented perspective adopted in selecting the indicators which would best represent the system determinants depending on the goals of the analysis (step a). In order to get deeper insights into the smartness profile of the European regions, this study makes an effort to account for the relational dimension also in steps (b) and (c) of the procedure. The novelties of the proposed approach are twofold. First, by computing region-to-region distances associated with the selected indicators, it extends the conventional ranking procedure (step c). Second, it uses a relational database (step b), dealing with the regional participation in FP7-ICT projects, to modify the distances and investigate its impact on the interpretation of the regional positioning. The main results of this exercise seem to suggest that regional collaborations would have a positive role in the regional convergence process. By providing an opportunity to establish contacts with areas endowed with a comparatively more robust smartness profile, regions may have a chance to enhance their own smartness profile.

  7. Towards Technological Approaches for Concept Maps Mining from Text

    Camila Zacche Aguiar

    2018-04-01

    Full Text Available Concept maps are resources for the representation and construction of knowledge. They allow showing, through concepts and relationships, how knowledge about a subject is organized. Technological advances have boosted the development of approaches for the automatic construction of a concept map, to facilitate and provide the benefits of that resource more broadly. Due to the need to better identify and analyze the functionalities and characteristics of those approaches, we conducted a detailed study on technological approaches for automatic construction of concept maps published between 1994 and 2016 in the IEEE Xplore, ACM and Elsevier Science Direct databases. From this study, we elaborate a categorization defined on two perspectives, Data Source and Graphic Representation, and fourteen categories. The study collected 30 relevant articles, to which the proposed categorization was applied in order to identify the main features and limitations of each approach. A detailed view of these approaches, their characteristics and techniques is presented, enabling a quantitative analysis. In addition, the categorization has given us objective conditions to establish new specification requirements for a new technological approach aimed at concept map mining from texts.

  8. Conflict Resolution in Partially Ordered OWL DL Ontologies

    Ji, Q.; Gao, Z.; Huang, Z.

    2014-01-01

    Inconsistency handling in OWL DL ontologies is an important problem because an ontology can easily be inconsistent when it is generated or modified. Current approaches to dealing with inconsistent ontologies often assume that there exists a total order over axioms and use such an order to select

  9. On Automatic Modeling and Use of Domain-specific Ontologies

    Andreasen, Troels; Knappe, Rasmus; Bulskov, Henrik

    2005-01-01

    In this paper, we firstly introduce an approach to the modeling of a domain-specific ontology for use in connection with a given document collection. Secondly, we present a methodology for deriving conceptual similarity from the domain-specific ontology. Adopted for ontology representation is a s...

  10. Integrating reasoning and clinical archetypes using OWL ontologies and SWRL rules.

    Lezcano, Leonardo; Sicilia, Miguel-Angel; Rodríguez-Solano, Carlos

    2011-04-01

    Semantic interoperability is essential to facilitate the computerized support for alerts, workflow management and evidence-based healthcare across heterogeneous electronic health record (EHR) systems. Clinical archetypes, which are formal definitions of specific clinical concepts defined as specializations of a generic reference (information) model, provide a mechanism to express data structures in a shared and interoperable way. However, currently available archetype languages do not provide direct support for mapping to formal ontologies and then exploiting reasoning on clinical knowledge, which are key ingredients of full semantic interoperability, as stated in the SemanticHEALTH report [1]. This paper reports on an approach to translate definitions expressed in the openEHR Archetype Definition Language (ADL) to a formal representation expressed using the Ontology Web Language (OWL). The formal representations are then integrated with rules expressed with Semantic Web Rule Language (SWRL) expressions, providing an approach to apply the SWRL rules to concrete instances of clinical data. Sharing the knowledge expressed in the form of rules is consistent with the philosophy of open sharing, encouraged by archetypes. Our approach also allows the reuse of formal knowledge, expressed through ontologies, and extends reuse to propositions of declarative knowledge, such as those encoded in clinical guidelines. This paper describes the ADL-to-OWL translation approach, describes the techniques to map archetypes to formal ontologies, and demonstrates how rules can be applied to the resulting representation. We provide examples taken from a patient safety alerting system to illustrate our approach. Copyright © 2010 Elsevier Inc. All rights reserved.

  11. OntoMaven: Maven-based Ontology Development and Management of Distributed Ontology Repositories

    Paschke, Adrian

    2013-01-01

    In collaborative agile ontology development projects support for modular reuse of ontologies from large existing remote repositories, ontology project life cycle management, and transitive dependency management are important needs. The Apache Maven approach has proven its success in distributed collaborative Software Engineering by its widespread adoption. The contribution of this paper is a new design artifact called OntoMaven. OntoMaven adopts the Maven-based development methodology and ada...

  12. Ontologies in software engineering: approaching two great knowledge areas

    Carlos Mario Zapata Jaramillo

    2010-01-01

    Full Text Available Ontological concepts have traditionally been closer to knowledge engineering, so software engineers do not usually apply them to solve problems in their own area. Software engineers need to appropriate ontologies, since they provide a common vocabulary that can contribute to solving recurrent problems in software engineering, such as the difficulty of communication between analysts and stakeholders when defining the requirements of a system, the low reuse of components, and the scarce automatic generation of code, among others. This article presents a first link between ontologies and software engineering by collecting and analyzing the literature on the use of ontologies in the different phases of the software product life cycle.

  13. Multimedia ontology representation and applications

    Chaudhury, Santanu; Ghosh, Hiranmay

    2015-01-01

    The result of more than 15 years of collective research, Multimedia Ontology: Representation and Applications provides a theoretical foundation for understanding the nature of media data and the principles involved in its interpretation. The book presents a unified approach to recent advances in multimedia and explains how a multimedia ontology can fill the semantic gap between concepts and the media world. It relays real-life examples of implementations in different domains to illustrate how this gap can be filled.The book contains information that helps with building semantic, content-based

  14. Root justifications for ontology repair

    Moodley, K

    2011-08-01

    Full Text Available Root Justifications... the ontology, based on the notion of root justifications [8, 9]. In Section 5, we discuss the implementation of a Protégé plugin which demonstrates our approach to ontology repair. In this section we also discuss some experimental results comparing...

  15. SPONGY (SPam ONtoloGY: Email Classification Using Two-Level Dynamic Ontology

    Seongwook Youn

    2014-01-01

    Full Text Available Email is one of the common communication methods between people on the Internet. However, the increase of email misuse/abuse has resulted in an increasing volume of spam emails over recent years. An experimental system has been designed and implemented with the hypothesis that this method would outperform existing techniques, and the experimental results showed that indeed the proposed ontology-based approach improves spam filtering accuracy significantly. In this paper, two levels of ontology spam filters were implemented: a first level global ontology filter and a second level user-customized ontology filter. The use of the global ontology filter showed about 91% of spam filtered, which is comparable with other methods. The user-customized ontology filter was created based on the specific user’s background as well as the filtering mechanism used in the global ontology filter creation. The main contributions of the paper are (1) to introduce an ontology-based multilevel filtering technique that uses both a global ontology and an individual filter for each user to increase spam filtering accuracy and (2) to create a spam filter in the form of ontology, which is user-customized, scalable, and modularized, so that it can be embedded into many other systems for better performance.
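
    The two-level arrangement can be sketched as a global filter applied first and a user-customized filter refining it. The sketch below reduces both ontologies to keyword-concept sets, which is only a rough stand-in for the ontology-based classifiers the paper builds.

```python
# A rough sketch of the two-level idea: a global filter shared by all users and
# a per-user filter reflecting that user's own background. Both "ontologies"
# are reduced to keyword-concept sets here; the real system builds ontology
# classifiers, which this toy code does not reproduce.
GLOBAL_SPAM_CONCEPTS = {"lottery", "winner", "wire", "transfer"}
USER_SPAM_CONCEPTS = {"conference", "invitation"}   # e.g. a user flooded with fake CFPs
USER_HAM_CONCEPTS = {"ontology", "alignment"}       # topics this user genuinely works on

def classify(email_text, threshold=2):
    words = set(email_text.lower().split())
    score = len(words & GLOBAL_SPAM_CONCEPTS)        # level 1: global ontology filter
    score += len(words & USER_SPAM_CONCEPTS)         # level 2: user-customised filter
    score -= len(words & USER_HAM_CONCEPTS)          # user-specific evidence of ham
    return "spam" if score >= threshold else "ham"

print(classify("You are the lottery winner please wire the transfer"))  # spam
print(classify("Draft of the ontology alignment paper attached"))       # ham
```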

  16. SPONGY (SPam ONtoloGY): email classification using two-level dynamic ontology.

    Youn, Seongwook

    2014-01-01

    Email is one of common communication methods between people on the Internet. However, the increase of email misuse/abuse has resulted in an increasing volume of spam emails over recent years. An experimental system has been designed and implemented with the hypothesis that this method would outperform existing techniques, and the experimental results showed that indeed the proposed ontology-based approach improves spam filtering accuracy significantly. In this paper, two levels of ontology spam filters were implemented: a first level global ontology filter and a second level user-customized ontology filter. The use of the global ontology filter showed about 91% of spam filtered, which is comparable with other methods. The user-customized ontology filter was created based on the specific user's background as well as the filtering mechanism used in the global ontology filter creation. The main contributions of the paper are (1) to introduce an ontology-based multilevel filtering technique that uses both a global ontology and an individual filter for each user to increase spam filtering accuracy and (2) to create a spam filter in the form of ontology, which is user-customized, scalable, and modularized, so that it can be embedded to many other systems for better performance.

  17. SPONGY (SPam ONtoloGY): Email Classification Using Two-Level Dynamic Ontology

    2014-01-01

    Email is one of common communication methods between people on the Internet. However, the increase of email misuse/abuse has resulted in an increasing volume of spam emails over recent years. An experimental system has been designed and implemented with the hypothesis that this method would outperform existing techniques, and the experimental results showed that indeed the proposed ontology-based approach improves spam filtering accuracy significantly. In this paper, two levels of ontology spam filters were implemented: a first level global ontology filter and a second level user-customized ontology filter. The use of the global ontology filter showed about 91% of spam filtered, which is comparable with other methods. The user-customized ontology filter was created based on the specific user's background as well as the filtering mechanism used in the global ontology filter creation. The main contributions of the paper are (1) to introduce an ontology-based multilevel filtering technique that uses both a global ontology and an individual filter for each user to increase spam filtering accuracy and (2) to create a spam filter in the form of ontology, which is user-customized, scalable, and modularized, so that it can be embedded to many other systems for better performance. PMID:25254240

  18. Assessment Applications of Ontologies.

    Chung, Gregory K. W. K.; Niemi, David; Bewley, William L.

    This paper discusses the use of ontologies and their applications to assessment. An ontology provides a shared and common understanding of a domain that can be communicated among people and computational systems. The ontology captures one or more experts' conceptual representation of a domain expressed in terms of concepts and the relationships…

  19. Ontology - MicrobeDB.jp | LSDB Archive [Life Science Database Archive metadata]

    Full Text Available ...gzip) consists of some directories (see the following table). Data file: file name ontology.tar.gz; file URL ftp://ftp.biosciencedbc.jp/archive/microbedb/LATEST/ontology.tar.gz; file size 9... The NCBI Taxonomy and INSDC ontology files were obtained from the DDBJ web site. ... Data item / Description: ontology/meo/meo.ttl, an ontology for describing organismal habitats (especially focused on microbes); ontology/meo/meo_fma_mapping.ttl, an ontology mapping file t...

  20. Emotion Education without Ontological Commitment?

    Kristjansson, Kristjan

    2010-01-01

    Emotion education is enjoying new-found popularity. This paper explores the "cosy consensus" that seems to have developed in education circles, according to which approaches to emotion education are immune from metaethical considerations such as contrasting rationalist and sentimentalist views about the moral ontology of emotions. I spell out five…

  1. Supplementary Material for: The flora phenotype ontology (FLOPO): tool for integrating morphological traits and phenotypes of vascular plants

    Hoehndorf, Robert; AlShahrani, Mona; Gkoutos, Georgios; Gosline, George; Groom, Quentin; Hamann, Thomas; Kattge, Jens; Oliveira, Sylvia de; Schmidt, Marco; Sierra, Soraya; Smets, Erik; Vos, Rutger; Weiland, Claus

    2016-01-01

    traits of plant species found in Floras. We used the Plant Ontology (PO) and the Phenotype And Trait Ontology (PATO) to extract entity-quality relationships from digitized taxon descriptions in Floras, and used a formal ontological approach based

  2. Markov Chain Ontology Analysis (MCOA).

    Frost, H Robert; McCray, Alexa T

    2012-02-03

    Biomedical ontologies have become an increasingly critical lens through which researchers analyze the genomic, clinical and bibliographic data that fuels scientific research. Of particular relevance are methods, such as enrichment analysis, that quantify the importance of ontology classes relative to a collection of domain data. Current analytical techniques, however, remain limited in their ability to handle many important types of structural complexity encountered in real biological systems including class overlaps, continuously valued data, inter-instance relationships, non-hierarchical relationships between classes, semantic distance and sparse data. In this paper, we describe a methodology called Markov Chain Ontology Analysis (MCOA) and illustrate its use through a MCOA-based enrichment analysis application based on a generative model of gene activation. MCOA models the classes in an ontology, the instances from an associated dataset and all directional inter-class, class-to-instance and inter-instance relationships as a single finite ergodic Markov chain. The adjusted transition probability matrix for this Markov chain enables the calculation of eigenvector values that quantify the importance of each ontology class relative to other classes and the associated data set members. On both controlled Gene Ontology (GO) data sets created with Escherichia coli, Drosophila melanogaster and Homo sapiens annotations and real gene expression data extracted from the Gene Expression Omnibus (GEO), the MCOA enrichment analysis approach provides the best performance among comparable state-of-the-art methods. A methodology based on Markov chain models and network analytic metrics can help detect the relevant signal within large, highly interdependent and noisy data sets and, for applications such as enrichment analysis, has been shown to generate superior performance on both real and simulated data relative to existing state-of-the-art approaches.
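
    The eigenvector scores at the heart of MCOA correspond to the stationary distribution of the adjusted transition matrix. The sketch below shows that computation by power iteration over a small, invented row-stochastic matrix; it does not reproduce how MCOA derives the matrix from an ontology and its annotations.

```python
import numpy as np

# A made-up row-stochastic transition matrix over four "nodes" (say, two
# ontology classes and two annotated instances). MCOA builds such a chain from
# class, instance and relationship structure; this matrix is invented purely
# to show how the eigenvector (stationary) scores are obtained.
P = np.array([
    [0.10, 0.40, 0.30, 0.20],
    [0.25, 0.25, 0.25, 0.25],
    [0.30, 0.30, 0.20, 0.20],
    [0.20, 0.20, 0.30, 0.30],
])

def stationary(P, tol=1e-12, max_iter=10_000):
    """Power iteration for the stationary distribution of an ergodic chain."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(max_iter):
        nxt = pi @ P
        if np.abs(nxt - pi).sum() < tol:
            break
        pi = nxt
    return pi

scores = stationary(P)
print(np.round(scores, 3), "sum:", scores.sum())
```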

  3. The frequency-domain approach for apparent density mapping

    Tong, T.; Guo, L.

    2017-12-01

    Apparent density mapping is a technique to estimate density distribution in the subsurface layer from the observed gravity data. It has been widely applied for geologic mapping, tectonic study and mineral exploration for decades. Apparent density mapping usually models the density layer as a collection of vertical, juxtaposed prisms in both horizontal directions, whose top and bottom surfaces are assumed to be horizontal or variable-depth, and then inverts or deconvolves the gravity anomalies to determine the density of each prism. Conventionally, the frequency-domain approach, which assumes that both top and bottom surfaces of the layer are horizontal, is usually utilized for fast density mapping. However, such an assumption is not always valid in the real world, since either the top surface or the bottom surface may be variable-depth. Here, we presented a frequency-domain approach for apparent density mapping, which permits both the top and bottom surfaces of the layer to be variable-depth. We first derived the formula for forward calculation of gravity anomalies caused by the density layer, whose top and bottom surfaces are variable-depth, and the formula for inversion of gravity anomalies for the density distribution. Then we proposed the procedure for density mapping based on both the formulas of inversion and forward calculation. We tested the approach on synthetic data, which verified its effectiveness. We also tested the approach on the real Bouguer gravity anomaly data from central South China. The top surface was assumed to be flat and at sea level, and the bottom surface was considered as the Moho surface. The result presented the crustal density distribution, which coincided well with the basic tectonic features in the study area.

  4. One Song, Many Works: A Pluralist Ontology of Rock

    Dan Burkett

    2016-01-01

    Full Text Available A number of attempts have been made to construct a plausible ontology of rock music. Each of these ontologies identifies a single type of ontological entity as the “work” in rock music. Yet, all the suggestions advanced to date fail to capture some important considerations about how we engage with music of this tradition. This prompted Lee Brown to advocate a healthy skepticism of higher-order musical ontologies. I argue here that we should instead embrace a pluralist ontology of rock, an ontology that recognizes more than one kind of entity as “the work” in rock music. I contend that this approach has a number of advantages over other ontologies of rock, including that of allowing us to make some comparisons across ontological kinds.

  5. Ontologies vs. Classification Systems

    Madsen, Bodil Nistrup; Erdman Thomsen, Hanne

    2009-01-01

    What is an ontology compared to a classification system? Is a taxonomy a kind of classification system or a kind of ontology? These are questions that we meet when working with people from industry and public authorities, who need methods and tools for concept clarification, for developing metadata sets or for obtaining advanced search facilities. In this paper we will present an attempt at answering these questions. We will give a presentation of various types of ontologies and briefly introduce terminological ontologies. Furthermore we will argue that classification systems, e.g. product classification systems and metadata taxonomies, should be based on ontologies.

  6. Comparing the performance of various digital soil mapping approaches to map physical soil properties

    Laborczi, Annamária; Takács, Katalin; Pásztor, László

    2015-04-01

    Spatial information on physical soil properties is in strong demand, in order to support environment-related and land use management decisions. One of the most widely used properties to characterize soils physically is particle size distribution (PSD), which determines soil water management and cultivability. According to their size, different particles can be categorized as clay, silt, or sand. The size intervals are defined by national or international textural classification systems. The relative percentage of sand, silt, and clay in the soil constitutes textural classes, which are also defined differently in various national and/or specialty systems. The most commonly used is the classification system of the United States Department of Agriculture (USDA). Soil texture information is essential input data in meteorological, hydrological and agricultural prediction modelling. Although Hungary has a great deal of legacy soil maps and other relevant soil information, it often occurs that maps do not exist for a certain characteristic with the required thematic and/or spatial representation. The recent developments in digital soil mapping (DSM), however, provide wide opportunities for the elaboration of object-specific soil maps (OSSM) with predefined parameters (resolution, accuracy, reliability etc.). Due to the simultaneous richness of available Hungarian legacy soil data, spatial inference methods and auxiliary environmental information, there is a high versatility of possible approaches for the compilation of a given soil map. This suggests the opportunity of optimization. For the creation of an OSSM one might intend to identify the optimum set of soil data, method and auxiliary co-variables optimized for the resources (data costs, computation requirements etc.). We started a comprehensive analysis of the effects of the various DSM components on the accuracy of the output maps on pilot areas. The aim of this study is to compare and evaluate different

  7. An automated approach to mapping corn from Landsat imagery

    Maxwell, S.K.; Nuckols, J.R.; Ward, M.H.; Hoffer, R.M.

    2004-01-01

    Most land cover maps generated from Landsat imagery involve classification of a wide variety of land cover types, whereas some studies may only need spatial information on a single cover type. For example, we required a map of corn in order to estimate exposure to agricultural chemicals for an environmental epidemiology study. Traditional classification techniques, which require the collection and processing of costly ground reference data, were not feasible for our application because of the large number of images to be analyzed. We present a new method that has the potential to automate the classification of corn from Landsat satellite imagery, resulting in a more timely product for applications covering large geographical regions. Our approach uses readily available agricultural areal estimates to enable automation of the classification process resulting in a map identifying land cover as ‘highly likely corn,’ ‘likely corn’ or ‘unlikely corn.’ To demonstrate the feasibility of this approach, we produced a map consisting of the three corn likelihood classes using a Landsat image in south central Nebraska. Overall classification accuracy of the map was 92.2% when compared to ground reference data.
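
    The automation trick, replacing ground reference data with an agricultural areal estimate, can be sketched by thresholding a per-pixel corn-likeness score so that the mapped classes bracket the reported acreage. The score, pixel bookkeeping and acreage below are invented for illustration and do not reproduce the authors' classification rules.

```python
import numpy as np

rng = np.random.default_rng(0)
corn_likeness = rng.random((100, 100))      # invented per-pixel score from imagery
pixel_area_ha = 0.09                        # one 30 m Landsat pixel
reported_corn_ha = 270.0                    # hypothetical agricultural areal estimate

# Choose thresholds so the "highly likely" class covers roughly the reported
# area and the "likely" class covers a wider buffer around it (here +50%).
n_high = int(reported_corn_ha / pixel_area_ha)
n_likely = int(1.5 * n_high)
flat = np.sort(corn_likeness.ravel())[::-1]
t_high, t_likely = flat[n_high - 1], flat[n_likely - 1]

classes = np.full(corn_likeness.shape, "unlikely corn", dtype=object)
classes[corn_likeness >= t_likely] = "likely corn"
classes[corn_likeness >= t_high] = "highly likely corn"

for label in ("highly likely corn", "likely corn", "unlikely corn"):
    area = (classes == label).sum() * pixel_area_ha
    print(f"{label}: {area:.1f} ha")
```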

  8. Toxicology ontology perspectives.

    Hardy, Barry; Apic, Gordana; Carthew, Philip; Clark, Dominic; Cook, David; Dix, Ian; Escher, Sylvia; Hastings, Janna; Heard, David J; Jeliazkova, Nina; Judson, Philip; Matis-Mitchell, Sherri; Mitic, Dragana; Myatt, Glenn; Shah, Imran; Spjuth, Ola; Tcheremenskaia, Olga; Toldo, Luca; Watson, David; White, Andrew; Yang, Chihae

    2012-01-01

    The field of predictive toxicology requires the development of open, public, computable, standardized toxicology vocabularies and ontologies to support the applications required by in silico, in vitro, and in vivo toxicology methods and related analysis and reporting activities. In this article we review ontology developments based on a set of perspectives showing how ontologies are being used in predictive toxicology initiatives and applications. Perspectives on resources and initiatives reviewed include OpenTox, eTOX, Pistoia Alliance, ToxWiz, Virtual Liver, EU-ADR, BEL, ToxML, and Bioclipse. We also review existing ontology developments in neighboring fields that can contribute to establishing an ontological framework for predictive toxicology. A significant set of resources is already available to provide a foundation for an ontological framework for 21st century mechanistic-based toxicology research. Ontologies such as ToxWiz provide a basis for application to toxicology investigations, whereas other ontologies under development in the biological, chemical, and biomedical communities could be incorporated in an extended future framework. OpenTox has provided a semantic web framework for the implementation of such ontologies into software applications and linked data resources. Bioclipse developers have shown the benefit of interoperability obtained through ontology by being able to link their workbench application with remote OpenTox web services. Although these developments are promising, an increased international coordination of efforts is greatly needed to develop a more unified, standardized, and open toxicology ontology framework.

  9. Noise pollution mapping approach and accuracy on landscape scales.

    Iglesias Merchan, Carlos; Diaz-Balteiro, Luis

    2013-04-01

    Noise mapping allows the characterization of environmental variables, such as noise pollution or soundscape, depending on the task. Strategic noise mapping (as per Directive 2002/49/EC, 2002) is a tool intended for the assessment of noise pollution at the European level every five years. These maps are based on common methods and procedures intended for human exposure assessment in the European Union that could also be adapted for assessing environmental noise pollution in natural parks. However, given the size of such areas, there could be an alternative approach to soundscape characterization rather than using human noise exposure procedures. It is possible to optimize the size of the mapping grid used for such work by taking into account the attributes of the area to be studied and the desired outcome. This would then optimize the mapping time and the cost. This type of optimization is important in noise assessment as well as in the study of other environmental variables. This study compares 15 models, using different grid sizes, to assess the accuracy of the mapping of road traffic noise at a landscape scale, with respect to noise and landscape indicators. In a study area located in the Manzanares High River Basin Regional Park in Spain, different accuracy levels (Kappa index values from 0.725 to 0.987) were obtained depending on the terrain and noise source properties. The time taken for the calculations and the noise mapping accuracy results reveal the potential for setting the map resolution in line with decision-makers' criteria and budget considerations. Copyright © 2013 Elsevier B.V. All rights reserved.
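
    The Kappa index used here to score each grid size against the reference map is computed from a confusion matrix; a short worked sketch with invented counts follows.

```python
import numpy as np

# Invented 2x2 confusion matrix: rows = reference map (above / below the noise
# level of interest), columns = the coarser-grid map being evaluated.
confusion = np.array([[420, 30],
                      [25, 525]])

def cohen_kappa(cm):
    """Cohen's Kappa: (observed agreement - chance agreement) / (1 - chance)."""
    n = cm.sum()
    p_observed = np.trace(cm) / n
    p_chance = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2
    return (p_observed - p_chance) / (1 - p_chance)

print(round(cohen_kappa(confusion), 3))   # about 0.889 for these invented counts
```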

  10. Ontology-Driven Knowledge-Based Health-Care System, An Emerging Area - Challenges And Opportunities - Indian Scenario

    Sunitha, A.; Babu, G. Suresh

    2014-11-01

    Recent studies of decision making in public healthcare systems have been strongly inspired and influenced by the introduction of ontologies. Ontology-driven systems result in the effective implementation of healthcare strategies for policy makers. The central source of knowledge is the ontology containing all the relevant domain concepts, such as locations, diseases and environments, and their domain-sensitive inter-relationships, which is the prime objective, concern and motivation behind this paper. The paper further focuses on the development of a semantic knowledge base for a public healthcare system. It describes the approach and methodologies for bringing out a novel conceptual theme by establishing a firm linkage between three different ontologies related to diseases, places and environments in one integrated platform. This platform correlates the real-time mechanisms prevailing within the semantic knowledge base and establishes their inter-relationships for the first time in India. This is hoped to form a strong foundation for a much awaited and meaningful healthcare decision-making system in the country. Introduction through a wide range of best practices facilitates the adoption of this approach for better appreciation, understanding and long-term outcomes in the area. The methods and approach illustrated in the paper relate to health mapping methods, reusability of health applications, and interoperability issues based on mapping data attributes to ontology concepts, generating semantically integrated data that drives an inference engine for user-interfaced semantic queries.

  11. COHeRE: Cross-Ontology Hierarchical Relation Examination for Ontology Quality Assurance.

    Cui, Licong

    Biomedical ontologies play a vital role in healthcare information management, data integration, and decision support. Ontology quality assurance (OQA) is an indispensable part of the ontology engineering cycle. Most existing OQA methods are based on the knowledge provided within the targeted ontology. This paper proposes a novel cross-ontology analysis method, Cross-Ontology Hierarchical Relation Examination (COHeRE), to detect inconsistencies and possible errors in hierarchical relations across multiple ontologies. COHeRE leverages the Unified Medical Language System (UMLS) knowledge source and the MapReduce cloud computing technique for systematic, large-scale ontology quality assurance work. COHeRE consists of three main steps with the UMLS concepts and relations as the input. First, the relations claimed in source vocabularies are filtered and aggregated for each pair of concepts. Second, inconsistent relations are detected if a concept pair is related by different types of relations in different source vocabularies. Finally, the uncovered inconsistent relations are voted according to their number of occurrences across different source vocabularies. The voting result together with the inconsistent relations serve as the output of COHeRE for possible ontological change. The highest votes provide initial suggestion on how such inconsistencies might be fixed. In UMLS, 138,987 concept pairs were found to have inconsistent relationships across multiple source vocabularies. 40 inconsistent concept pairs involving hierarchical relationships were randomly selected and manually reviewed by a human expert. 95.8% of the inconsistent relations involved in these concept pairs indeed exist in their source vocabularies rather than being introduced by mistake in the UMLS integration process. 73.7% of the concept pairs with suggested relationship were agreed by the human expert. The effectiveness of COHeRE indicates that UMLS provides a promising environment to enhance
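
    The three COHeRE steps (aggregate the relations claimed per concept pair, flag pairs whose sources disagree, and vote across sources) can be sketched without the UMLS or MapReduce machinery. The relations and source vocabularies below are invented.

```python
from collections import Counter, defaultdict

# Toy input in the spirit of COHeRE's first step: for each concept pair, the
# hierarchical relation claimed by each source vocabulary (invented data).
claims = [
    ("C1", "C2", "is_a", "SRC_A"),
    ("C1", "C2", "is_a", "SRC_B"),
    ("C1", "C2", "inverse_is_a", "SRC_C"),   # disagrees with the others
    ("C3", "C4", "is_a", "SRC_A"),
]

by_pair = defaultdict(dict)
for child, parent, rel, source in claims:
    by_pair[(child, parent)][source] = rel

for pair, rels in by_pair.items():
    distinct = set(rels.values())
    if len(distinct) > 1:                       # step 2: inconsistency detected
        votes = Counter(rels.values())          # step 3: vote across sources
        suggested, count = votes.most_common(1)[0]
        print(f"inconsistent pair {pair}: {dict(rels)}; "
              f"suggest '{suggested}' ({count} of {len(rels)} sources)")
```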

  12. Improving the interoperability of biomedical ontologies with compound alignments.

    Oliveira, Daniela; Pesquita, Catia

    2018-01-09

    Ontologies are commonly used to annotate and help process life sciences data. Although their original goal is to facilitate integration and interoperability among heterogeneous data sources, when these sources are annotated with distinct ontologies, bridging this gap can be challenging. In the last decade, ontology matching systems have been evolving and are now capable of producing high-quality mappings for life sciences ontologies, usually limited to the equivalence between two ontologies. However, life sciences research is becoming increasingly transdisciplinary and integrative, fostering the need to develop matching strategies that are able to handle multiple ontologies and more complex relations between their concepts. We have developed ontology matching algorithms that are able to find compound mappings between multiple biomedical ontologies, in the form of ternary mappings, finding for instance that "aortic valve stenosis"(HP:0001650) is equivalent to the intersection between "aortic valve"(FMA:7236) and "constricted" (PATO:0001847). The algorithms take advantage of search space filtering based on partial mappings between ontology pairs, to be able to handle the increased computational demands. The evaluation of the algorithms has shown that they are able to produce meaningful results, with precision in the range of 60-92% for new mappings. The algorithms were also applied to the potential extension of logical definitions of the OBO and the matching of several plant-related ontologies. This work is a first step towards finding more complex relations between multiple ontologies. The evaluation shows that the results produced are significant and that the algorithms could satisfy specific integration needs.
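
    A crude lexical intuition behind such ternary candidates is that the label of the composite class decomposes into the labels of a class from the second ontology and a quality from the third. The sketch below applies only that intuition to invented label fragments; the algorithms described additionally filter the search space with partial pairwise mappings, which is not reproduced here.

```python
import re

def tokens(label):
    return set(re.findall(r"[a-z]+", label.lower()))

# Invented label fragments in the spirit of the HP / FMA / PATO example.
hp =   {"HP:X1": "aortic valve stenosis"}
fma =  {"FMA:Y1": "aortic valve", "FMA:Y2": "mitral valve"}
pato = {"PATO:Z1": "constricted", "PATO:Z2": "increased size"}

# Crude lexicon standing in for knowledge that "stenosis" lexicalises the
# quality "constricted" (hypothetical; real systems use richer evidence).
QUALITY_SYNONYMS = {"stenosis": "constricted"}

for hp_id, hp_label in hp.items():
    hp_tok = {QUALITY_SYNONYMS.get(t, t) for t in tokens(hp_label)}
    for fma_id, fma_label in fma.items():
        for pato_id, pato_label in pato.items():
            if tokens(fma_label) | tokens(pato_label) == hp_tok:
                print(f"{hp_id} = {fma_id} AND {pato_id}")
```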

  13. Paper Ontologies

    Lupton, Tina Jane

    2016-01-01

    This article takes a critical approach to the possibility of applying Bruno Latour’s work to Tristram Shandy. Sterne is both reflexive about the impossibility of representing the material work and determined to do just that in relation to his book. But these are competing tendencies in Latour’s te......’s terms, difficult to judge in the context of a single work. Latour’s recent language for Fiction as a “mode” ultimately leaves us without a way to distinguish between texts that are self-conscious and those that aren’t......

  14. The Proteasix Ontology.

    Arguello Casteleiro, Mercedes; Klein, Julie; Stevens, Robert

    2016-06-04

    The Proteasix Ontology (PxO) is an ontology that supports the Proteasix tool, an open-source peptide-centric tool that can be used to predict automatically, in silico and on a large scale, the proteases involved in the generation of proteolytic cleavage fragments (peptides). The PxO re-uses parts of the Protein Ontology, the three Gene Ontology sub-ontologies, the Chemical Entities of Biological Interest Ontology, the Sequence Ontology and bespoke extensions to the PxO in support of a series of roles: 1. To describe the known proteases and their target cleavage sites. 2. To enable the description of proteolytic cleavage fragments as the outputs of observed and predicted proteolysis. 3. To use knowledge about the function, species and cellular location of a protease and protein substrate to support the prioritisation of proteases in observed and predicted proteolysis. The PxO is designed to describe the biological underpinnings of the generation of peptides. The peptide-centric PxO seeks to support the Proteasix tool by separating domain knowledge from the operational knowledge used in protease prediction by Proteasix and to support the confirmation of its analyses and results. The Proteasix Ontology may be found at: http://bioportal.bioontology.org/ontologies/PXO. This ontology is free and open for use by everyone.

  15. The Porifera Ontology (PORO): enhancing sponge systematics with an anatomy ontology.

    Thacker, Robert W; Díaz, Maria Cristina; Kerner, Adeline; Vignes-Lebbe, Régine; Segerdell, Erik; Haendel, Melissa A; Mungall, Christopher J

    2014-01-01

    Porifera (sponges) are ancient basal metazoans that lack organs. They provide insight into key evolutionary transitions, such as the emergence of multicellularity and the nervous system. In addition, their ability to synthesize unusual compounds offers potential biotechnical applications. However, much of the knowledge of these organisms has not previously been codified in a machine-readable way using modern web standards. The Porifera Ontology is intended as a standardized coding system for sponge anatomical features currently used in systematics. The ontology is available from http://purl.obolibrary.org/obo/poro.owl, or from the project homepage http://porifera-ontology.googlecode.com/. The version referred to in this manuscript is permanently available from http://purl.obolibrary.org/obo/poro/releases/2014-03-06/. By standardizing character representations, we hope to facilitate more rapid description and identification of sponge taxa, to allow integration with other evolutionary database systems, and to perform character mapping across the major clades of sponges to better understand the evolution of morphological features. Future applications of the ontology will focus on creating (1) ontology-based species descriptions; (2) taxonomic keys that use the nested terms of the ontology to more quickly facilitate species identifications; and (3) methods to map anatomical characters onto molecular phylogenies of sponges. In addition to modern taxa, the ontology is being extended to include features of fossil taxa.

  16. A web-based system architecture for ontology-based data integration in the domain of IT benchmarking

    Pfaff, Matthias; Krcmar, Helmut

    2018-03-01

    In the domain of IT benchmarking (ITBM), a variety of data and information are collected. Although these data serve as the basis for business analyses, no unified semantic representation of such data yet exists. Consequently, data analysis across different distributed data sets and different benchmarks is almost impossible. This paper presents a system architecture and prototypical implementation for an integrated data management of distributed databases based on a domain-specific ontology. To preserve the semantic meaning of the data, the ITBM ontology is linked to data sources and functions as the central concept for database access. Thus, additional databases can be integrated by linking them to this domain-specific ontology and are directly available for further business analyses. Moreover, the web-based system supports the process of mapping ontology concepts to external databases by introducing a semi-automatic mapping recommender and by visualizing possible mapping candidates. The system also provides a natural language interface to easily query linked databases. The expected result of this ontology-based approach of knowledge representation and data access is an increase in knowledge and data sharing in this domain, which will enhance existing business analysis methods.
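
    The semi-automatic mapping recommender described above can be pictured as a ranking of candidate database columns per ontology concept. The sketch below does this with a plain string-similarity measure; the concept names, column names and scoring choice are hypothetical and not taken from the ITBM system.

      # Sketch of a semi-automatic mapping recommender: rank candidate database columns
      # for each ontology concept by a simple string similarity. Names are hypothetical.

      from difflib import SequenceMatcher

      concepts = ["ServerCost", "MaintenanceContract", "HelpDeskTicket"]
      columns = ["srv_cost_eur", "maint_contract_id", "ticket_helpdesk", "employee_name"]

      def normalize(name: str) -> str:
          return "".join(c.lower() for c in name if c.isalnum())

      def recommend(concept: str, candidates, top_k=2):
          scored = [(SequenceMatcher(None, normalize(concept), normalize(c)).ratio(), c)
                    for c in candidates]
          return sorted(scored, reverse=True)[:top_k]

      for concept in concepts:
          print(concept, "->", recommend(concept, columns))   # top column candidates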

  17. Maps help protect sensitive areas from spills : an integrated approach to environmental mapping

    Laflamme, A.; Leblanc, S.R.; Percy, R.J.

    2001-01-01

    The Atlantic Sensitivity Mapping Program (ASMP) is underway in Canada's Atlantic Region to develop and maintain the best possible sensitivity mapping system to provide planners and managers with the full range of information they would need in the event of a coastal oil spill drill or spill incident. This initiative also provides recommendations concerning resource protection at the time of a spill. ASMP has become a powerful tool, providing a consistent and standardized terminology throughout the range of spill planning, preparedness and real-time response activities. The desktop mapping system provides an easy-to-use approach for a wide range of technical and support data and information stored in various databases. The data and information are based on a consistent set of terms and definitions that describe the character of the shore zone, the objective and strategies for a specific response, and the methods for achieving those objectives. The data are linked with other resource information in a GIS-based system and can be updated quickly and easily as new information becomes available. The mapping program keeps evolving to better serve the needs of environmental emergency responders. In addition, all components will soon be integrated into a web-based mapping format for broader accessibility. Future work will focus on developing a pre-spill database for Labrador. 3 refs., 8 figs

  18. A zeta function approach to the semiclassical quantization of maps

    Smilansky, Uzi.

    1993-11-01

    The quantum analogue of an area preserving map on a compact phase space is a unitary (evolution) operator which can be represented by a matrix of dimension L ∝ ℎ⁻¹. The semiclassical theory for the spectrum of the evolution operator will be reviewed, with special emphasis on developing a dynamical zeta function approach, similar to the one introduced recently for the semiclassical quantization of Hamiltonian systems. (author)

  19. Tropical forest carbon assessment: integrating satellite and airborne mapping approaches

    Asner, Gregory P

    2009-01-01

    Large-scale carbon mapping is needed to support the UNFCCC program to reduce deforestation and forest degradation (REDD). Managers of forested land can potentially increase their carbon credits via detailed monitoring of forest cover, loss and gain (hectares), and periodic estimates of changes in forest carbon density (tons ha⁻¹). Satellites provide an opportunity to monitor changes in forest carbon caused by deforestation and degradation, but only after initial carbon densities have been assessed. New airborne approaches, especially light detection and ranging (LiDAR), provide a means to estimate forest carbon density over large areas, which greatly assists in the development of practical baselines. Here I present an integrated satellite-airborne mapping approach that supports high-resolution carbon stock assessment and monitoring in tropical forest regions. The approach yields a spatially resolved, regional state-of-the-forest carbon baseline, followed by high-resolution monitoring of forest cover and disturbance to estimate carbon emissions. Rapid advances and decreasing costs in the satellite and airborne mapping sectors are already making high-resolution carbon stock and emissions assessments viable anywhere in the world.

  20. Extracting Cross-Ontology Weighted Association Rules from Gene Ontology Annotations.

    Agapito, Giuseppe; Milano, Marianna; Guzzi, Pietro Hiram; Cannataro, Mario

    2016-01-01

    Gene Ontology (GO) is a structured repository of concepts (GO Terms) that are associated to one or more gene products through a process referred to as annotation. The analysis of annotated data is an important opportunity for bioinformatics. Among the different approaches to such analysis is the use of association rules (AR), which provide useful knowledge by discovering biologically relevant associations between GO terms that were not previously known. In a previous work, we introduced GO-WAR (Gene Ontology-based Weighted Association Rules), a methodology for extracting weighted association rules from ontology-based annotated datasets. Here we adapt the GO-WAR algorithm to mine cross-ontology association rules, i.e., rules that involve GO terms present in the three sub-ontologies of GO. We conduct a deep performance evaluation of GO-WAR by mining publicly available GO annotated datasets, showing how GO-WAR outperforms current state-of-the-art approaches.
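
    The basic quantities behind such rules can be illustrated with a small sketch that computes the support and confidence of a cross-ontology rule (a biological-process term implying a cellular-component term) over per-gene annotation sets. The gene sets are invented and the weighting scheme of GO-WAR is not reproduced.

      # Toy sketch of a cross-ontology association rule over GO annotations:
      # support and confidence of "BP term -> CC term" over per-gene annotation sets.
      # Gene sets are invented; this is not the GO-WAR algorithm.

      annotations = {
          "geneA": {"GO:0006915", "GO:0005739"},   # apoptotic process (BP), mitochondrion (CC)
          "geneB": {"GO:0006915", "GO:0005739"},
          "geneC": {"GO:0006915"},
          "geneD": {"GO:0005739"},
      }

      def rule_stats(antecedent, consequent, data):
          n = len(data)
          both = sum(1 for terms in data.values() if antecedent in terms and consequent in terms)
          ante = sum(1 for terms in data.values() if antecedent in terms)
          support = both / n
          confidence = both / ante if ante else 0.0
          return support, confidence

      s, c = rule_stats("GO:0006915", "GO:0005739", annotations)
      print(f"support={s:.2f} confidence={c:.2f}")   # support=0.50 confidence=0.67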

  1. A conformal mapping approach to a root-clustering problem

    Melnikov, Gennady I; Dudarenko, Nataly A; Melnikov, Vitaly G

    2014-01-01

    This paper presents a new approach for matrix root-clustering in sophisticated and multiply-connected regions of the complex plane. The parametric sweeping method and a concept of the closed forbidden region covered by a set of modified three-parametrical Cassini regions are used. A conformal mapping approach was applied to formulate the main results of the paper. An application of the developed method to the problem of matrix root-clustering in a multiply connected region is shown for illustration

  2. Use artificial neural network to align biological ontologies.

    Huang, Jingshan; Dang, Jiangbo; Huhns, Michael N; Zheng, W Jim

    2008-09-16

    Being formal, declarative knowledge representation models, ontologies help to address the problem of imprecise terminologies in biological and biomedical research. However, ontologies constructed under the auspices of the Open Biomedical Ontologies (OBO) group have exhibited a great deal of variety, because different parties can design ontologies according to their own conceptual views of the world. It is therefore becoming critical to align ontologies from different parties. During automated/semi-automated alignment across biological ontologies, different semantic aspects, i.e., concept name, concept properties, and concept relationships, contribute to alignment results in different degrees. Therefore, a vector of weights must be assigned to these semantic aspects. It is not trivial to determine what those weights should be, and current methodologies depend heavily on human heuristics. In this paper, we take an artificial neural network approach to learn and adjust these weights, and thereby support a new ontology alignment algorithm, customized for biological ontologies, with the purpose of avoiding some disadvantages of both rule-based and learning-based aligning algorithms. This approach has been evaluated by aligning two real-world biological ontologies, whose features include huge file size, very few instances, concept names in numerical strings, and others. The promising experiment results verify our proposed hypothesis, i.e., three weights for semantic aspects learned from a subset of concepts are representative of all concepts in the same ontology. Therefore, our method represents a large leap forward towards automating biological ontology alignment.
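
    The idea of learning a weight vector over the three semantic aspects can be sketched with a tiny gradient-descent loop over labelled concept pairs. This is only an illustration of the principle under assumed toy data; it is not the network architecture used in the paper.

      # Minimal sketch of learning weights for three semantic aspects (name, properties,
      # relationships) from labelled concept pairs via a simple gradient step on squared
      # error. Training data is invented.

      # Each pair: (similarity per aspect, 1.0 if the concepts truly match else 0.0)
      training = [
          ((0.9, 0.8, 0.7), 1.0),
          ((0.2, 0.3, 0.1), 0.0),
          ((0.8, 0.6, 0.9), 1.0),
          ((0.1, 0.4, 0.2), 0.0),
      ]

      weights = [1 / 3] * 3          # start with equal weights
      lr = 0.1

      def combined(sims, w):
          return sum(s * wi for s, wi in zip(sims, w))

      for _ in range(200):
          for sims, label in training:
              error = combined(sims, weights) - label
              weights = [wi - lr * error * s for wi, s in zip(weights, sims)]

      # Normalise so the learned weights sum to 1 and read as relative importance.
      total = sum(weights)
      print([round(w / total, 2) for w in weights])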

  3. Physico-empirical approach for mapping soil hydraulic behaviour

    G. D'Urso

    1997-01-01

    Full Text Available Abstract: Pedo-transfer functions are largely used in soil hydraulic characterisation of large areas. The use of physico-empirical approaches for the derivation of soil hydraulic parameters from disturbed samples data can be greatly enhanced if a characterisation performed on undisturbed cores of the same type of soil is available. In this study, an experimental procedure for deriving maps of soil hydraulic behaviour is discussed with reference to its application in an irrigation district (30 km²) in southern Italy. The main steps of the proposed procedure are: (i) the precise identification of soil hydraulic functions from undisturbed sampling of main horizons in representative profiles for each soil map unit; (ii) the determination of pore-size distribution curves from larger disturbed sampling data sets within the same soil map unit; (iii) the calibration of physical-empirical methods for retrieving soil hydraulic parameters from particle-size data and undisturbed soil sample analysis; (iv) the definition of functional hydraulic properties from water balance output; and (v) the delimitation of soil hydraulic map units based on functional properties.

  4. Aspect OntoMaven - Aspect-Oriented Ontology Development and Configuration With OntoMaven

    Paschke, Adrian; Schaefermeier, Ralph

    2015-01-01

    In agile ontology-based software engineering projects support for modular reuse of ontologies from large existing remote repositories, ontology project life cycle management, and transitive dependency management are important needs. The contribution of this paper is a new design artifact called OntoMaven combined with a unified approach to ontology modularization, aspect-oriented ontology development, which was inspired by aspect-oriented programming. OntoMaven adopts the Apache Maven-based d...

  5. Using C-OWL for the Alignment and Merging of Medical Ontologies

    Stuckenschmidt, Heiner; van Harmelen, Frank; Serafini, Luciano; Bouquet, Paolo; Giunchiglia, Fausto

    2004-01-01

    A number of sophisticated medical ontologies have been created over the past years. With their development, the need for supporting the alignment of different ontologies is gaining importance. We proposed C-OWL, an extension of the Web Ontology Language OWL that supports alignment mappings between ontologies.

  6. Ontology Alignment Repair through Modularization and Confidence-Based Heuristics.

    Emanuel Santos

    Full Text Available Ontology Matching aims at identifying a set of semantic correspondences, called an alignment, between related ontologies. In recent years, there has been a growing interest in efficient and effective matching methods for large ontologies. However, alignments produced for large ontologies are often logically incoherent. It was only recently that the use of repair techniques to improve the coherence of ontology alignments began to be explored. This paper presents a novel modularization technique for ontology alignment repair which extracts fragments of the input ontologies that only contain the necessary classes and relations to resolve all detectable incoherences. The paper presents also an alignment repair algorithm that uses a global repair strategy to minimize both the degree of incoherence and the number of mappings removed from the alignment, while overcoming the scalability problem by employing the proposed modularization technique. Our evaluation shows that our modularization technique produces significantly small fragments of the ontologies and that our repair algorithm produces more complete alignments than other current alignment repair systems, while obtaining an equivalent degree of incoherence. Additionally, we also present a variant of our repair algorithm that makes use of the confidence values of the mappings to improve alignment repair. Our repair algorithm was implemented as part of AgreementMakerLight, a free and open-source ontology matching system.

  7. Ontology Alignment Repair through Modularization and Confidence-Based Heuristics.

    Santos, Emanuel; Faria, Daniel; Pesquita, Catia; Couto, Francisco M

    2015-01-01

    Ontology Matching aims at identifying a set of semantic correspondences, called an alignment, between related ontologies. In recent years, there has been a growing interest in efficient and effective matching methods for large ontologies. However, alignments produced for large ontologies are often logically incoherent. It was only recently that the use of repair techniques to improve the coherence of ontology alignments began to be explored. This paper presents a novel modularization technique for ontology alignment repair which extracts fragments of the input ontologies that only contain the necessary classes and relations to resolve all detectable incoherences. The paper presents also an alignment repair algorithm that uses a global repair strategy to minimize both the degree of incoherence and the number of mappings removed from the alignment, while overcoming the scalability problem by employing the proposed modularization technique. Our evaluation shows that our modularization technique produces significantly small fragments of the ontologies and that our repair algorithm produces more complete alignments than other current alignment repair systems, while obtaining an equivalent degree of incoherence. Additionally, we also present a variant of our repair algorithm that makes use of the confidence values of the mappings to improve alignment repair. Our repair algorithm was implemented as part of AgreementMakerLight, a free and open-source ontology matching system.
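
    A much simplified repair heuristic in the spirit of the strategy described above is sketched here: given conflict sets of jointly incoherent mappings (assumed to have been detected by a reasoner elsewhere), it removes mappings that resolve many conflicts, breaking ties in favour of low confidence. The mappings and conflict sets are invented.

      # Simplified confidence-based alignment repair: remove as few mappings as possible
      # while breaking every conflict set. Conflict detection is assumed done elsewhere.

      mappings = {               # mapping id -> confidence
          "m1": 0.95, "m2": 0.60, "m3": 0.80, "m4": 0.55,
      }
      conflict_sets = [          # each set is jointly incoherent; at least one member must go
          {"m1", "m2"},
          {"m2", "m3", "m4"},
      ]

      def repair(mappings, conflicts):
          removed = set()
          def unresolved():
              return [c for c in conflicts if not (c & removed)]
          while unresolved():
              open_sets = unresolved()
              candidates = set().union(*open_sets)
              # prefer mappings that resolve many conflicts; break ties on low confidence
              victim = max(candidates,
                           key=lambda m: (sum(m in c for c in open_sets), -mappings[m]))
              removed.add(victim)
          return removed

      print(repair(mappings, conflict_sets))   # {'m2'}: one removal resolves both conflicts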

  8. CRAVE: a database, middleware and visualization system for phenotype ontologies.

    Gkoutos, Georgios V; Green, Eain C J; Greenaway, Simon; Blake, Andrew; Mallon, Ann-Marie; Hancock, John M

    2005-04-01

    A major challenge in modern biology is to link genome sequence information to organismal function. In many organisms this is being done by characterizing phenotypes resulting from mutations. Efficiently expressing phenotypic information requires combinatorial use of ontologies. However tools are not currently available to visualize combinations of ontologies. Here we describe CRAVE (Concept Relation Assay Value Explorer), a package allowing storage, active updating and visualization of multiple ontologies. CRAVE is a web-accessible JAVA application that accesses an underlying MySQL database of ontologies via a JAVA persistent middleware layer (Chameleon). This maps the database tables into discrete JAVA classes and creates memory resident, interlinked objects corresponding to the ontology data. These JAVA objects are accessed via calls through the middleware's application programming interface. CRAVE allows simultaneous display and linking of multiple ontologies and searching using Boolean and advanced searches.

  9. Quality control for terms and definitions in ontologies and taxonomies

    Rüegg Alexander

    2006-04-01

    Full Text Available Abstract Background: Ontologies and taxonomies are among the most important computational resources for molecular biology and bioinformatics. A series of recent papers has shown that the Gene Ontology (GO), the most prominent taxonomic resource in these fields, is marked by flaws of certain characteristic types, which flow from a failure to address basic ontological principles. As yet, no methods have been proposed which would allow ontology curators to pinpoint flawed terms or definitions in ontologies in a systematic way. Results: We present computational methods that automatically identify terms and definitions which are defined in a circular or unintelligible way. We further demonstrate the potential of these methods by applying them to isolate a subset of 6001 problematic GO terms. By automatically aligning GO with other ontologies and taxonomies we were able to propose alternative synonyms and definitions for some of these problematic terms. This allows us to demonstrate that these other resources do not contain definitions superior to those supplied by GO. Conclusion: Our methods provide reliable indications of the quality of terms and definitions in ontologies and taxonomies. Further, they are well suited to assist ontology curators in drawing their attention to those terms that are ill-defined. We have further shown the limitations of ontology mapping and alignment in assisting ontology curators in rectifying problems, thus pointing to the need for manual curation.
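
    One of the simpler checks such quality control can perform is flagging definitions that mention the very term they define. A minimal sketch, with invented example definitions, follows.

      # Sketch of a circular-definition check: flag a term whose textual definition
      # contains the term itself. Example definitions are invented, not from GO.

      def is_circular(term: str, definition: str) -> bool:
          return term.lower() in definition.lower()

      candidates = {
          "cell growth": "The process of cell growth.",                      # circular
          "apoptosis": "A programmed cell death process ending in the destruction of the cell.",
      }

      for term, definition in candidates.items():
          if is_circular(term, definition):
              print(f"Possibly circular definition: '{term}'")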

  10. Development of erosion risk map using fuzzy logic approach

    Fauzi Manyuk

    2017-01-01

    Full Text Available Erosion-hazard assessment is an important aspect of the management of a river basin such as the Siak River Basin, Riau Province, Indonesia. This study presents an application of a fuzzy logic approach to develop an erosion risk map based on a geographic information system. Fuzzy logic is a computing approach based on “degrees of truth” rather than the usual “true or false” (1 or 0) Boolean logic on which the modern computer is based. The results of the erosion risk map were verified using field measurements. The verification result shows that the soil-erodibility parameter (K) indicates a good agreement with field measurement data. The classes of soil-erodibility (K) resulting from the validation were: very low (0.0-0.1), medium (0.21-0.32), high (0.44-0.55) and very high (0.56-0.64). The results obtained from this study show that the erosion risk map of the Siak River Basin was dominantly classified as medium level, which covers about 68.54% of the basin. The other classifications were high and very low erosion levels, which cover about 28.84% and 2.61% respectively.
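
    The “degrees of truth” idea can be contrasted with the crisp K classes reported above in a short sketch; the crisp ranges come from the abstract, while the triangular membership breakpoints are illustrative assumptions.

      # Crisp soil-erodibility classes (ranges from the abstract) versus a triangular
      # fuzzy membership for "medium"; the membership breakpoints are assumptions.

      def crisp_class(k: float) -> str:
          if k <= 0.1:            return "very low"
          if 0.21 <= k <= 0.32:   return "medium"
          if 0.44 <= k <= 0.55:   return "high"
          if 0.56 <= k <= 0.64:   return "very high"
          return "unclassified"

      def triangular(x, a, b, c):
          """Degree of membership of x in a triangular fuzzy set (a, b, c)."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      k = 0.30
      print(crisp_class(k))                               # medium
      print(round(triangular(k, 0.21, 0.265, 0.32), 2))   # fuzzy degree of "medium"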

  11. Changing energy-related behavior: An Intervention Mapping approach

    Kok, Gerjo; Lo, Siu Hing; Peters, Gjalt-Jorn Y.; Ruiter, Robert A.C.

    2011-01-01

    This paper's objective is to apply Intervention Mapping, a planning process for the systematic development of theory- and evidence-based health promotion interventions, to the development of interventions to promote energy conservation behavior. Intervention Mapping (IM) consists of six steps: needs assessment, program objectives, methods and applications, program development, planning for program implementation, and planning for program evaluation. Examples from the energy conservation field are provided to illustrate the activities associated with these steps. It is concluded that applying IM in the energy conservation field may help the development of effective behavior change interventions, and thus develop a domain specific knowledge-base for effective intervention design. - Highlights: → Intervention Mapping (IM) is a planning process for developing evidence-based interventions.→ IM takes a problem-driven rather than theory-driven approach. → IM can be applied to the promotion of energy-conservation in a multilevel approach. → IM helps identifying determinants of behaviors and environmental conditions. → IM helps selecting appropriate theory-based methods and practical applications.

  12. Changing energy-related behavior: An Intervention Mapping approach

    Kok, Gerjo, E-mail: g.kok@maastrichtuniversity.nl [Department of Work and Social Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht (Netherlands); Lo, Siu Hing, E-mail: siu-hing.lo@maastrichtuniversity.nl [Department of Work and Social Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht (Netherlands); Peters, Gjalt-Jorn Y., E-mail: gj.peters@maastrichtuniversity.nl [Department of Work and Social Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht (Netherlands); Ruiter, Robert A.C., E-mail: r.ruiter@maastrichtuniversity.nl [Department of Work and Social Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht (Netherlands)

    2011-09-15

    This paper's objective is to apply Intervention Mapping, a planning process for the systematic development of theory- and evidence-based health promotion interventions, to the development of interventions to promote energy conservation behavior. Intervention Mapping (IM) consists of six steps: needs assessment, program objectives, methods and applications, program development, planning for program implementation, and planning for program evaluation. Examples from the energy conservation field are provided to illustrate the activities associated with these steps. It is concluded that applying IM in the energy conservation field may help the development of effective behavior change interventions, and thus develop a domain specific knowledge-base for effective intervention design. - Highlights: > Intervention Mapping (IM) is a planning process for developing evidence-based interventions.> IM takes a problem-driven rather than theory-driven approach. > IM can be applied to the promotion of energy-conservation in a multilevel approach. > IM helps identifying determinants of behaviors and environmental conditions. > IM helps selecting appropriate theory-based methods and practical applications.

  13. Making methodology a matter of process ontology

    Revsbæk, Line

    2016-01-01

    This paper presents a practice of doing qualitative interview analysis from the insights of the process ontology in G. H. Mead’s Philosophy of the Present (1932). The paper presents two cases of analyzing in the present while listening to recorded interview material eliciting the researcher’s case study and otherwise related experiences, creating case narratives inclusive of the researcher’s reflexive voice. The paper presents an auto-ethnographic approach to data analysis based on process theory ontology.

  14. New approach on seismic hazard isoseismal map for Romania

    Marmureanu, Gheorghe; Cioflan, Carmen Ortanza; Marmureanu, Alexandru

    2008-01-01

    The seismicity of Romania comes from the energy released by crustal earthquakes, which have a depth of no more than 40 km, and by the intermediate earthquakes coming from the Vrancea region (a unique case in Europe) with a depth between 60 and 200 km. The authors developed the concept of a 'control earthquake' and equations to obtain the banana shape of the attenuation curves of the macroseismic intensity I (along the directions defined by azimuth Az) in the case of a Vrancea earthquake at a depth 80 < x < 160 km. Deterministic and probabilistic approaches were used, both linear and nonlinear. The final map is in MMI intensity (isoseismal map) for the maximum possible Vrancea earthquake, with Richter magnitude MGR 7.5. This will avoid drawbacks for civil structural designers and for insurance companies, which pay for all damages and life losses as a function of earthquake intensity. (authors)

  15. A Proposition Of Knowledge Management Methodology For The Purpose Of Reasoning With The Use Of An Upper-Ontology

    Kamil Szymański

    2007-01-01

    Full Text Available This article describes a proposition of knowledge organization for the purpose of reasoning using an upper-ontology. It presents a model of integrated ontologies architecture which consists of a domain ontologies layer with instances, a shared upper-ontology layer with additional rules and a layer of ontologies mapping concrete domain ontologies with the upper-ontology. Thanks to the upper-ontology, new facts were concluded from domain ontologies during the reasoning process. A practical realization proposition is given as well. It is based on some popular Semantic Web technologies and tools, such as OWL, SWRL, nRQL, Protégé and Racer.

  16. Constructive Ontology Engineering

    Sousan, William L.

    2010-01-01

    The proliferation of the Semantic Web depends on ontologies for knowledge sharing, semantic annotation, data fusion, and descriptions of data for machine interpretation. However, ontologies are difficult to create and maintain. In addition, their structure and content may vary depending on the application and domain. Several methods described in…

  17. Ontology-Driven Translator Generator for Data Display Configurations

    Jones, Charles

    2004-01-01

    .... In addition, the method includes the specification of mappings between a language-specific ontology and its corresponding syntax specification, that is, either an eXtensible Markup Language (XML...

  18. Toward the Use of an Upper Ontology for U.S. Government and U.S. Military Domains: An Evaluation

    Semy, Salim K; Pulvermacher, Mary K; Obrst, Leo J

    2004-01-01

    ...) of data and ultimately of applications. Key to the vision of a Semantic Web is the ability to capture data and application semantics in ontologies and map these ontologies together via related concepts...

  19. A Semantics-Based Approach to Retrieving Biomedical Information

    Andreasen, Troels; Bulskov, Henrik; Zambach, Sine

    2011-01-01

    This paper describes an approach to representing, organising, and accessing conceptual content of biomedical texts using a formal ontology. The ontology is based on UMLS resources supplemented with domain ontologies developed in the project. The approach introduces the notion of ‘generative ontologies’, i.e., ontologies providing increasingly specialised concepts reflecting the phrase structure of natural language. Furthermore, we propose a novel so-called ontological semantics which maps noun phrases from texts and queries into nodes in the generative ontology. This enables an advanced form of data mining of texts, identifying paraphrases and concept relations and measuring distances between key concepts in texts. Thus, the project is distinct in its attempt to provide a formal underpinning of conceptual similarity or relatedness of meaning.

  20. Ion Channel ElectroPhysiology Ontology (ICEPO) - a case study of text mining assisted ontology development.

    Elayavilli, Ravikumar Komandur; Liu, Hongfang

    2016-01-01

    Computational modeling of biological cascades is of great interest to quantitative biologists. Biomedical text has been a rich source for quantitative information. Gathering quantitative parameters and values from biomedical text is one significant challenge in the early steps of computational modeling as it involves huge manual effort. While automatically extracting such quantitative information from bio-medical text may offer some relief, lack of ontological representation for a subdomain serves as an impediment to normalizing textual extractions to a standard representation. This may render textual extractions less meaningful to the domain experts. In this work, we propose a rule-based approach to automatically extract relations involving quantitative data from biomedical text describing ion channel electrophysiology. We further translated the quantitative assertions extracted through text mining to a formal representation that may help in constructing an ontology for ion channel events using a rule based approach. We have developed the Ion Channel ElectroPhysiology Ontology (ICEPO) by integrating the information represented in closely related ontologies such as the Cell Physiology Ontology (CPO) and the Cardiac Electro Physiology Ontology (CPEO), and the knowledge provided by domain experts. The rule-based system achieved an overall F-measure of 68.93% in extracting quantitative data assertions on an independently annotated blind data set. We further made an initial attempt in formalizing the quantitative data assertions extracted from the biomedical text into a formal representation that offers potential to facilitate the integration of text mining into ontological workflow, a novel aspect of this study. This work is a case study where we created a platform that provides formal interaction between ontology development and text mining. We have achieved partial success in extracting quantitative assertions from the biomedical text and formalizing them in ontological
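
    The flavour of such rule-based extraction can be shown with a single regular expression that pulls quantity-unit pairs typical of ion-channel electrophysiology out of a sentence. The sentence and the pattern are illustrative only and do not reproduce the ICEPO pipeline.

      # Minimal rule-based extraction of quantitative values (quantity + unit) from text.
      # The sentence and the unit list are illustrative assumptions.

      import re

      sentence = ("The channel activated at -30 mV with a time constant of 2.5 ms "
                  "and a conductance of 12 pS.")

      # a number followed by a unit commonly used in ion-channel electrophysiology
      pattern = re.compile(r"(-?\d+(?:\.\d+)?)\s*(mV|ms|pS|nA|pA)")

      for value, unit in pattern.findall(sentence):
          print({"value": float(value), "unit": unit})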

  1. An Image Encryption Approach Using a Shuffling Map

    Xiao Yongliang; Xia Limin

    2009-01-01

    A new image encryption approach is proposed. First, a sort transformation based on nonlinear chaotic algorithm is used to shuffle the positions of image pixels. Then the states of hyper-chaos are used to change the grey values of the shuffled image according to the changed chaotic values of the same position between the above nonlinear chaotic sequence and the sorted chaotic sequence. The experimental results demonstrate that the image encryption scheme based on a shuffling map shows advantages of large key space and high-level security. Compared with some encryption algorithms, the suggested encryption scheme is more secure. (general)
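
    The position-shuffling step can be sketched by sorting a logistic-map chaotic sequence and using the sort order as a pixel permutation. The hyper-chaotic grey-value substitution step is omitted, and the map parameters are illustrative, not those of the proposed scheme.

      # Chaotic pixel shuffling: sort a logistic-map sequence and use the sort order as
      # a permutation of pixel positions. Parameters are illustrative only.

      import numpy as np

      def logistic_sequence(n, x0=0.7, r=3.99):
          xs = np.empty(n)
          x = x0
          for i in range(n):
              x = r * x * (1 - x)
              xs[i] = x
          return xs

      def shuffle_pixels(image: np.ndarray, key=0.7):
          flat = image.ravel()
          perm = np.argsort(logistic_sequence(flat.size, x0=key))   # chaotic permutation
          return flat[perm].reshape(image.shape), perm

      def unshuffle_pixels(shuffled: np.ndarray, perm):
          flat = np.empty_like(shuffled.ravel())
          flat[perm] = shuffled.ravel()
          return flat.reshape(shuffled.shape)

      img = np.arange(16, dtype=np.uint8).reshape(4, 4)
      enc, perm = shuffle_pixels(img)
      assert np.array_equal(unshuffle_pixels(enc, perm), img)   # shuffling is invertible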

  2. Practical ontologies for information professionals

    AUTHOR|(CDS)2071712

    2016-01-01

    Practical Ontologies for Information Professionals provides an introduction to ontologies and their development, an essential tool for fighting back against information overload. The development of robust and widely used ontologies is an increasingly important tool in the fight against information overload. The publishing and sharing of explicit explanations for a wide variety of conceptualizations, in a machine readable format, has the power to both improve information retrieval and identify new knowledge. This new book provides an accessible introduction to the following: * What is an ontology? Defining the concept and why it is increasingly important to the information professional * Ontologies and the semantic web * Existing ontologies, such as SKOS, OWL, FOAF, schema.org, and the DBpedia Ontology * Adopting and building ontologies, showing how to avoid repetition of work and how to build a simple ontology with Protege * Interrogating semantic web ontologies * The future of ontologies and the role of the ...

  3. Mapping topographic plant location properties using a dense matching approach

    Niederheiser, Robert; Rutzinger, Martin; Lamprecht, Andrea; Bardy-Durchhalter, Manfred; Pauli, Harald; Winkler, Manuela

    2017-04-01

    Within the project MEDIALPS (Disentangling anthropogenic drivers of climate change impacts on alpine plant species: Alps vs. Mediterranean mountains), six Alpine and Mediterranean mountain regions are investigated to assess how plant species respond to climate change. The project is embedded in the Global Observation Research Initiative in Alpine Environments (GLORIA), which is a well-established global monitoring initiative for systematic observation of changes in the plant species composition and soil temperature on mountain summits worldwide to discern accelerating climate change pressures on these fragile alpine ecosystems. Close-range sensing techniques such as terrestrial photogrammetry are well suited for mapping terrain topography of small areas with high resolution. Lightweight equipment, flexible positioning for image acquisition in the field, and independence from weather conditions (i.e. wind) make this a feasible method for in-situ data collection. New developments in dense matching approaches allow high quality 3D terrain mapping with fewer requirements for field set-up. However, challenges occur in post-processing and required data storage if many sites have to be mapped. Within MEDIALPS, dense matching is used for mapping high resolution topography for 284 3x3 meter plots, deriving information on vegetation coverage, roughness, slope, aspect and modelled solar radiation. This information helps identify types of topography-dependent ecological growing conditions and evaluate the potential for existing refugial locations for specific plant species under climate change. This research is conducted within the project MEDIALPS - Disentangling anthropogenic drivers of climate change impacts on alpine plant species: Alps vs. Mediterranean mountains, funded by the Earth System Sciences Programme of the Austrian Academy of Sciences.

  4. Reactive Leadership: Divining, Developing, and Demonstrating Community Ontologies

    Graybeal, J.

    2008-12-01

    The Marine Metadata Interoperability Project (known as MMI, on the web at http://marinemetadata.org) was formed to provide leadership in metadata practices to the marine science community. In 2004 this meant finding and writing about resources and best practices, which until then were all but invisible. In 2008 the scope is far wider, encompassing comprehensive guidance, collaborative community environments, and introduction and demonstration of advanced technologies to an increasingly interested scientific domain. MMI's technical leadership, based on experiences gained in the hydrologic community, emphasized the role ontologies could play in marine science. An early MMI workshop successfully incorporated a large number of community vocabularies, tools to harmonize them in a common ontological format, and the mapping of terms from vocabularies expressed in that format. That 2005 workshop demonstrated the connections to be made among different community vocabularies, and was well regarded by participants, but did not lead to widespread adoption of the tools, technologies, or even the vocabularies. Ontology development efforts for marine sensors and platforms showed intermittent progress, but again were not adopted or pushed toward completion. It is now 2008, and the marine community is increasingly attentive to a wide range of interoperability issues. A large part of the community has at least heard of "semantic interoperability", and many understand its critical role in finding and working with data. Demand for specific solutions, and for workable approaches, is becoming more vocal in the marine community. Yet there is still no encompassing model in place for achieving semantic interoperability, only simple operational registries have been set up for oceanographic community vocabularies, and only a few isolated applications demonstrate how semantic barriers can be overcome. Why has progress been so slow? Are good answers on the horizon? And if we build it, will the

  5. Ontological foundations for evolutionary economics: A Darwinian social ontology

    Stoelhorst, J.W.

    2008-01-01

    The purpose of this paper is to further the project of generalized Darwinism by developing a social ontology on the basis of a combined commitment to ontological continuity and ontological commonality. Three issues that are central to the development of a social ontology are addressed: (1) the

  6. A filtering approach to edge preserving MAP estimation of images.

    Humphrey, David; Taubman, David

    2011-05-01

    The authors present a computationally efficient technique for maximum a posteriori (MAP) estimation of images in the presence of both blur and noise. The image is divided into statistically independent regions. Each region is modelled with a WSS Gaussian prior. Classical Wiener filter theory is used to generate a set of convex sets in the solution space, with the solution to the MAP estimation problem lying at the intersection of these sets. The proposed algorithm uses an underlying segmentation of the image, and a means of determining the segmentation and refining it are described. The algorithm is suitable for a range of image restoration problems, as it provides a computationally efficient means to deal with the shortcomings of Wiener filtering without sacrificing the computational simplicity of the filtering approach. The algorithm is also of interest from a theoretical viewpoint as it provides a continuum of solutions between Wiener filtering and Inverse filtering depending upon the segmentation used. We do not attempt to show here that the proposed method is the best general approach to the image reconstruction problem. However, related work referenced herein shows excellent performance in the specific problem of demosaicing.
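
    The classical Wiener filter that the regionalised MAP estimator builds on can be written in a few lines of numpy, as a frequency-domain deconvolution. The blur kernel, noise level and random test image below are toy assumptions, not the paper's experimental setup.

      # Frequency-domain Wiener deconvolution sketch; blur, noise and image are toy data.

      import numpy as np

      def wiener_deconvolve(blurred, psf, nsr=0.01):
          """Restore an image blurred by `psf`, with noise-to-signal ratio `nsr`."""
          H = np.fft.fft2(psf, s=blurred.shape)
          G = np.fft.fft2(blurred)
          W = np.conj(H) / (np.abs(H) ** 2 + nsr)       # Wiener filter transfer function
          return np.real(np.fft.ifft2(W * G))

      rng = np.random.default_rng(0)
      image = rng.random((64, 64))
      psf = np.ones((3, 3)) / 9.0                        # simple box blur
      blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(psf, s=image.shape)))
      restored = wiener_deconvolve(blurred + 0.01 * rng.standard_normal(image.shape), psf)
      print(float(np.mean((restored - image) ** 2)))     # restoration error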

  7. A standards-based ontology and support for Big Data Analytics in the insurance industry

    Dimitrios A. Koutsomitropoulos

    2017-06-01

    Full Text Available Standardization efforts have led to the emergence of conceptual models in the insurance industry. Simultaneously, the proliferation of digital information poses new challenges for the efficient management and analysis of available data. Based on the property and casualty data model, we propose an OWL ontology to represent insurance processes and to map large data volumes collected in traditional data stores. By virtue of reasoning, we demonstrate a set of semantic queries using the ontology vocabulary that can simplify analytics and deduce implicit facts from these data. We compare this mapping approach to data in native RDF format, as in a triple store. As proof of concept, we use a large anonymized dataset of car policies from an actual insurance company.
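
    Querying such an ontology-backed store can be sketched with rdflib; the ontology file, class and property names below are hypothetical placeholders rather than the actual property and casualty model.

      # Sketch of a semantic query over an insurance ontology with rdflib.
      # File name, class and property names are hypothetical.

      from rdflib import Graph

      g = Graph()
      g.parse("insurance.ttl", format="turtle")   # assumed local copy of ontology + policy data

      query = """
      PREFIX ins: <http://example.org/insurance#>
      SELECT ?policy ?premium WHERE {
          ?policy a ins:MotorPolicy ;
                  ins:hasAnnualPremium ?premium .
          FILTER (?premium > 1000)
      }
      """

      for row in g.query(query):
          print(row.policy, row.premium)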

  8. AN ONTOLOGY-BASED COMPETENCE MANAGEMENT SYSTEM FOR IT COMPANIES

    Cristina NICULESCU; Stefan TRAUSAN-MATU

    2009-01-01

    The paper presents a generic framework for an intelligent information system for competence management in information technology companies, based on ontologies. The advantage of using an ontology-based system is the possibility of identifying new relations among concepts through inferences over the existing knowledge. In our approach, the inferences may be performed by a reasoning engine, using classifiers in the Description Logics tab associated with the Protégé ontology e...

  9. Perspectives on ontology learning

    Lehmann, J

    2014-01-01

    Perspectives on Ontology Learning brings together researchers and practitioners from different communities (natural language processing, machine learning, and the semantic web) in order to give an interdisciplinary overview of recent advances in ontology learning. Starting with a comprehensive introduction to the theoretical foundations of ontology learning methods, the edited volume presents the state of the art in automated knowledge acquisition and maintenance. It outlines future challenges in this area with a special focus on technologies suitable for pushing the boundaries beyond the c

  10. Localization of canine brachycephaly using an across breed mapping approach.

    Danika Bannasch

    2010-03-01

    Full Text Available The domestic dog, Canis familiaris, exhibits profound phenotypic diversity and is an ideal model organism for the genetic dissection of simple and complex traits. However, some of the most interesting phenotypes are fixed in particular breeds and are therefore less tractable to genetic analysis using classical segregation-based mapping approaches. We implemented an across breed mapping approach using a moderately dense SNP array, a low number of animals and breeds carefully selected for the phenotypes of interest to identify genetic variants responsible for breed-defining characteristics. Using a modest number of affected (10-30) and control (20-60) samples from multiple breeds, the correct chromosomal assignment was identified in a proof of concept experiment using three previously defined loci: hyperuricosuria, white spotting and chondrodysplasia. Genome-wide association was performed in a similar manner for one of the most striking morphological traits in dogs: brachycephalic head type. Although candidate gene approaches based on comparable phenotypes in mice and humans have been utilized for this trait, the causative gene has remained elusive using this method. Samples from nine affected breeds and thirteen control breeds identified strong genome-wide associations for brachycephalic head type on Cfa 1. Two independent datasets identified the same genomic region. Levels of relative heterozygosity in the associated region indicate that it has been subjected to a selective sweep, consistent with it being a breed defining morphological characteristic. Genotyping additional dogs in the region confirmed the association. To date, the genetic structure of dog breeds has primarily been exploited for genome wide association for segregating traits. These results demonstrate that non-segregating traits under strong selection are equally tractable to genetic analysis using small sample numbers.

  11. Ontology-based approaches for cross-enterprise collaboration: a literature review on semantic business process management

    Hoang, Hanh H.; Jung, Jason J.; Tran, Chi P.

    2014-11-01

    Based on an in-depth analysis of the existing approaches to applying semantic technologies to business process management (BPM) research from the perspective of cross-enterprise collaboration, or so-called business-to-business integration, we analyse, discuss and compare methodologies, applications and best practices of the surveyed approaches against the proposed criteria. This article identifies various relevant research directions in semantic BPM (SBPM). Based on the results of our investigation, we summarise the state of the art of SBPM. We also address areas and directions for further research activities.

  12. BiOSS: A system for biomedical ontology selection.

    Martínez-Romero, Marcos; Vázquez-Naya, José M; Pereira, Javier; Pazos, Alejandro

    2014-04-01

    In biomedical informatics, ontologies are considered a key technology for annotating, retrieving and sharing the huge volume of publicly available data. Due to the increasing amount, complexity and variety of existing biomedical ontologies, choosing the ones to be used in a semantic annotation problem or to design a specific application is a difficult task. As a consequence, the design of approaches and tools addressed to facilitate the selection of biomedical ontologies is becoming a priority. In this paper we present BiOSS, a novel system for the selection of biomedical ontologies. BiOSS evaluates the adequacy of an ontology to a given domain according to three different criteria: (1) the extent to which the ontology covers the domain; (2) the semantic richness of the ontology in the domain; (3) the popularity of the ontology in the biomedical community. BiOSS has been applied to 5 representative problems of ontology selection. It also has been compared to existing methods and tools. Results are promising and show the usefulness of BiOSS to solve real-world ontology selection problems. BiOSS is openly available both as a web tool and a web service. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
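
    The three criteria can be pictured as a weighted score per ontology, as in the toy sketch below; the scores, weights and candidate ontologies are invented and do not reproduce BiOSS's actual metrics.

      # Toy ranking of ontologies by a weighted combination of coverage, richness and
      # popularity. Scores and weights are invented for illustration.

      ontologies = {
          # name: (coverage, richness, popularity), each already normalised to [0, 1]
          "SNOMED CT": (0.85, 0.90, 0.95),
          "MeSH":      (0.80, 0.60, 0.90),
          "NCIt":      (0.75, 0.85, 0.70),
      }

      weights = (0.5, 0.3, 0.2)   # relative importance of the three criteria

      def score(metrics, weights):
          return sum(m * w for m, w in zip(metrics, weights))

      ranking = sorted(ontologies.items(), key=lambda kv: score(kv[1], weights), reverse=True)
      for name, metrics in ranking:
          print(f"{name}: {score(metrics, weights):.2f}")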

  13. Designing Network-based Business Model Ontology

    Hashemi Nekoo, Ali Reza; Ashourizadeh, Shayegheh; Zarei, Behrouz

    2015-01-01

    Survival in a dynamic environment is not achieved without a map. Scanning and monitoring of the market show business models to be a fruitful tool. But scholars believe that old-fashioned business models are dead, as they do not include the effects of the internet and networks. This paper proposes an e-business model ontology from the network point of view and its application in the real world. The suggested ontology for network-based businesses is composed of individuals' characteristics and the kinds of resources they own, as well as their connections and pre-conceptions of connections, such as shared mental models and trust. However, it mostly covers previous business model elements. To confirm the applicability of this ontology, it has been implemented in a business angel network to show how it works.

  14. Player-Specific Conflict Handling Ontology

    Charline Hondrou

    2014-09-01

    Full Text Available This paper presents an ontology that leads the player of a serious game - regarding conflict handling - to the educative experience from which they will benefit the most. It provides a clearly defined tree of axioms that maps the player’s visually manifested affective cues and emotional stimuli from the serious game to conflict handling styles and proposes interventions. The importance of this ontology lies in the fact that it promotes natural interaction (non-invasive methods) and at the same time makes the game as player-specific as it can be for its educational goal. It is an ontology that can be adapted to different educational theories and serve various educational purposes.

  15. The Cognitive Paradigm Ontology: Design and Application

    Laird, Angela R.

    2013-01-01

    We present the basic structure of the Cognitive Paradigm Ontology (CogPO) for human behavioral experiments. While the experimental psychology and cognitive neuroscience literature may refer to certain behavioral tasks by name (e.g., the Stroop paradigm or the Sternberg paradigm) or by function (a working memory task, a visual attention task), these paradigms can vary tremendously in the stimuli that are presented to the subject, the response expected from the subject, and the instructions given to the subject. Drawing from the taxonomy developed and used by the BrainMap project (www.brainmap.org) for almost two decades to describe key components of published functional imaging results, we have developed an ontology capable of representing certain characteristics of the cognitive paradigms used in the fMRI and PET literature. The Cognitive Paradigm Ontology is being developed to be compliant with the Basic Formal Ontology (BFO), and to harmonize where possible with larger ontologies such as RadLex, NeuroLex, or the Ontology of Biomedical Investigations (OBI). The key components of CogPO include the representation of experimental conditions focused on the stimuli presented, the instructions given, and the responses requested. The use of alternate and even competitive terminologies can often impede scientific discoveries. Categorization of paradigms according to stimulus, response, and instruction has been shown to allow advanced data retrieval techniques by searching for similarities and contrasts across multiple paradigm levels. The goal of CogPO is to develop, evaluate, and distribute a domain ontology of cognitive paradigms for application and use in the functional neuroimaging community. PMID:21643732

  16. Expert Judgement Assessment & SCENT Ontological Analysis

    NICHERSU Iulian

    2018-05-01

    Full Text Available This study aims to provide insights into the starting point of the Horizon 2020 EC-funded project SCENT (Smart Toolbox for Engaging Citizens into a People-Centric Observation Web) Citizen Observatory (CO) in terms of existing infrastructure, existing monitoring systems and some discussion of the existing legal and administrative framework that relates to flood monitoring and management in the area of the Danube Delta. The methodology used in this approach is based on expert judgement and ontological analysis, using the information collected from the identified end-users of the SCENT toolbox. In this type of analysis, the stages of flood monitoring and management that the experts are involved in are detailed. This is done through an Expert Judgement Assessment analysis. The latter is complemented by a set of Key Performance Indicators that the stakeholders have assessed and/or proposed for the evaluation of the SCENT demonstrations, for the impact of the project and finally for SCENT toolbox performance and usefulness. The second part of the study presents an analysis that attempts to map the interactions between different organizations and components of the existing monitoring systems in the Danube Delta case study. Expert Judgement (EJ) allows information to be gained from specialists in a specific field through a consultation process with one or more experts that have experience in similar and complementary topics. Expert judgment, expert estimates, or expert opinion are all terms that refer to the contents of the problem; estimates, outcomes, predictions, uncertainties, and their corresponding assumptions and conditions are all examples of expert judgment. Expert Judgement is affected by the process used to gather it. On the other hand, the ontological analysis complements this study by organizing and presenting the connections behind the flood management and land use systems in the three phases of a flood event.

  17. An ontology for major histocompatibility restriction.

    Vita, Randi; Overton, James A; Seymour, Emily; Sidney, John; Kaufman, Jim; Tallmadge, Rebecca L; Ellis, Shirley; Hammond, John; Butcher, Geoff W; Sette, Alessandro; Peters, Bjoern

    2016-01-01

    MHC molecules are a highly diverse family of proteins that play a key role in cellular immune recognition. Over time, different techniques and terminologies have been developed to identify the specific type(s) of MHC molecule involved in a specific immune recognition context. No consistent nomenclature exists across different vertebrate species. To correctly represent MHC related data in The Immune Epitope Database (IEDB), we built upon a previously established MHC ontology and created an ontology to represent MHC molecules as they relate to immunological experiments. This ontology models MHC protein chains from 16 species, deals with different approaches used to identify MHC, such as direct sequencing versus serotyping, relates engineered MHC molecules to naturally occurring ones, connects genetic loci, alleles, protein chains and multi-chain proteins, and establishes evidence codes for MHC restriction. Where available, this work is based on existing ontologies from the OBO Foundry. Overall, representing MHC molecules provides a challenging and practically important test case for ontology building, and could serve as an example of how to integrate other ontology building efforts into web resources.

  18. Ontology of fractures

    Zhong, Jian; Aydina, Atilla; McGuinness, Deborah L.

    2009-03-01

    Fractures are fundamental structures in the Earth's crust and they can impact many societal and industrial activities including oil and gas exploration and production, aquifer management, CO2 sequestration, waste isolation, the stabilization of engineering structures, and assessing natural hazards (earthquakes, volcanoes, and landslides). Therefore, an ontology which organizes the concepts of fractures could help facilitate a sound education within, and communication among, the highly diverse professional and academic community interested in the problems cited above. We developed a process-based ontology that makes explicit specifications about fractures, their properties, and the deformation mechanisms which lead to their formation and evolution. Our ontology emphasizes the relationships among concepts such as the factors that influence the mechanism(s) responsible for the formation and evolution of specific fracture types. Our ontology is a valuable resource with potential applications in a number of fields utilizing recent advances in Information Technology, specifically for digital data and information in computers, grids, and Web services.

  19. A Method for Evaluating and Standardizing Ontologies

    Seyed, Ali Patrice

    2012-01-01

    The Open Biomedical Ontology (OBO) Foundry initiative is a collaborative effort for developing interoperable, science-based ontologies. The Basic Formal Ontology (BFO) serves as the upper ontology for the domain-level ontologies of OBO. BFO is an upper ontology of types as conceived by defenders of realism. Among the ontologies developed for OBO…

  20. Application of Pareto optimization method for ontology matching in nuclear reactor domain

    Meenachi, N. Madurai; Baba, M. Sai

    2017-01-01

    This article describes the need for ontology matching and the methods to achieve it. The implementation of a semantic web based knowledge management system for the nuclear domain necessitated the use of methods for ontology matching. In order to exchange information in a distributed environment, ontology mapping has been used. The constraints in matching ontologies are also discussed. A Pareto based ontology matching algorithm is used to find the similarity between two ontologies in the nuclear reactor domain. Algorithms like the Jaro-Winkler distance, the Needleman-Wunsch algorithm, Bigram, Kullback and Cosine divergence are employed to demonstrate ontology matching. A case study was carried out to analyse ontology matching on diversity in the nuclear reactor domain, and the same is illustrated.
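
    One of the simpler measures named above, a bigram overlap, can be written as a Dice coefficient in a few lines. The reactor-domain labels are hypothetical; the other measures (Jaro-Winkler, Needleman-Wunsch, the divergences) are not reproduced here.

      # Bigram (Dice coefficient) similarity between two concept labels; labels are
      # hypothetical examples from a reactor-domain vocabulary.

      def bigrams(s: str):
          s = s.lower()
          return {s[i:i + 2] for i in range(len(s) - 1)}

      def dice_similarity(a: str, b: str) -> float:
          ba, bb = bigrams(a), bigrams(b)
          if not ba or not bb:
              return 0.0
          return 2 * len(ba & bb) / (len(ba) + len(bb))

      print(round(dice_similarity("reactor core", "core of the reactor"), 2))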

  1. Application of Pareto optimization method for ontology matching in nuclear reactor domain

    Meenachi, N. Madurai [Indira Gandhi Centre for Atomic Research, HBNI, Tamil Nadu (India). Planning and Human Resource Management Div.; Baba, M. Sai [Indira Gandhi Centre for Atomic Research, HBNI, Tamil Nadu (India). Resources Management Group

    2017-12-15

    This article describes the need for ontology matching and the methods to achieve it. The implementation of a semantic web based knowledge management system for the nuclear domain necessitated the use of methods for ontology matching. In order to exchange information in a distributed environment, ontology mapping has been used. The constraints in matching ontologies are also discussed. A Pareto based ontology matching algorithm is used to find the similarity between two ontologies in the nuclear reactor domain. Algorithms like the Jaro-Winkler distance, the Needleman-Wunsch algorithm, Bigram, Kullback and Cosine divergence are employed to demonstrate ontology matching. A case study was carried out to analyse ontology matching on diversity in the nuclear reactor domain, and the same is illustrated.

  2. DMTO: a realistic ontology for standard diabetes mellitus treatment.

    El-Sappagh, Shaker; Kwak, Daehan; Ali, Farman; Kwak, Kyung-Sup

    2018-02-06

    Treatment of type 2 diabetes mellitus (T2DM) is a complex problem. A clinical decision support system (CDSS) based on massive and distributed electronic health record data can facilitate the automation of this process and enhance its accuracy. The most important component of any CDSS is its knowledge base. This knowledge base can be formulated using ontologies. The formal description logic of ontology supports the inference of hidden knowledge. Building a complete, coherent, consistent, interoperable, and sharable ontology is a challenge. This paper introduces the first version of the newly constructed Diabetes Mellitus Treatment Ontology (DMTO) as a basis for shared-semantics, domain-specific, standard, machine-readable, and interoperable knowledge relevant to T2DM treatment. It is a comprehensive ontology and provides the highest coverage and the most complete picture of coded knowledge about T2DM patients' current conditions, previous profiles, and T2DM-related aspects, including complications, symptoms, lab tests, interactions, treatment plan (TP) frameworks, and glucose-related diseases and medications. It adheres to the design principles recommended by the Open Biomedical Ontologies Foundry and is based on ontological realism that follows the principles of the Basic Formal Ontology and the Ontology for General Medical Science. DMTO is implemented under Protégé 5.0 in Web Ontology Language (OWL) 2 format and is publicly available through the National Center for Biomedical Ontology's BioPortal at http://bioportal.bioontology.org/ontologies/DMTO . The current version of DMTO includes more than 10,700 classes, 277 relations, 39,425 annotations, 214 semantic rules, and 62,974 axioms. We provide proof of concept for this approach to modeling TPs. The ontology is able to collect and analyze most features of T2DM as well as customize chronic TPs with the most appropriate drugs, foods, and physical exercises. DMTO is ready to be used as a knowledge base for

  3. The Electronic Notebook Ontology

    Chalk, Stuart

    2016-01-01

    Science is rapidly being brought into the electronic realm and electronic laboratory notebooks (ELN) are a big part of this activity. The representation of the scientific process in the context of an ELN is an important component of making the data recorded in ELNs semantically integrated. This presentation will outline initial developments of an Electronic Notebook Ontology (ENO) that will help tie together the ExptML ontology, HCLS Community Profile data descriptions, and the VIVO-ISF ontology.

  4. Mapping radon-prone areas - a geophysical approach

    Shirav, M. [Geological Survey of Israel, Jerusalem (Israel)]; Vulkan, U. [Soreq Nuclear Research Center, Yavne (Israel)]

    1997-06-01

    Radon-prone areas in Israel were mapped on the basis of direct measurements of radon (²²²Rn) in the soil/rock gas of all exposed geological units, supported by the accumulated knowledge of local stratigraphy and sub-surface geology. Measurements were carried out by a modified alpha-track detection system, resulting in high radon levels mainly in rocks of the Senonian-Paleocene-aged Mount Scopus Group, comprised of chert-bearing marly chalks, rich in phosphorite which acts as the major uranium source. Issues of source depth, seasonal variations and comparison with indoor radon levels are addressed as well. This approach could be applied to other similar terrains, especially the Mediterranean Phosphate Belt. (orig.)

  5. Mapping radon-prone areas - a geophysical approach

    Shirav, M.; Vulkan, U.

    1997-01-01

    Radon-prone areas in Israel were mapped on the basis of direct measurements of radon (²²²Rn) in the soil/rock gas of all exposed geological units, supported by the accumulated knowledge of local stratigraphy and sub-surface geology. Measurements were carried out by a modified alpha-track detection system, resulting in high radon levels mainly in rocks of the Senonian-Paleocene-aged Mount Scopus Group, comprised of chert-bearing marly chalks, rich in phosphorite which acts as the major uranium source. Issues of source depth, seasonal variations and comparison with indoor radon levels are addressed as well. This approach could be applied to other similar terrains, especially the Mediterranean Phosphate Belt. (orig.)

  6. Local Relation Map: A Novel Illumination Invariant Face Recognition Approach

    Lian Zhichao

    2012-10-01

    Full Text Available In this paper, a novel illumination invariant face recognition approach is proposed. Different from most existing methods, an additive noise term is considered in the face model under varying illuminations in addition to a multiplicative illumination term. High frequency coefficients of the Discrete Cosine Transform (DCT) are discarded to eliminate the effect caused by noise. Based on the local characteristics of the human face, a simple but effective illumination invariant feature, the local relation map, is proposed. Experimental results on the Yale B, Extended Yale B and CMU PIE databases demonstrate the superior performance and lower computational burden of the proposed method compared to other existing methods. The results also demonstrate the validity of the proposed face model and the assumption on noise.

  7. Mapping site-based construction workers’ motivation: Expectancy theory approach

    Parviz Ghoddousi

    2014-03-01

    Full Text Available The aim of this study is to apply a recently proposed model of motivation based on expectancy theory to site-based workers in the construction context and confirm the validity of this model for the construction industry. The study drew upon data from 194 site-based construction workers in Iran to test the proposed model of motivation. To this end, the structural equation modelling (SEM) approach based on the confirmatory factor analysis (CFA) technique was deployed. The study reveals that the proposed model of expectancy theory incorporating five indicators (i.e. intrinsic instrumentality, extrinsic instrumentality, intrinsic valence, extrinsic valence and expectancy) is able to map the process of construction workers’ motivation. Nonetheless, the findings posit that intrinsic indicators could be more effective than extrinsic ones. This suggests that construction managers need to place further focus on intrinsic motivators to motivate workers.

  8. Mapping site-based construction workers’ motivation: Expectancy theory approach

    Parviz Ghoddousi

    2014-03-01

    Full Text Available The aim of this study is to apply a recently proposed model of motivation based on expectancy theory to site-based workers in the construction context and confirm the validity of this model for the construction industry. The study drew upon data from 194 site-based construction workers in Iran to test the proposed model of motivation. To this end, the structural equation modelling (SEM) approach based on the confirmatory factor analysis (CFA) technique was deployed. The study reveals that the proposed model of expectancy theory incorporating five indicators (i.e. intrinsic instrumentality, extrinsic instrumentality, intrinsic valence, extrinsic valence and expectancy) is able to map the process of construction workers’ motivation. Nonetheless, the findings posit that intrinsic indicators could be more effective than extrinsic ones. This suggests that construction managers need to place further focus on intrinsic motivators to motivate workers.

  9. Nuclear component design ontology building based on ASME codes

    Bao Shiyi; Zhou Yu; He Shuyan

    2005-01-01

    The adoption of ontology analysis in the study of concept knowledge acquisition and representation for the nuclear component design process based on computer-supported cooperative work (CSCW) makes it possible to share and reuse concept knowledge from multiple disciplinary domains. A practical ontology building method is accordingly proposed, based on the Protege knowledge model in combination with both top-down and bottom-up approaches together with Formal Concept Analysis (FCA). FCA exhibits its advantages in the way it helps establish and improve the taxonomic hierarchy of concepts and resolve concept conflicts that occur in modeling multi-disciplinary domains. With Protege-3.0 as the ontology building tool, a nuclear component design ontology based on ASME codes is developed by utilizing the proposed ontology building method. The ontology serves as the basis for realizing concept knowledge sharing and reuse in nuclear component design. (authors)
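    Formal Concept Analysis, used above to establish and repair the taxonomic hierarchy, derives concepts as maximal object-attribute rectangles of a binary context. The sketch below computes all formal concepts of a tiny, invented design context by closing every attribute subset; it is a naive illustration suited to small contexts only, not the Protege-based workflow of the paper.

      from itertools import combinations

      # Invented binary context: design objects versus ASME-style attributes.
      context = {
          "pressure_vessel": {"pressure_boundary", "class_1", "welded"},
          "piping":          {"pressure_boundary", "class_1"},
          "support":         {"class_1", "welded"},
      }
      attributes = set().union(*context.values())

      def extent(intent_set):
          # Objects that have every attribute of the given intent.
          return {o for o, attrs in context.items() if intent_set <= attrs}

      def intent(objects):
          # Attributes shared by all objects of the given extent.
          return set.intersection(*(context[o] for o in objects)) if objects else set(attributes)

      # Naive enumeration: close every attribute subset (fine for tiny contexts only).
      concepts = set()
      for r in range(len(attributes) + 1):
          for subset in combinations(sorted(attributes), r):
              ext = extent(set(subset))
              concepts.add((frozenset(ext), frozenset(intent(ext))))

      for ext, itt in sorted(concepts, key=lambda c: len(c[0])):
          print(sorted(ext), "<->", sorted(itt))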

  10. An Ontology for Modeling Complex Inter-relational Organizations

    Wautelet, Yves; Neysen, Nicolas; Kolp, Manuel

    This paper presents an ontology for organizational modeling through multiple complementary aspects. The primary goal of the ontology is to provide an adequate set of related concepts for studying complex organizations involved in many relationships at the same time. In this paper, we define complex organizations as networked organizations involved in a market ecosystem that are playing several roles simultaneously. In such a context, traditional approaches focus on the macro analytic level of transactions; this is supplemented here with a micro analytic study of the actors' rationale. At first, the paper overviews the enterprise ontology literature to position our proposal and exposes its contributions and limitations. The ontology is then brought to an advanced level of formalization: a meta-model in the form of a UML class diagram gives an overview of the ontology concepts and their relationships, which are formally defined. Finally, the paper presents the case study on which the ontology has been validated.

  11. Ontology matters: a commentary on contribution to cultural historical activity

    Martin, Jenny

    2017-10-01

    This commentary promotes discussion on the imaginary provided by Sanaz Farhangi in her article entitled, Contribution to activity: a lens for understanding students' potential and agency in physics education. The commentary is concerned with aligning ontological assumptions in research accounts of learning and development with transformative aims. A broad definition of ontology as the theory of existence is preferred. Sociocultural approaches share relational ontology as a common foundation. I agree with scholars elaborating Vygotsky's Transformative Activist Stance that a relational ontology does not imply activism. However, I argue that relational ontology provides a necessary and sufficient theoretical grounding for intentional transformation. I draw upon positioning theory to elaborate the moral aspects of language use and to illustrate that a theory of being as relational already eliminates the transcendental position. I draw on Farhangi's article to further the discussion on the necessity and sufficiency of relational ontology and associated grammars in accounting for activism.

  12. Using ontology network structure in text mining.

    Berndt, Donald J; McCart, James A; Luther, Stephen L

    2010-11-13

    Statistical text mining treats documents as bags of words, with a focus on term frequencies within documents and across document collections. Unlike natural language processing (NLP) techniques that rely on an engineered vocabulary or a full-featured ontology, statistical approaches do not make use of domain-specific knowledge. The freedom from biases can be an advantage, but at the cost of ignoring potentially valuable knowledge. The approach proposed here investigates a hybrid strategy based on computing graph measures of term importance over an entire ontology and injecting the measures into the statistical text mining process. As a starting point, we adapt existing search engine algorithms such as PageRank and HITS to determine term importance within an ontology graph. The graph-theoretic approach is evaluated using a smoking data set from the i2b2 National Center for Biomedical Computing, cast as a simple binary classification task for categorizing smoking-related documents, demonstrating consistent improvements in accuracy.
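    As a rough sketch of the hybrid strategy described above, the fragment below builds a small directed graph for a hypothetical ontology fragment with networkx, scores terms with PageRank, and uses those scores to weight plain term frequencies; it is illustrative only and does not reproduce the i2b2 smoking-classification experiment.

      from collections import Counter
      import networkx as nx

      # Hypothetical is-a edges from a tiny smoking-related ontology fragment.
      edges = [
          ("nicotine_dependence", "substance_dependence"),
          ("substance_dependence", "mental_disorder"),
          ("smoking", "tobacco_use"),
          ("nicotine_dependence", "tobacco_use"),
          ("tobacco_use", "behaviour"),
      ]
      ontology = nx.DiGraph(edges)

      # Graph-based term importance; nx.hits could be swapped in the same way.
      pagerank = nx.pagerank(ontology, alpha=0.85)
      top = max(pagerank.values())
      boost = {term: 1.0 + score / top for term, score in pagerank.items()}

      def weighted_term_vector(tokens):
          # Inject the ontology-derived weights into a plain bag-of-words vector.
          counts = Counter(tokens)
          return {term: freq * boost.get(term, 1.0) for term, freq in counts.items()}

      document = ["patient", "denies", "smoking", "smoking", "nicotine_dependence"]
      print(weighted_term_vector(document))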

  13. A multi-model ensemble approach to seabed mapping

    Diesing, Markus; Stephens, David

    2015-06-01

    Seabed habitat mapping based on swath acoustic data and ground-truth samples is an emergent and active marine science discipline. Significant progress could be achieved by transferring techniques and approaches that have been successfully developed and employed in such fields as terrestrial land cover mapping. One such promising approach is the multiple classifier system, which aims at improving classification performance by combining the outputs of several classifiers. Here we present results of a multi-model ensemble applied to multibeam acoustic data covering more than 5000 km2 of seabed in the North Sea with the aim to derive accurate spatial predictions of seabed substrate. A suite of six machine learning classifiers (k-Nearest Neighbour, Support Vector Machine, Classification Tree, Random Forest, Neural Network and Naïve Bayes) was trained with ground-truth sample data classified into seabed substrate classes and their prediction accuracy was assessed with an independent set of samples. The three and five best performing models were combined to classifier ensembles. Both ensembles led to increased prediction accuracy as compared to the best performing single classifier. The improvements were however not statistically significant at the 5% level. Although the three-model ensemble did not perform significantly better than its individual component models, we noticed that the five-model ensemble did perform significantly better than three of the five component models. A classifier ensemble might therefore be an effective strategy to improve classification performance. Another advantage is the fact that the agreement in predicted substrate class between the individual models of the ensemble could be used as a measure of confidence. We propose a simple and spatially explicit measure of confidence that is based on model agreement and prediction accuracy.
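    A minimal scikit-learn sketch of such a multi-model ensemble is given below, assuming synthetic stand-ins for the acoustic predictors and substrate classes; the six member classifiers match those named in the abstract, they are combined by majority voting, and per-sample agreement among members serves as a simple confidence proxy.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier, VotingClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.naive_bayes import GaussianNB
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.neural_network import MLPClassifier
      from sklearn.svm import SVC
      from sklearn.tree import DecisionTreeClassifier

      # Synthetic stand-in for multibeam-derived predictors and substrate classes.
      X, y = make_classification(n_samples=500, n_features=6, n_informative=4,
                                 n_classes=3, n_clusters_per_class=1, random_state=0)
      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

      members = [
          ("knn", KNeighborsClassifier()),
          ("svm", SVC(random_state=0)),
          ("tree", DecisionTreeClassifier(random_state=0)),
          ("rf", RandomForestClassifier(random_state=0)),
          ("mlp", MLPClassifier(max_iter=1000, random_state=0)),
          ("nb", GaussianNB()),
      ]
      ensemble = VotingClassifier(estimators=members, voting="hard").fit(X_train, y_train)
      print("ensemble accuracy:", ensemble.score(X_test, y_test))

      # Agreement among member predictions as a crude per-sample confidence measure.
      votes = np.array([m.fit(X_train, y_train).predict(X_test) for _, m in members])
      agreement = (votes == ensemble.predict(X_test)).mean(axis=0)
      print("mean agreement:", agreement.mean())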

  14. Ontology Update in the Cognitive Model of Ontology Learning

    Zhang De-Hai

    2016-01-01

    Full Text Available Ontology has been used in many hot-spot fields, but most ontology construction methods are semiautomatic, and the construction of an ontology is still a tedious and painstaking task. In this paper, a cognitive model is presented for ontology learning which can simulate how human beings learn from the world. In this model, the cognitive strategies are applied together with the constraint axioms. Ontology update is a key step when new knowledge is added into the existing ontology and conflicts with old knowledge in the process of ontology learning. This proposal designs and validates a method of ontology update based on the axiomatic cognitive model, which includes the ontology update postulates, axioms and operations of the learning model. It is proved that these operators are subject to the established axiom system.

  15. Building an Ontology of Tablewares using 'Legacy Data'

    Daniël van Helden

    2018-05-01

    Full Text Available This article aims to demonstrate how an ontology can be constructed to encompass many of the criteria needed for more consumption-orientated approaches to Roman tablewares. For this it demonstrates how a dataset in a relational database can be organised for the format and capabilities of an ontology, and then how these data are input into the ontology model. Finally it includes some sample analyses to show the effectiveness of such an ontology for types of analyses that are relevant to this network.

  16. Map Archive Mining: Visual-Analytical Approaches to Explore Large Historical Map Collections

    Johannes H. Uhl

    2018-04-01

    Full Text Available Historical maps are unique sources of retrospective geographical information. Recently, several map archives containing map series covering large spatial and temporal extents have been systematically scanned and made available to the public. The geographical information contained in such data archives makes it possible to extend geospatial analysis retrospectively beyond the era of digital cartography. However, given the large data volumes of such archives (e.g., more than 200,000 map sheets in the United States Geological Survey topographic map archive) and the low graphical quality of older, manually-produced map sheets, the process to extract geographical information from these map archives needs to be automated to the highest degree possible. To understand the potential challenges (e.g., salient map characteristics and data quality variations) in automating large-scale information extraction tasks for map archives, it is useful to efficiently assess spatio-temporal coverage, approximate map content, and spatial accuracy of georeferenced map sheets at different map scales. Such preliminary analytical steps are often neglected or ignored in the map processing literature but represent critical phases that lay the foundation for any subsequent computational processes including recognition. Exemplified for the United States Geological Survey topographic map and the Sanborn fire insurance map archives, we demonstrate how such preliminary analyses can be systematically conducted using traditional analytical and cartographic techniques, as well as visual-analytical data mining tools originating from machine learning and data science.

  17. An Optimization Approach to Improving Collections of Shape Maps

    Nguyen, Andy; Ben‐Chen, Mirela; Welnicka, Katarzyna

    2011-01-01

    pairwise map independently does not take full advantage of all existing information. For example, a notorious problem with computing shape maps is the ambiguity introduced by the symmetry problem — for two similar shapes which have reflectional symmetry there exist two maps which are equally favorable...... shape maps connecting our collection, we propose to add the constraint of global map consistency, requiring that any composition of maps between two shapes should be independent of the path chosen in the network. This requirement can help us choose among the equally good symmetric alternatives, or help...

  18. Interestingness measures and strategies for mining multi-ontology multi-level association rules from gene ontology annotations for the discovery of new GO relationships.

    Manda, Prashanti; McCarthy, Fiona; Bridges, Susan M

    2013-10-01

    The Gene Ontology (GO), a set of three sub-ontologies, is one of the most popular bio-ontologies used for describing gene product characteristics. GO annotation data containing terms from multiple sub-ontologies and at different levels in the ontologies is an important source of implicit relationships between terms from the three sub-ontologies. Data mining techniques such as association rule mining that are tailored to mine from multiple ontologies at multiple levels of abstraction are required for effective knowledge discovery from GO annotation data. We present a data mining approach, Multi-ontology data mining at All Levels (MOAL) that uses the structure and relationships of the GO to mine multi-ontology multi-level association rules. We introduce two interestingness measures: Multi-ontology Support (MOSupport) and Multi-ontology Confidence (MOConfidence) customized to evaluate multi-ontology multi-level association rules. We also describe a variety of post-processing strategies for pruning uninteresting rules. We use publicly available GO annotation data to demonstrate our methods with respect to two applications (1) the discovery of co-annotation suggestions and (2) the discovery of new cross-ontology relationships. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.

  19. A Probabilistic Approach for Improved Sequence Mapping in Metatranscriptomic Studies

    Mapping millions of short DNA sequences to a reference genome is a necessary step in many experiments designed to investigate the expression of genes involved in disease resistance. This is a difficult task in which several challenges often arise, resulting in a suboptimal mapping. This mapping process ...

  20. Comparing alternative data-driven ontological vistas of natural history

    van Erp, M.G.J.; Lendvai, P.K.; van den Bosch, A.; Bunt, H.; Petukhova, V.; Wubben, S.

    2009-01-01

    Traditionally, ontologies are created manually, based on human experts' view of the concepts and relations of the domain at hand. We present ongoing work on two approaches to the automatic construction of ontologies from a flat database of records, and compare them to a manually constructed ontology.

  1. The Ontological Politics of Evidence and Policy Enablement

    Carusi, F. Tony; Rawlins, Peter; Ashton, Karen

    2018-01-01

    Ontological politics has received increasing attention within education policy studies, particularly as a support for the notion of policy enactment. While policy enactment offers serious challenges to traditional approaches toward policy implementation, this paper takes up ontological politics as a concept that extends beyond implementation and…

  2. Mining rare associations between biological ontologies.

    Benites, Fernando; Simon, Svenja; Sapozhnikova, Elena

    2014-01-01

    The constantly increasing volume and complexity of available biological data requires new methods for their management and analysis. An important challenge is the integration of information from different sources in order to discover possible hidden relations between already known data. In this paper we introduce a data mining approach which relates biological ontologies by mining cross and intra-ontology pairwise generalized association rules. Its advantage is sensitivity to rare associations, for these are important for biologists. We propose a new class of interestingness measures designed for hierarchically organized rules. These measures allow one to select the most important rules and to take into account rare cases. They favor rules with an actual interestingness value that exceeds the expected value. The latter is calculated taking into account the parent rule. We demonstrate this approach by applying it to the analysis of data from Gene Ontology and GPCR databases. Our objective is to discover interesting relations between two different ontologies or parts of a single ontology. The association rules that are thus discovered can provide the user with new knowledge about underlying biological processes or help improve annotation consistency. The obtained results show that produced rules represent meaningful and quite reliable associations.

  3. Mining rare associations between biological ontologies.

    Fernando Benites

    Full Text Available The constantly increasing volume and complexity of available biological data requires new methods for their management and analysis. An important challenge is the integration of information from different sources in order to discover possible hidden relations between already known data. In this paper we introduce a data mining approach which relates biological ontologies by mining cross and intra-ontology pairwise generalized association rules. Its advantage is sensitivity to rare associations, for these are important for biologists. We propose a new class of interestingness measures designed for hierarchically organized rules. These measures allow one to select the most important rules and to take into account rare cases. They favor rules with an actual interestingness value that exceeds the expected value. The latter is calculated taking into account the parent rule. We demonstrate this approach by applying it to the analysis of data from Gene Ontology and GPCR databases. Our objective is to discover interesting relations between two different ontologies or parts of a single ontology. The association rules that are thus discovered can provide the user with new knowledge about underlying biological processes or help improve annotation consistency. The obtained results show that produced rules represent meaningful and quite reliable associations.

  4. Ontology Design Patterns for Combining Pathology and Anatomy: Application to Study Aging and Longevity in Inbred Mouse Strains

    Alghamdi, Sarah M.

    2018-01-01

    To evaluate the generated ontologies, we utilize these in ontology-based data analysis, including ontology enrichment analysis and computation of semantic similarity. We demonstrate that there are significant differences between the four ontologies in different analysis approaches. In addition, when using semantic similarity to confirm the hypothesis that genetically identical mice should develop more similar diseases, the generated combined ontologies lead to significantly better analysis results compared to using each ontology individually. Our results reveal that using ontology design patterns to combine different facets characterizing a dataset can improve established analysis methods.

  5. Feature-opinion pair identification of product reviews in Chinese: a domain ontology modeling method

    Yin, Pei; Wang, Hongwei; Guo, Kaiqiang

    2013-03-01

    With the emergence of the new economy based on social media, a great amount of consumer feedback on particular products is conveyed through widely spreading online product reviews, making opinion mining a growing interest for both academia and industry. According to the characteristic modes of expression in Chinese, this research proposes an ontology-based linguistic model to identify the basic appraisal expression in Chinese product reviews, the "feature-opinion pair" (FOP). The product-oriented domain ontology is constructed automatically at first, then algorithms to identify FOPs are designed by mapping product features and opinions to the conceptual space of the domain ontology, and finally comparative experiments are conducted to evaluate the model. Experimental results indicate that the proposed approach obtains more accurate results than state-of-the-art algorithms. Furthermore, by identifying and analyzing FOPs, unstructured product reviews are converted into structured and machine-sensible expressions, which provides valuable information for business applications. This paper contributes to related research in opinion mining by developing a solid foundation for further sentiment analysis at a fine-grained level and proposing a general way of automatic ontology construction.

  6. Toward a general ontology for digital forensic disciplines.

    Karie, Nickson M; Venter, Hein S

    2014-09-01

    Ontologies are widely used in different disciplines as a technique for representing and reasoning about domain knowledge. However, despite the widespread ontology-related research activities and applications in different disciplines, the development of ontologies and ontology research activities is still wanting in digital forensics. This paper therefore presents the case for establishing an ontology for digital forensic disciplines. Such an ontology would enable better categorization of the digital forensic disciplines, as well as assist in the development of methodologies and specifications that can offer direction in different areas of digital forensics. This includes such areas as professional specialization, certifications, development of digital forensic tools, curricula, and educational materials. In addition, the ontology presented in this paper can be used, for example, to better organize the digital forensic domain knowledge and explicitly describe the discipline's semantics in a common way. Finally, this paper is meant to spark discussions and further research on an internationally agreed ontological distinction of the digital forensic disciplines. Digital forensic disciplines ontology is a novel approach toward organizing the digital forensic domain knowledge and constitutes the main contribution of this paper. © 2014 American Academy of Forensic Sciences.

  7. Process attributes in bio-ontologies

    Andrade André Q

    2012-08-01

    Full Text Available Abstract Background Biomedical processes can provide essential information about the (mal)functioning of an organism and are thus frequently represented in biomedical terminologies and ontologies, including the GO Biological Process branch. These processes often need to be described and categorised in terms of their attributes, such as rates or regularities. The adequate representation of such process attributes has been a contentious issue in bio-ontologies recently, and domain ontologies have correspondingly developed ad hoc workarounds that compromise interoperability and logical consistency. Results We present a design pattern for the representation of process attributes that is compatible with upper ontology frameworks such as BFO and BioTop. Our solution rests on two key tenets: firstly, that many of the sorts of process attributes which are biomedically interesting can be characterised by the ways that repeated parts of such processes constitute, in combination, an overall process; secondly, that entities for which a full logical definition can be assigned do not need to be treated as primitive within a formal ontology framework. We apply this approach to the challenge of modelling and automatically classifying examples of normal and abnormal rates and patterns of heart beating processes, and discuss the expressivity required in the underlying ontology representation language. We provide full definitions for process attributes at increasing levels of domain complexity. Conclusions We show that a logical definition of process attributes is feasible, though limited by the expressivity of DL languages so that the creation of primitives is still necessary. This finding may endorse current formal upper-ontology frameworks as a way of ensuring consistency, interoperability and clarity.

  8. An optimization approach for extracting and encoding consistent maps in a shape collection

    Huang, Qi-Xing

    2012-11-01

    We introduce a novel approach for computing high quality point-to-point maps among a collection of related shapes. The proposed approach takes as input a sparse set of imperfect initial maps between pairs of shapes and builds a compact data structure which implicitly encodes an improved set of maps between all pairs of shapes. These maps align well with point correspondences selected from initial maps; they map neighboring points to neighboring points; and they provide cycle-consistency, so that map compositions along cycles approximate the identity map. The proposed approach is motivated by the fact that a complete set of maps between all pairs of shapes that admits nearly perfect cycle-consistency is highly redundant and can be represented by compositions of maps through a single base shape. In general, multiple base shapes are needed to adequately cover a diverse collection. Our algorithm sequentially extracts such a small collection of base shapes and creates correspondences from each of these base shapes to all other shapes. These correspondences are found by global optimization on candidate correspondences obtained by diffusing initial maps. These are then used to create a compact graphical data structure from which globally optimal cycle-consistent maps can be extracted using simple graph algorithms. Experimental results on benchmark datasets show that the proposed approach yields significantly better results than state-of-the-art data-driven shape matching methods. © 2012 ACM.
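    The following toy sketch illustrates the cycle-consistency criterion used above: point-to-point maps between shapes are represented as index arrays, compositions around a cycle are compared with the identity map, and the fraction of points that return to themselves gives a simple consistency score. It illustrates the criterion only, not the optimization algorithm of the paper.

      import numpy as np

      def compose(map_ab, map_bc):
          # Point i of shape A goes to map_bc[map_ab[i]] on shape C.
          return map_bc[map_ab]

      def cycle_consistency(maps, cycle):
          # Fraction of points mapped back to themselves along a closed cycle of shapes.
          n = len(maps[(cycle[0], cycle[1])])
          composed = np.arange(n)
          for a, b in zip(cycle, cycle[1:] + cycle[:1]):
              composed = compose(composed, maps[(a, b)])
          return float(np.mean(composed == np.arange(n)))

      # Hypothetical maps among three shapes with five corresponding points each.
      rng = np.random.default_rng(0)
      perm_ab = rng.permutation(5)
      perm_bc = rng.permutation(5)
      maps = {
          ("A", "B"): perm_ab,
          ("B", "C"): perm_bc,
          ("C", "A"): np.argsort(compose(perm_ab, perm_bc)),  # inverse, so the cycle closes
      }
      print("consistency of A->B->C->A:", cycle_consistency(maps, ["A", "B", "C"]))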

  9. Mapping of multi-floor buildings: A barometric approach

    Özkil, Ali Gürcan; Fan, Zhun; Xiao, Jizhong

    2011-01-01

    This paper presents a new method for mapping multi-floor buildings. The method combines a laser range sensor for metric mapping and a barometric pressure sensor for detecting floor transitions and map segmentation. We exploit the fact that the barometric pressure is a function of the elevation, and it varies between different floors. The method is tested with a real robot in a typical indoor environment, and the results show that physically consistent multi-floor representations are achievable.
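    A very small sketch of the barometric idea, assuming a synthetic pressure log: floor transitions are detected where the pressure changes by more than a threshold between consecutive samples, and the log is segmented into floor labels accordingly. The constants are hypothetical, not those of the paper.

      import numpy as np

      def segment_floors(pressure_hpa, jump_threshold=0.35):
          # Assign a floor index to each sample; a pressure jump marks a floor transition.
          # Near sea level a 1 m climb lowers pressure by roughly 0.12 hPa, so ~0.35 hPa
          # is on the order of one storey (the threshold here is a guess, not the paper's).
          floor = 0
          labels = [floor]
          for prev, cur in zip(pressure_hpa[:-1], pressure_hpa[1:]):
              if prev - cur > jump_threshold:      # pressure dropped: moved up a floor
                  floor += 1
              elif cur - prev > jump_threshold:    # pressure rose: moved down a floor
                  floor -= 1
              labels.append(floor)
          return np.array(labels)

      # Synthetic log: ground floor, then two successive rides up one storey each.
      pressure = np.concatenate([
          np.full(50, 1013.2), np.full(50, 1012.8), np.full(50, 1012.4)
      ]) + np.random.default_rng(1).normal(0, 0.02, 150)

      print(np.unique(segment_floors(pressure)))   # three floor labels: 0, 1, 2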

  10. Ontology and medical diagnosis.

    Bertaud-Gounot, Valérie; Duvauferrier, Régis; Burgun, Anita

    2012-03-01

    Ontologies and associated generic tools are appropriate for knowledge modeling and reasoning, but most of the time, disease definitions in existing description logic (DL) ontologies are not sufficient to classify a patient's characteristics under a particular disease because they do not formalize operational definitions of diseases (associations of signs and symptoms = diagnostic criteria). The main objective of this study is to propose an ontological representation which takes into account the diagnostic criteria on which specific patient conditions may be classified under a specific disease. This method needs as a prerequisite a clear list of necessary and sufficient diagnostic criteria, as defined for many diseases by learned societies. It does not include probability/uncertainty, which the Web Ontology Language (OWL 2.0) cannot handle. We illustrate it with spondyloarthritis (SpA). The ontology was designed in Protégé 4.1 using OWL-DL 2.0. Several kinds of criteria were formalized: (1) mandatory criteria, (2) picking two criteria among several diagnostic criteria, (3) numeric criteria. Thirty real patient cases were successfully classified with the reasoner. This study shows that it is possible to represent operational definitions of diseases with OWL and successfully classify real patient cases. Representing diagnostic criteria as descriptive knowledge (instead of rules in the Semantic Web Rule Language or Prolog) allows us to take advantage of tools already available for OWL. While we focused on the Assessment of SpondyloArthritis international Society (ASAS) SpA criteria, we believe that many of the representation issues addressed here are relevant to using OWL-DL for operational definitions of other diseases in ontologies.
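    To make the three kinds of criteria concrete, here is a plain-Python sketch (not the OWL representation used in the paper) of an operational disease definition combining a mandatory criterion, an "at least two of" set and a numeric criterion; the criteria names and thresholds are invented for illustration and are not the ASAS criteria.

      def meets_operational_definition(patient):
          # (1) Mandatory criterion.
          mandatory_ok = patient.get("back_pain_months", 0) >= 3

          # (2) At least two criteria among a predefined set.
          optional_criteria = ["inflammatory_back_pain", "enthesitis", "uveitis",
                               "family_history", "good_nsaid_response"]
          optional_count = sum(bool(patient.get(c, False)) for c in optional_criteria)

          # (3) Numeric criterion, e.g. a lab value above a cut-off.
          numeric_ok = patient.get("crp_mg_per_l", 0.0) > 5.0

          return mandatory_ok and optional_count >= 2 and numeric_ok

      patient = {"back_pain_months": 6, "inflammatory_back_pain": True,
                 "uveitis": True, "crp_mg_per_l": 12.0}
      print(meets_operational_definition(patient))   # True for this toy record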

  11. Duelling Ontologies: Might Vitalism Offer Balance and Value?

    Richards, Dennis; Emmanuel, Elizabeth; Grace, Sandra

    This article is part of a project investigating chiropractors' beliefs on the role of vitalism in their philosophical and practice approaches and how that might contribute to addressing current epidemics of non-communicable diseases. It aims to present atomism, reductionism, materialism and mechanism as fundamental ontologies in biomedicine and to examine what role these might play in its struggle to deal with these epidemics; to present vitalism as a fundamental ontology existing in chiropractic along with these ontologies of biomedicine; and to discuss how imbalances in the use of these ontologies and practices stemming from them might be contributing to difficulties in addressing these epidemics. The use of more balanced approaches by chiropractors involving not only mechanistic biomedical ontologies but also an increased focus on vitalism might offer value in addressing these epidemics and should be investigated. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. From Patient Discharge Summaries to an Ontology for Psychiatry.

    Richard, Marion; Aimé, Xavier; Jaulent, Marie-Christine; Krebs, Marie-Odile; Charlet, Jean

    2017-01-01

    Psychiatry aims at detecting symptoms, providing diagnoses and treating mental disorders. We developed ONTOPSYCHIA, an ontology for psychiatry in three modules: social and environmental factors of mental disorders, mental disorders, and treatments. The use of ONTOPSYCHIA, associated with dedicated tools, will facilitate semantic research in Patient Discharge Summaries (PDS). To develop the first module of the ontology we propose a PDS text analysis in order to make psychiatry concepts explicit. We decided to set aside classifications during the construction of the module, to focus only on the information contained in PDS (bottom-up approach) and to return to domain classifications solely for the enrichment phase (top-down approach). Then, we focused our work on the development of the LOVMI methodology (Les Ontologies Validées par Méthode Interactive - Ontologies Validated by Interactive Method), which aims to provide a methodological framework to validate the structure and the semantics of an ontology.

  13. Core Semantics for Public Ontologies

    Suni, Niranjan

    2005-01-01

    ... (schemas or ontologies) with respect to objects. The DARPA Agent Markup Language (DAML) through the use of ontologies provides a very powerful way to describe objects and their relationships to other objects...

  14. ONTOLOGY IN PHARMACY

    L. Yu. Babintseva

    2015-05-01

    Full Text Available Ontological models for the formalization of knowledge in pharmacy are considered. It is emphasized that, to enable rapid exchange of information in the pharmaceutical industry, it is necessary to create a single information space. This means not only the establishment of uniform standards for the presentation of information on pharmaceutical groups and pharmacotherapeutic classifications, but also the creation of a unified and standardized system for the transfer and renewal of knowledge. Organizing information in an ontology makes it possible to quickly build expert systems and applications that work with the data.

  15. A Hierarchical and Distributed Approach for Mapping Large Applications to Heterogeneous Grids using Genetic Algorithms

    Sanyal, Soumya; Jain, Amit; Das, Sajal K.; Biswas, Rupak

    2003-01-01

    In this paper, we propose a distributed approach for mapping a single large application to a heterogeneous grid environment. To minimize the execution time of the parallel application, we distribute the mapping overhead to the available nodes of the grid. This approach not only provides a fast mapping of tasks to resources but is also scalable. We adopt a hierarchical grid model and accomplish the job of mapping tasks to this topology using a scheduler tree. Results show that our three-phase algorithm provides high quality mappings, and is fast and scalable.
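    The sketch below conveys the flavour of a genetic-algorithm task-to-resource mapping under invented assumptions (random task costs, heterogeneous node speeds, makespan as the fitness to minimize); it is a flat, single-population toy rather than the hierarchical, scheduler-tree version described above.

      import random

      random.seed(0)
      N_TASKS, N_NODES = 40, 6
      task_cost = [random.uniform(1, 10) for _ in range(N_TASKS)]      # work per task
      node_speed = [random.uniform(0.5, 2.0) for _ in range(N_NODES)]  # heterogeneous nodes

      def makespan(mapping):
          # Finish time of the most loaded node under a task-to-node assignment.
          load = [0.0] * N_NODES
          for task, node in enumerate(mapping):
              load[node] += task_cost[task] / node_speed[node]
          return max(load)

      def crossover(a, b):
          cut = random.randrange(1, N_TASKS)
          return a[:cut] + b[cut:]

      def mutate(mapping, rate=0.05):
          return [random.randrange(N_NODES) if random.random() < rate else n for n in mapping]

      population = [[random.randrange(N_NODES) for _ in range(N_TASKS)] for _ in range(60)]
      for generation in range(200):
          population.sort(key=makespan)
          parents = population[:20]                 # simple truncation selection
          children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                      for _ in range(40)]
          population = parents + children

      print("best makespan:", round(makespan(min(population, key=makespan)), 2))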

  16. Integrating systems biology models and biomedical ontologies.

    Hoehndorf, Robert; Dumontier, Michel; Gennari, John H; Wimalaratne, Sarala; de Bono, Bernard; Cook, Daniel L; Gkoutos, Georgios V

    2011-08-11

    Systems biology is an approach to biology that emphasizes the structure and dynamic behavior of biological systems and the interactions that occur within them. To succeed, systems biology crucially depends on the accessibility and integration of data across domains and levels of granularity. Biomedical ontologies were developed to facilitate such an integration of data and are often used to annotate biosimulation models in systems biology. We provide a framework to integrate representations of in silico systems biology with those of in vivo biology as described by biomedical ontologies and demonstrate this framework using the Systems Biology Markup Language. We developed the SBML Harvester software that automatically converts annotated SBML models into OWL and we apply our software to those biosimulation models that are contained in the BioModels Database. We utilize the resulting knowledge base for complex biological queries that can bridge levels of granularity, verify models based on the biological phenomenon they represent and provide a means to establish a basic qualitative layer on which to express the semantics of biosimulation models. We establish an information flow between biomedical ontologies and biosimulation models and we demonstrate that the integration of annotated biosimulation models and biomedical ontologies enables the verification of models as well as expressive queries. Establishing a bi-directional information flow between systems biology and biomedical ontologies has the potential to enable large-scale analyses of biological systems that span levels of granularity from molecules to organisms.

  17. Validating EHR clinical models using ontology patterns.

    Martínez-Costa, Catalina; Schulz, Stefan

    2017-12-01

    Clinical models are artefacts that specify how information is structured in electronic health records (EHRs). However, the makeup of clinical models is not guided by any formal constraint beyond a semantically vague information model. We address this gap by advocating ontology design patterns as a mechanism that makes the semantics of clinical models explicit. This paper demonstrates how ontology design patterns can validate existing clinical models using SHACL. Based on the Clinical Information Modelling Initiative (CIMI), we show how ontology patterns detect both modeling and terminology binding errors in CIMI models. SHACL, a W3C constraint language for the validation of RDF graphs, builds on the concept of "Shape", a description of data in terms of expected cardinalities, datatypes and other restrictions. SHACL, as opposed to OWL, subscribes to the Closed World Assumption (CWA) and is therefore more suitable for the validation of clinical models. We have demonstrated the feasibility of the approach by manually describing the correspondences between six CIMI clinical models represented in RDF and two SHACL ontology design patterns. Using a Java-based SHACL implementation, we found at least eleven modeling and binding errors within these CIMI models. This demonstrates the usefulness of ontology design patterns not only as a modeling tool but also as a tool for validation. Copyright © 2017 Elsevier Inc. All rights reserved.
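    As a small, self-contained analogue of that validation workflow, the snippet below uses rdflib and pySHACL to check a toy RDF instance against a shape requiring exactly one coded value of a given datatype; the vocabulary and the shape are invented and stand in for the CIMI models and ontology patterns discussed in the abstract.

      from rdflib import Graph
      from pyshacl import validate

      # Toy clinical-model instance: an observation that is missing its mandatory code.
      data_ttl = """
      @prefix ex: <http://example.org/clinical#> .
      ex:obs1 a ex:Observation ;
          ex:value "120"^^<http://www.w3.org/2001/XMLSchema#integer> .
      """

      # Shape pattern: every Observation must carry exactly one ex:code typed as xsd:string.
      shapes_ttl = """
      @prefix ex: <http://example.org/clinical#> .
      @prefix sh: <http://www.w3.org/ns/shacl#> .
      @prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
      ex:ObservationShape a sh:NodeShape ;
          sh:targetClass ex:Observation ;
          sh:property [
              sh:path ex:code ;
              sh:datatype xsd:string ;
              sh:minCount 1 ;
              sh:maxCount 1 ;
          ] .
      """

      data_graph = Graph().parse(data=data_ttl, format="turtle")
      shapes_graph = Graph().parse(data=shapes_ttl, format="turtle")

      conforms, _, report_text = validate(data_graph, shacl_graph=shapes_graph)
      print("conforms:", conforms)    # False: the mandatory ex:code is missing
      print(report_text)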

  18. Methodology of decreasing software complexity using ontology

    Dąbrowska-Kubik, Katarzyna

    2015-09-01

    In this paper a model of web application`s source code, based on the OSD ontology (Ontology for Software Development), is proposed. This model is applied to implementation and maintenance phase of software development process through the DevOntoCreator tool [5]. The aim of this solution is decreasing software complexity of that source code, using many different maintenance techniques, like creation of documentation, elimination dead code, cloned code or bugs, which were known before [1][2]. Due to this approach saving on software maintenance costs of web applications will be possible.

  19. Identification of probabilistic approaches and map-based navigation ...

    B Madhevan

    2018-02-07

    ... consists of three processes: map learning (ML), localization and PP [73–76].

  20. Quantitative Architectural Analysis: A New Approach to Cortical Mapping

    Schleicher, Axel; Morosan, Patricia; Amunts, Katrin; Zilles, Karl

    2009-01-01

    Results from functional imaging studies are often still interpreted using the classical architectonic brain maps of Brodmann and his successors. One obvious weakness in traditional, architectural mapping is the subjective nature of localizing borders between cortical areas by means of a purely visual, microscopical examination of histological…

  1. Toward a formal ontology for narrative

    Ciotti, Fabio

    2016-03-01

    Full Text Available In this paper the rationale and the first draft of a formal ontology for modeling narrative texts are presented. Building on semiotic and structuralist narratology, and on the work carried out in the late 1980s by Giuseppe Gigliozzi in Italy, the focus of my research is the concepts of character and of narrative world/space. This formal model is expressed in the OWL 2 ontology language. The main reason to adopt a formal modeling approach is that I consider the purely probabilistic-quantitative methods (now widespread in digital literary studies) inadequate. An ontology, on the one hand, provides a tool for the analysis of strictly literary texts. On the other hand (though beyond the scope of the present work), its formalization can also represent a significant contribution towards grounding the application of storytelling methods outside of scholarly contexts.

  2. Exploiting Surroundedness for Saliency Detection: A Boolean Map Approach.

    Zhang, Jianming; Sclaroff, Stan

    2016-05-01

    We demonstrate the usefulness of surroundedness for eye fixation prediction by proposing a Boolean Map based Saliency model (BMS). In our formulation, an image is characterized by a set of binary images, which are generated by randomly thresholding the image's feature maps in a whitened feature space. Based on a Gestalt principle of figure-ground segregation, BMS computes a saliency map by discovering surrounded regions via topological analysis of Boolean maps. Furthermore, we draw a connection between BMS and the Minimum Barrier Distance to provide insight into why and how BMS can properly capture the surroundedness cue via Boolean maps. The strength of BMS is verified by its simplicity, efficiency and superior performance compared with 10 state-of-the-art methods on seven eye tracking benchmark datasets.
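    For intuition about the surroundedness cue, here is a heavily simplified, grayscale-only sketch in the spirit of BMS (not the published implementation): the image is thresholded at several levels, regions enclosed by each Boolean map are treated as surrounded, and the results are accumulated into a saliency map.

      import numpy as np
      from scipy.ndimage import binary_fill_holes

      def simple_bms(gray, n_thresholds=16):
          # Toy Boolean-map saliency for a 2-D grayscale array with values in [0, 1].
          saliency = np.zeros_like(gray, dtype=float)
          for t in np.linspace(gray.min(), gray.max(), n_thresholds + 2)[1:-1]:
              for boolean_map in (gray > t, gray <= t):
                  # Surrounded regions of the map's complement, i.e. the holes it encloses.
                  surrounded = binary_fill_holes(boolean_map) & ~boolean_map
                  saliency += surrounded
          peak = saliency.max()
          return saliency / peak if peak > 0 else saliency

      # Synthetic image: a bright blob on a dark background is fully surrounded.
      y, x = np.mgrid[0:64, 0:64]
      image = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 60.0)
      sal = simple_bms(image)
      print("peak saliency at:", np.unravel_index(sal.argmax(), sal.shape))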

  3. Using a Foundational Ontology for Reengineering a Software Enterprise Ontology

    Perini Barcellos, Monalessa; de Almeida Falbo, Ricardo

    The knowledge about software organizations is considerably relevant to software engineers. The use of a common vocabulary for representing the useful knowledge about software organizations involved in software projects is important for several reasons, such as to support knowledge reuse and to allow communication and interoperability between tools. Domain ontologies can be used to define a common vocabulary for sharing and reuse of knowledge about some domain. Foundational ontologies can be used for evaluating and re-designing domain ontologies, giving them real-world semantics. This paper presents an evaluation of a Software Enterprise Ontology that was reengineered using the Unified Foundational Ontology (UFO) as a basis.

  4. Ontological support for web courseware authoring

    Aroyo, L.M.; Dicheva, D.; Cristea, A.I.; Cerri, S.A.; Gouardères, G.; Paraguaçu, F.

    2002-01-01

    In this paper we present an ontology- oriented authoring support system for Web-based courseware. This is an elaboration of our approach to knowledge classification and indexing in the previously developed system AIMS (Agent-based Information Management System) aimed at supporting students while

  5. Ontology matching evaluation : A statistical perspective

    Mohammadi, M.; Hofman, W.J.; Tan, Y.H.

    2016-01-01

    This paper proposes statistical approaches to test whether the difference between two ontology matchers is real. Specifically, the performances of the matchers over multiple data sets are obtained and, based on their performances, the conclusion can be drawn whether one method is better than another.

  6. Ontology matching evaluation : A statistical perspective

    Mohammadi, M.; Hofman, Wout; Tan, Y.

    2016-01-01

    This paper proposes statistical approaches to test whether the difference between two ontology matchers is real. Specifically, the performances of the matchers over multiple data sets are obtained and, based on their performances, the conclusion can be drawn whether one method is better than another.

  7. Adaptive e-learning system using ontology

    Yarandi, Maryam; Tawil, Abdel-Rahman; Jahankhani, Hossein

    2011-01-01

    This paper proposes an innovative ontological approach to designing a personalised e-learning system which creates a tailored workflow for each individual learner. Moreover, the learning content and sequencing logic are separated into a content model and a pedagogical model to increase the reusability and flexibility of the system.

  8. Ontology-Based Retrieval of Spatially Related Objects for Location Based Services

    Haav, Hele-Mai; Kaljuvee, Aivi; Luts, Martin; Vajakas, Toivo

    Advanced Location Based Service (LBS) applications have to integrate information stored in GIS, information about users' preferences (profile) as well as contextual information and information about application itself. Ontology engineering provides methods to semantically integrate several data sources. We propose an ontology-driven LBS development framework: the paper describes the architecture of ontologies and their usage for retrieval of spatially related objects relevant to the user. Our main contribution is to enable personalised ontology driven LBS by providing a novel approach for defining personalised semantic spatial relationships by means of ontologies. The approach is illustrated by an industrial case study.

  9. The design ontology

    Storga, Mario; Andreasen, Mogens Myrup; Marjanovic, Dorian

    2010-01-01

    The article presents research on the nature, construction and practical role of a Design Ontology as a potential framework for more efficient description, explanation, understanding and reuse of product development (PD) data, information and knowledge. In the methodology for development ...

  10. Dahlbeck and Pure Ontology

    Mackenzie, Jim

    2016-01-01

    This article responds to Johan Dahlbeck's "Towards a pure ontology: Children's bodies and morality" ["Educational Philosophy and Theory," vol. 46 (1), 2014, pp. 8-23 (EJ1026561)]. His arguments from Nietzsche and Spinoza do not carry the weight he supposes, and the conclusions he draws from them about pedagogy would be…

  11. Audit Validation Using Ontologies

    Ion IVAN

    2015-01-01

    Full Text Available Requirements for increasing the quality of audit processes in enterprises are defined. The need for assessing and managing audit processes using ontologies is substantiated. Sets of rules and ways to assess the consistency of rules and behaviour within the organization are defined. Using ontologies, qualifications that assess the organization's audit are obtained. Elaboration of audit reports is an algorithm-based activity characterized by generality, determinism, reproducibility, accuracy and a well-established structure. The auditors obtain effective levels; through ontologies, the calculated audit level is obtained. Because the audit report is a qualitative structure of information and knowledge, it is very hard to analyze and interpret for different groups of users (shareholders, managers or stakeholders). Developing an ontology for audit report validation will be a useful instrument for both auditors and report users. In this paper we propose an instrument for the validation of audit reports that contains a set of keywords, an indicator calculated for each keyword, qualitative levels, and an interpreter that builds a table of indicators with actual and calculated levels.

  12. Biomedicine: an ontological dissection.

    Baronov, David

    2008-01-01

    Though ubiquitous across the medical social sciences literature, the term "biomedicine" as an analytical concept remains remarkably slippery. It is argued here that this imprecision is due in part to the fact that biomedicine is comprised of three interrelated ontological spheres, each of which frames biomedicine as a distinct subject of investigation. This suggests that, depending upon one's ontological commitment, the meaning of biomedicine will shift. From an empirical perspective, biomedicine takes on the appearance of a scientific enterprise and is defined as a derivative category of Western science more generally. From an interpretive perspective, biomedicine represents a symbolic-cultural expression whose adherence to the principles of scientific objectivity conceals an ideological agenda. From a conceptual perspective, biomedicine represents an expression of social power that reflects structures of power and privilege within capitalist society. No one perspective exists in isolation and so the image of biomedicine from any one presents an incomplete understanding. It is the mutually-conditioning interrelations between these ontological spheres that account for biomedicine's ongoing development. Thus, the ontological dissection of biomedicine that follows, with particular emphasis on the period of its formal crystallization in the latter nineteenth and early twentieth century, is intended to deepen our understanding of biomedicine as an analytical concept across the medical social sciences literature.

  13. An ontology-based approach for on-site integrated environmental and health and safety management

    Marta Gangolells

    2012-12-01

    Full Text Available This article aims to facilitate the implementation of integrated environmental and health and safety management systems in construction companies, focusing on the sub-system for operational control of on-site environmental impacts and health and safety risks. The high compatibility between the operational control requirements stated in ISO 14001:2004 and OHSAS 18001:2007, as well as the interactions between environmental impacts and health and safety risks (Gangolells et al., 2009; Gangolells et al., 2010), motivated the development of an ontology that allows an integrated model for operational control on construction sites to be built. The developed approach is strongly influenced by the methodology of Noy and McGuinness (2001) and models the key concepts and relations of the field in a structured, extensible, flexible, reusable and shareable way. This ontology-based approach has been implemented in Protégé 3.4 beta and evaluated using competency questions, internal verifications and validation interviews with experts. This article develops the first approach that allows knowledge related to the integrated on-site operational control of environmental and health and safety incidents to be represented, shared, reused and managed, and it lays the foundations for overcoming most of the barriers that construction companies face during the implementation of an integrated management system.

  14. Comparison of spatial association approaches for landscape mapping of soil organic carbon stocks

    Miller, B. A.; Koszinski, S.; Wehrhan, M.; Sommer, M.

    2015-03-01

    The distribution of soil organic carbon (SOC) can be variable at small analysis scales, but consideration of its role in regional and global issues demands the mapping of large extents. There are many different strategies for mapping SOC, among which is to model the variables needed to calculate the SOC stock indirectly or to model the SOC stock directly. The purpose of this research is to compare direct and indirect approaches to mapping SOC stocks from rule-based, multiple linear regression models applied at the landscape scale via spatial association. The final products for both strategies are high-resolution maps of SOC stocks (kg m-2), covering an area of 122 km2, with accompanying maps of estimated error. For the direct modelling approach, the estimated error map was based on the internal error estimations from the model rules. For the indirect approach, the estimated error map was produced by spatially combining the error estimates of component models via standard error propagation equations. We compared these two strategies for mapping SOC stocks on the basis of the qualities of the resulting maps as well as the magnitude and distribution of the estimated error. The direct approach produced a map with less spatial variation than the map produced by the indirect approach. The increased spatial variation represented by the indirect approach improved R2 values for the topsoil and subsoil stocks. Although the indirect approach had a lower mean estimated error for the topsoil stock, the mean estimated error for the total SOC stock (topsoil + subsoil) was lower for the direct approach. For these reasons, we recommend the direct approach to modelling SOC stocks be considered a more conservative estimate of the SOC stocks' spatial distribution.
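    The indirect strategy discussed above computes the stock from separately modelled components and combines their errors; the snippet below shows standard first-order error propagation for a product SOC stock = concentration × bulk density × depth, with invented example values.

      import numpy as np

      def soc_stock_with_error(conc, conc_se, bulk_density, bd_se, depth, depth_se):
          # SOC stock (kg m-2) = concentration (kg kg-1) * bulk density (kg m-3) * depth (m);
          # first-order error propagation for a product of independent estimates.
          stock = conc * bulk_density * depth
          rel_err = np.sqrt((conc_se / conc) ** 2 +
                            (bd_se / bulk_density) ** 2 +
                            (depth_se / depth) ** 2)
          return stock, stock * rel_err

      # Invented example values for a topsoil layer.
      stock, stock_se = soc_stock_with_error(conc=0.02, conc_se=0.003,
                                             bulk_density=1400.0, bd_se=80.0,
                                             depth=0.3, depth_se=0.02)
      print(f"SOC stock: {stock:.2f} +/- {stock_se:.2f} kg m-2")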

  15. Epistemology and ontology in core ontologies: FOLaw and LRI-Core, two core ontologies for law

    Breukers, J.A.P.J.; Hoekstra, R.J.

    2004-01-01

    Having constructed ontologies for legal domains for more than a decade, we at the Leibniz Center for Law felt a real need to develop a core ontology for law that would enable us to re-use the common denominator of the various legal domains. In this paper we present two core ontologies for law. The

  16. An ontology based trust verification of software license agreement

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

    When we install or download software, a very large document is presented to state the rights and obligations, and many people do not have the patience to read or understand it. That may make users feel distrust towards the software. In this paper, we propose an ontology based verification approach for Software License Agreements. First of all, this work proposes an ontology model for the domain of Software License Agreements. The domain ontology is constructed by the proposed methodology according to copyright laws and 30 software license agreements. The License Ontology can act as a part of a generalized copyright law knowledge model, and can also work as a visualization of software licenses. Based on this proposed ontology, a software license oriented text summarization approach is proposed, with performance results showing that it can improve the accuracy of software license summarization. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.

  17. Prioritising coastal zone management issues through fuzzy cognitive mapping approach.

    Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi

    2012-04-30

    Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built to ultimately result in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is presently proposed for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors, and to formalize their coastal zone perceptions and knowledge. Then, we investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round table discussions were organized: one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), where the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone that were classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perception and understanding complex relationships. Copyright © 2011 Elsevier Ltd. All rights reserved.
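    To show how a fuzzy cognitive map turns stakeholder-drawn concepts and weighted links into a simulation, the fragment below iterates the commonly used FCM activation rule A(t+1) = f(A(t) + A(t)·W) with a sigmoid squashing function; the concepts and weights are invented placeholders, not those elicited in the Lebanese workshops.

      import numpy as np

      concepts = ["governance", "infrastructure", "environment_quality", "tourism_revenue"]

      # Invented signed influence weights W[i, j]: effect of concept i on concept j.
      W = np.array([
          [0.0,  0.6,  0.4,  0.2],   # governance
          [0.0,  0.0, -0.3,  0.5],   # infrastructure
          [0.0,  0.0,  0.0,  0.6],   # environment_quality
          [0.0,  0.2, -0.2,  0.0],   # tourism_revenue
      ])

      def sigmoid(x, lam=1.0):
          return 1.0 / (1.0 + np.exp(-lam * x))

      def run_fcm(activation, weights, steps=30):
          # Iterate the common FCM update A(t+1) = f(A(t) + A(t) @ W) towards a steady state.
          for _ in range(steps):
              activation = sigmoid(activation + activation @ weights)
          return activation

      initial = np.array([0.8, 0.3, 0.5, 0.4])   # scenario: strong governance push
      for name, value in zip(concepts, run_fcm(initial, W)):
          print(f"{name:22s} {value:.2f}")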

  18. Providing visualisation support for the analysis of anatomy ontology data

    Burger Albert

    2005-03-01

    Background: Improvements in technology have been accompanied by the generation of large amounts of complex data. This same technology must be harnessed effectively if the knowledge stored within the data is to be retrieved. Storing data in ontologies aids its management; ontologies serve as controlled vocabularies that promote data exchange and re-use, improving analysis. The Edinburgh Mouse Atlas Project stores the developmental stages of the mouse embryo in anatomy ontologies. This project is looking at the use of visual data overviews for intuitive analysis of the ontology data. Results: A prototype has been developed that visualises the ontologies using directed acyclic graphs in two dimensions, with the ability to study detail in regions of interest in isolation or within the context of the overview. This is followed by the development of a technique that layers individual anatomy ontologies in three-dimensional space, so that relationships across multiple data sets may be mapped using physical links drawn along the third axis. Conclusion: Usability evaluations of the applications confirmed advantages in visual analysis of complex data. This project will look next at data input from multiple sources, and continue to develop the techniques presented to provide intuitive identification of relationships that span multiple ontologies.
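    As an illustration of the preprocessing such a visualisation needs, the sketch below groups a toy term DAG into layers by depth from the root, the usual basis for a top-down two-dimensional overview; the edges are hypothetical and are not taken from the Edinburgh Mouse Atlas ontologies.

    ```python
    import networkx as nx

    # Toy is-a/part-of edges (parent -> child); not the real EMAP anatomy ontology.
    edges = [("embryo", "head"), ("embryo", "trunk"),
             ("head", "brain"), ("head", "eye"),
             ("trunk", "heart"), ("brain", "forebrain")]

    dag = nx.DiGraph(edges)
    assert nx.is_directed_acyclic_graph(dag)

    # Depth of each term = length of the longest root-to-term path, which gives
    # the layer it would occupy in a top-down overview drawing.
    depth = {}
    for node in nx.topological_sort(dag):
        preds = list(dag.predecessors(node))
        depth[node] = 0 if not preds else 1 + max(depth[p] for p in preds)

    layers = {}
    for node, d in depth.items():
        layers.setdefault(d, []).append(node)
    for d in sorted(layers):
        print(f"layer {d}: {sorted(layers[d])}")
    ```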

  19. Application of Alignment Methodologies to Spatial Ontologies in the Hydro Domain

    Lieberman, J. E.; Cheatham, M.; Varanka, D.

    2015-12-01

    Ontologies are playing an increasing role in facilitating mediation and translation between datasets representing diverse schemas, vocabularies, or knowledge communities. This role is relatively straightforward when there is one ontology comprising all relevant common concepts that can be mapped to entities in each dataset. Frequently, one common ontology has not been agreed to. Either each dataset is represented by a distinct ontology, or there are multiple candidates for commonality. Either the one most appropriate (expressive, relevant, correct) ontology must be chosen, or else concepts and relationships must be matched across multiple ontologies through an alignment process so that they may be used in concert to carry out mediation or other semantic operations. A resulting alignment can be effective to the extent that entities in the ontologies represent differing terminology for comparable conceptual knowledge. In cases such as spatial ontologies, though, ontological entities may also represent disparate conceptualizations of space according to the discernment methods and application domains on which they are based. One ontology's wetland concept may overlap in space with another ontology's recharge zone or wildlife range or water feature. In order to evaluate alignment with respect to spatial ontologies, alignment has been applied to a series of ontologies pertaining to surface water that are used variously in hydrography (characterization of water features), hydrology (study of water cycling), and water quality (nutrient and contaminant transport) application domains. There is frequently a need to mediate between datasets in each domain in order to develop broader understanding of surface water systems, so there is a practical as well as theoretical value in the alignment. From a domain expertise standpoint, the ontologies under consideration clearly contain some concepts that are spatially as well as conceptually identical and then others with less clear

  20. ONTOLOGY-DRIVEN TOOL FOR UTILIZING PROGRAMMING STYLES

    Nikolay Sidorov

    2017-07-01

    Activities of a programmer become more effective, and the software more understandable, when programming styles (standards) that provide clarity of program texts are used within the software development process. Purpose: In this research, we present a tool that realizes a new ontology-based methodology and automated reasoning techniques for applying programming styles. In particular, we focus on representing programming styles in the form of formal ontologies, and study how a description logic reasoner can assist programmers in applying programming standards. Our research hypothesis is that an ontological representation of programming styles can provide additional benefits over existing approaches in helping programmers apply programming standards. Our research goal is to develop a tool to support ontology-based application of programming styles. Methods: ontological representation of programming styles; object-oriented programming; ontology-driven application of programming styles. Results: an architecture was designed and a tool was developed in the Java language that supports the ontology-driven method of applying programming styles. Features of the implementation and application of the tool are illustrated with the naming standard of the Java programming language. Discussion: application of programming styles when coding programs; lack of automated tools for applying programming standards; a tool based on the new method of ontology-driven application of programming styles; an example implementation of the tool architecture for the naming rules of the Java language standard.

  1. BioPortal: enhanced functionality via new Web services from the National Center for Biomedical Ontology to access and use ontologies in software applications.

    Whetzel, Patricia L; Noy, Natalya F; Shah, Nigam H; Alexander, Paul R; Nyulas, Csongor; Tudorache, Tania; Musen, Mark A

    2011-07-01

    The National Center for Biomedical Ontology (NCBO) is one of the National Centers for Biomedical Computing funded under the NIH Roadmap Initiative. Contributing to the national computing infrastructure, NCBO has developed BioPortal, a web portal that provides access to a library of biomedical ontologies and terminologies (http://bioportal.bioontology.org) via the NCBO Web services. BioPortal enables community participation in the evaluation and evolution of ontology content by providing features to add mappings between terms, to add comments linked to specific ontology terms and to provide ontology reviews. The NCBO Web services (http://www.bioontology.org/wiki/index.php/NCBO_REST_services) enable this functionality and provide a uniform mechanism to access ontologies from a variety of knowledge representation formats, such as Web Ontology Language (OWL) and Open Biological and Biomedical Ontologies (OBO) format. The Web services provide multi-layered access to the ontology content, from getting all terms in an ontology to retrieving metadata about a term. Users can easily incorporate the NCBO Web services into software applications to generate semantically aware applications and to facilitate structured data collection.
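    A hedged sketch of calling the BioPortal term-search Web service from a client application is given below. The endpoint path, parameters and response fields are assumptions based on the public documentation and may differ between service versions; a registered API key is required.

    ```python
    import requests

    API_KEY = "YOUR_BIOPORTAL_API_KEY"      # placeholder, obtained after registration
    BASE = "https://data.bioontology.org"    # assumed REST base URL

    def search_terms(query):
        """Query the (assumed) BioPortal search service for matching terms."""
        resp = requests.get(f"{BASE}/search",
                            params={"q": query, "apikey": API_KEY},
                            timeout=30)
        resp.raise_for_status()
        # Each hit is expected to carry a preferred label and a link to its ontology.
        return [(hit.get("prefLabel"), hit.get("links", {}).get("ontology"))
                for hit in resp.json().get("collection", [])]

    if __name__ == "__main__":
        for label, ontology in search_terms("melanoma")[:5]:
            print(label, ontology)
    ```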

  2. A holy Quran ontology construction with semiautomatic population ...

    ... The major contribution of this approach is to harness the benefits of learning methods, conjoined with statistical ... Keywords: Ontology; Holy Quran; named entity; machine learning ...

  3. Modelling and approaching pragmatic interoperability of distributed geoscience data

    Ma, Xiaogang

    2010-05-01

    Interoperability of geodata, which is essential for sharing information and discovering insights within a cyberinfrastructure, is receiving increasing attention. A key requirement of interoperability in the context of geodata sharing is that data provided by local sources can be accessed, decoded, understood and appropriately used by external users. Various researchers have discussed that there are four levels in data interoperability issues: system, syntax, schematics and semantics, which respectively relate to the platform, encoding, structure and meaning of geodata. Ontology-driven approaches have been studied extensively over the last decade to address schematic and semantic interoperability issues of geodata. Ontologies come in different types (e.g., top-level ontologies, domain ontologies and application ontologies) and display forms (e.g., glossaries, thesauri, conceptual schemas and logical theories). Many geodata providers are maintaining their identified local application ontologies in order to drive standardization in local databases. However, semantic heterogeneities often exist between these local ontologies, even though they are derived from equivalent disciplines. In contrast, common ontologies are being studied in different geoscience disciplines (e.g., NAMD, SWEET, etc.) as a standardization procedure to coordinate diverse local ontologies. Semantic mediation, e.g. mapping between local ontologies, or mapping local ontologies to common ontologies, has been studied as an effective way of achieving semantic interoperability between local ontologies, thus reconciling semantic heterogeneities in multi-source geodata. Nevertheless, confusion still exists in the research field of semantic interoperability. One problem is caused by eliminating elements of local pragmatic contexts in semantic mediation. Compared to the context-independent nature of a common domain ontology, local application ontologies are closely related to elements (e.g., people, time, location

  4. An integrated approach to shoreline mapping for spill response planning

    Owens, E.H.; LeBlanc, S.R.; Percy, R.J.

    1996-01-01

    A desktop mapping package was introduced which has the capability to provide consistent and standardized application of mapping and data collection/generation techniques. Its application in oil spill cleanup was discussed. The data base can be updated easily as new information becomes available. This provides a response team with access to a wide range of information that would otherwise be difficult to obtain. Standard terms and definitions and shoreline segmentation procedures are part of the system to describe the shore-zone character and shore-zone oiling conditions. The program that is in place for Atlantic Canada involves the integration of (1) Environment Canada's SCAT methodology in pre-spill data generation, (2) shoreline segmentation, (3) response management by objectives, (4) Environment Canada's national sensitivity mapping program, and (5) Environment Canada's field guide for the protection and treatment of oiled shorelines. 7 refs., 6 figs

  5. Completeness, supervenience and ontology

    Maudlin, Tim W E

    2007-01-01

    In 1935, Einstein, Podolsky and Rosen raised the issue of the completeness of the quantum description of a physical system. What they had in mind is whether or not the quantum description is informationally complete, in that all physical features of a system can be recovered from it. In a collapse theory such as the theory of Ghirardi, Rimini and Weber, the quantum wavefunction is informationally complete, and this has often been taken to suggest that according to that theory the wavefunction is all there is. If we distinguish the ontological completeness of a description from its informational completeness, we can see that the best interpretations of the GRW theory must postulate more physical ontology than just the wavefunction

  6. Completeness, supervenience and ontology

    Maudlin, Tim W E [Department of Philosophy, Rutgers University, 26 Nichol Avenue, New Brunswick, NJ 08901-1411 (United States)

    2007-03-23

    In 1935, Einstein, Podolsky and Rosen raised the issue of the completeness of the quantum description of a physical system. What they had in mind is whether or not the quantum description is informationally complete, in that all physical features of a system can be recovered from it. In a collapse theory such as the theory of Ghirardi, Rimini and Weber, the quantum wavefunction is informationally complete, and this has often been taken to suggest that according to that theory the wavefunction is all there is. If we distinguish the ontological completeness of a description from its informational completeness, we can see that the best interpretations of the GRW theory must postulate more physical ontology than just the wavefunction.

  7. LOGISTICS OPTIMIZATION USING ONTOLOGIES

    Hendi , Hayder; Ahmad , Adeel; Bouneffa , Mourad; Fonlupt , Cyril

    2014-01-01

    Logistics processes involve complex physical flows and the integration of different elements. It is widely observed that uncontrolled processes can degrade the state of logistics. The optimization of logistic processes can support the desired growth and consistent continuity of logistics. In this paper, we present a software framework for logistic process optimization. It primarily defines logistic ontologies and then optimizes them. It intends to assist the design of...

  8. Ontology evolution in physics

    Chan, Michael

    2013-01-01

    With the advent of reasoning problems in dynamic environments, there is an increasing need for automated reasoning systems to automatically adapt to unexpected changes in representations. In particular, the automation of the evolution of their ontologies needs to be enhanced without substantially sacrificing expressivity in the underlying representation. Revision of beliefs is not enough, as adding to or removing from beliefs does not change the underlying formal language. Gene...

  9. Mapping embedded applications on MPSoCs : the MNEMEE approach

    Baloukas, C.; Papadopoulos, L.; Soudris, D.; Stuijk, S.; Jovanovic, O.; Schmoll, F.; Cordes, D.; Pyka, A.; Mallik, A.; Mamagkakis, S.; Capman, F.; Collet, S.; Mitas, N.; Kritharidis, D.

    2010-01-01

    As embedded systems are becoming the center of our digital life, system design becomes progressively harder. The integration of multiple features on devices with limited resources requires careful and exhaustive exploration of the design search space in order to efficiently map modern applications

  10. The Facebook Influence Model: A Concept Mapping Approach

    Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M.

    2013-01-01

    Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts. PMID:23621717

  11. Mapping of health facilities in Jimeta Metropolis: a digital approach ...

    In planning for any suitable development in any field, the primary requirement is relevant data and maps. The lack of such data is one of the major problems hindering the proper planning and monitoring of the various health facilities located in Jimeta metropolis. Survey techniques were employed for the acquisition of data, GPS was ...

  12. The Facebook influence model: a concept mapping approach.

    Moreno, Megan A; Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M

    2013-07-01

    Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts.
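    The sort-and-rank and analysis steps can be sketched computationally: each participant's piles yield a statement-by-statement co-occurrence similarity matrix, which is then clustered. The sketch below uses hierarchical clustering as a simplified stand-in for the multidimensional scaling plus cluster analysis typically used in concept mapping; the sort data are invented.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Toy sort data: each participant grouped five statements (0-4) into piles.
    sorts = [
        [[0, 1], [2, 3, 4]],
        [[0, 1, 2], [3, 4]],
        [[0, 1], [2], [3, 4]],
    ]
    n = 5

    # Similarity matrix: entry (i, j) counts how many participants placed
    # statements i and j in the same pile.
    sim = np.zeros((n, n))
    for piles in sorts:
        for pile in piles:
            for i in pile:
                for j in pile:
                    sim[i, j] += 1

    # Convert similarity to distance, cluster hierarchically on the condensed
    # distance matrix, then cut the dendrogram into two clusters.
    dist = sim.max() - sim
    condensed = dist[np.triu_indices(n, k=1)]
    labels = fcluster(linkage(condensed, method="average"), t=2, criterion="maxclust")
    print(labels)   # cluster membership for each of the five statements
    ```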

  13. A National Approach to Map and Quantify Terrestrial Vertebrate Biodiversity

    Biodiversity is crucial for the functioning of ecosystems and the products and services from which we transform natural assets of the Earth for human survival, security, and well-being. The ability to assess, report, map, and forecast the life support functions of ecosystems is a...

  14. Effect of concept mapping approach on students' achievement in ...

    The quasi-experimental research design was used in carrying out the study, adopting the pre-test – post-test control type. The sample consisted of 180 Senior Secondary One (SS1) students comprising 88 males and 92 females. In each ... The experimental group was taught mathematical concepts using concept mapping ...

  15. Partnering with Youth to Map Their Neighborhood Environments: A Multi-Layered GIS Approach

    Topmiller, Michael; Jacquez, Farrah; Vissman, Aaron T.; Raleigh, Kevin; Miller-Francis, Jenni

    2014-01-01

    Mapping approaches offer great potential for community-based participatory researchers interested in displaying youth perceptions and advocating for change. We describe a multi-layered approach for gaining local knowledge of neighborhood environments that engages youth as co-researchers and active knowledge producers. By integrating geographic information systems (GIS) with environmental audits, an interactive focus group, and sketch mapping, the approach provides a place-based understanding of physical activity resources from the situated experience of youth. Youth report safety and a lack of recreational resources as inhibiting physical activity. Maps reflecting youth perceptions aid policy-makers in making place-based improvements for youth neighborhood environments. PMID:25423245

  16. The ontological model and the hybrid expert system for products and processes quality identification involving the approach based on system analysis and quality function deployment

    Dmitriev Aleksandr

    2016-01-01

    The discussed quality-identification model improves the mathematical tools and allows a variety of additional information to be used. The proposed robust method, a matrix MTQFD (Matrix Technique Quality Function Deployment), makes it possible to determine not only the priorities but also estimates of the target values of product characteristics and process parameters, with possible use of information on negative relationships. The designed ontological model, method and expert system model are versatile and can be used to identify the quality of services as well.

  17. Ontology Design Patterns for Combining Pathology and Anatomy: Application to Study Aging and Longevity in Inbred Mouse Strains

    Alghamdi, Sarah M.

    2018-05-13

    In biomedical research, ontologies are widely used to represent knowledge as well as to annotate datasets. Many of the existing ontologies cover a single type of phenomena, such as a process, cell type, gene, pathological entity or anatomical structure. Consequently, there is a requirement to use multiple ontologies to fully characterize the observations in the datasets. Although this allows precise annotation of different aspects of a given dataset, it limits our ability to use the ontologies in data analysis, as the ontologies are usually disconnected and their combinations cannot be exploited. Motivated by this, here we present novel ontology design methods for combining pathology and anatomy concepts. To this end, we use a dataset of mouse models which has been characterized through two ontologies: one of them is the mouse pathology ontology (MPATH) covering pathological lesions while the other is the mouse anatomy ontology (MA) covering the anatomical site of the lesions. We propose four novel ontology design patterns for combining these ontologies, and use these patterns to generate four ontologies in a data-driven way. To evaluate the generated ontologies, we utilize these in ontology-based data analysis, including ontology enrichment analysis and computation of semantic similarity. We demonstrate that there are significant differences between the four ontologies in different analysis approaches. In addition, when using semantic similarity to confirm the hypothesis that genetically identical mice should develop more similar diseases, the generated combined ontologies lead to significantly better analysis results compared to using each ontology individually. Our results reveal that using ontology design patterns to combine different facets characterizing a dataset can improve established analysis methods.

  18. Soil erodibility mapping using three approaches in the Tangiers province –Northern Morocco

    Hamza Iaaich

    2016-09-01

    Soil erodibility is a key factor in assessing soil loss rates. In fact, soil loss is the most common form of land degradation in Morocco, affecting vulnerable rural and urban areas. This work deals with large-scale mapping of soil erodibility using three mapping approaches: (i) the CORINE approach developed for Europe by the JRC; (ii) the UNEP/FAO approach developed within the frame of the United Nations Environmental Program for the Mediterranean area; (iii) the Universal Soil Loss Equation (USLE) K factor. Our study zone is the province of Tangiers, North-West Morocco. For each approach, we mapped and analyzed different erodibility factors in terms of parent material, topography and soil attributes. The thematic maps were then integrated using a Geographic Information System to elaborate a soil erodibility map for each of the three approaches. Finally, the validity of each approach was checked in the field, focusing on highly eroded areas, by confronting the estimated soil erodibility with the erosion state observed in the field. We used three statistical indicators for validation: overall accuracy, weighted Kappa factor and omission/commission errors. We found that the UNEP/FAO approach, based principally on lithofacies and topography as mapping inputs, is the best adapted to our study zone, followed by the CORINE approach. The USLE K factor underestimated soil erodibility, especially for highly eroded areas.
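    The validation indicators used above can be computed from a confusion matrix as in the sketch below; the counts are invented, and plain (unweighted) Cohen's kappa is used as a simplified stand-in for the weighted kappa reported in the study.

    ```python
    import numpy as np

    # Hypothetical confusion matrix: rows = erodibility class from a map,
    # columns = erosion state observed in the field (low / moderate / high).
    cm = np.array([[30,  6,  2],
                   [ 5, 22,  7],
                   [ 1,  8, 19]], dtype=float)

    total = cm.sum()
    overall_accuracy = np.trace(cm) / total

    # Unweighted Cohen's kappa: agreement corrected for chance agreement.
    p_observed = overall_accuracy
    p_expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total**2
    kappa = (p_observed - p_expected) / (1 - p_expected)

    # Omission and commission errors per class.
    omission = 1 - np.diag(cm) / cm.sum(axis=0)      # missed field observations
    commission = 1 - np.diag(cm) / cm.sum(axis=1)    # wrongly assigned map cells
    print(overall_accuracy, kappa, omission, commission)
    ```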

  19. An Ontology for Software Engineering Education

    Ling, Thong Chee; Jusoh, Yusmadi Yah; Adbullah, Rusli; Alwi, Nor Hayati

    2013-01-01

    Software agents communicate using ontologies. It is important to build an ontology for a specific domain such as Software Engineering Education. Building an ontology from scratch is not only hard, but also incurs much time and cost. This study aims to propose an ontology through adaptation of an existing ontology which is originally built based on a…

  20. SELECTION OF ONTOLOGY FOR WEB SERVICE DESCRIPTION LANGUAGE TO ONTOLOGY WEB LANGUAGE CONVERSION

    J. Mannar Mannan; M. Sundarambal; S. Raghul

    2014-01-01

    The Semantic Web extends the current human-readable web by encoding some of the semantics of resources in a machine-processable form. As a Semantic Web component, Semantic Web Services (SWS) use a mark-up that makes the data machine-readable in a detailed and sophisticated way. One such language is Ontology Web Language (OWL). Existing conventional web service annotations can be changed to semantic web services by mapping Web Service Description Language (WSDL) with the semantic annotation of O...

  1. A taxonomy of behaviour change methods: an Intervention Mapping approach

    Kok, Gerjo; Gottlieb, Nell H.; Peters, Gjalt-Jorn Y.; Mullen, Patricia Dolan; Parcel, Guy S.; Ruiter, Robert A.C.; Fernández, María E.; Markham, Christine; Bartholomew, L. Kay

    2015-01-01

    In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters fo...

  2. Concept Mapping as an Approach to Facilitate Participatory Intervention Building.

    L Allen, Michele; Schaleben-Boateng, Dane; Davey, Cynthia S; Hang, Mikow; Pergament, Shannon

    2015-01-01

    A challenge to addressing community-defined need through community-based participatory intervention building is ensuring that all collaborators' opinions are represented. Concept mapping integrates perspectives of individuals with differing experiences, interests, or expertise into a common visually depicted framework, and ranks composite views on importance and feasibility. To describe the use of concept mapping to facilitate participatory intervention building for a school-based, teacher-focused, positive youth development (PYD) promotion program for Latino, Hmong, and Somali youth. Participants were teachers, administrators, youth, parents, youth workers, and community and university researchers on the projects' community collaborative board. We incorporated previously collected qualitative data into the process. In a mixed-methods process we 1) generated statements based on key informant interview and focus group data from youth workers, teachers, parents, and youth in multiple languages regarding ways teachers promote PYD for Somali, Latino and Hmong youth; 2) guided participants to individually sort statements into meaningful groupings and rate them by importance and feasibility; 3) mapped the statements based on their relation to each other using multivariate statistical analyses to identify concepts, and as a group identified labels for each concept; and 4) used labels and statement ratings to identify feasible and important concepts as priorities for intervention development. We identified 12 concepts related to PYD promotion in schools and prioritized 8 for intervention development. Concept mapping facilitated participatory intervention building by formally representing all participants' opinions, generating visual representation of group thinking, and supporting priority setting. Use of prior qualitative work increased the diversity of viewpoints represented.

  3. Delineation and interpretation of gene networks towards their effect in cellular physiology- a reverse engineering approach for the identification of critical molecular players, through the use of ontologies.

    Moutselos, K; Maglogiannis, I; Chatziioannou, A

    2010-01-01

    Exploiting ontologies provides clues regarding the involvement of certain molecular processes in the cellular phenotypic manifestation. However, identifying individual molecular actors (genes, proteins, etc.) for targeted biological validation in a generic, prioritized fashion, based on objective measures of their effects on cellular physiology, remains a challenge. In this work, a new meta-analysis algorithm is proposed for the holistic interpretation of the information captured in -omic experiments, showcased on a dynamic transcriptomic DNA microarray dataset which examines the effect of mastic oil treatment on Lewis lung carcinoma cells. Through the use of the Gene Ontology, this algorithm relates genes to specific cellular pathways and vice versa in order to further reverse engineer the critical role of specific genes, starting from the results of various statistical enrichment analyses. The algorithm is able to discriminate candidate hub-genes, implying critical biochemical cross-talk. Moreover, performance measures of the algorithm are derived when it is evaluated with respect to the differential expression gene list of the dataset.
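    The statistical enrichment step that such an algorithm starts from is commonly a hypergeometric (one-sided Fisher) test per GO term; a minimal sketch with invented counts follows, shown only to make the reasoning concrete rather than to reproduce the proposed meta-analysis.

    ```python
    from scipy.stats import hypergeom

    # Hypothetical counts for one Gene Ontology term:
    M = 20000   # genes on the array (background)
    K = 150     # background genes annotated with the GO term
    N = 400     # differentially expressed genes
    k = 12      # differentially expressed genes carrying the annotation

    # P(X >= k): probability of drawing at least k annotated genes by chance,
    # i.e. the enrichment p-value used to flag over-represented processes.
    p_value = hypergeom.sf(k - 1, M, K, N)
    print(f"enrichment p-value: {p_value:.3e}")
    ```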

  4. Evaluating the Use of an Object-Based Approach to Lithological Mapping in Vegetated Terrain

    Stephen Grebby

    2016-10-01

    Remote sensing-based approaches to lithological mapping are traditionally pixel-oriented, with classification performed on either a per-pixel or sub-pixel basis with complete disregard for contextual information about neighbouring pixels. However, intra-class variability due to heterogeneous surface cover (i.e., vegetation and soil) or regional variations in mineralogy and chemical composition can result in the generation of unrealistic, generalised lithological maps that exhibit the "salt-and-pepper" artefact of spurious pixel classifications, as well as poorly defined contacts. In this study, an object-based image analysis (OBIA) approach to lithological mapping is evaluated with respect to its ability to overcome these issues by instead classifying groups of contiguous pixels (i.e., objects). Due to significant vegetation cover in the study area, the OBIA approach incorporates airborne multispectral and LiDAR data to indirectly map lithologies by exploiting associations with both topography and vegetation type. The resulting lithological maps were assessed both in terms of their thematic accuracy and ability to accurately delineate lithological contacts. The OBIA approach is found to be capable of generating maps with an overall accuracy of 73.5% through integrating spectral and topographic input variables. When compared to equivalent per-pixel classifications, the OBIA approach achieved thematic accuracy increases of up to 13.1%, whilst also reducing the "salt-and-pepper" artefact to produce more realistic maps. Furthermore, the OBIA approach was also generally capable of mapping lithological contacts more accurately. The importance of optimising the segmentation stage of the OBIA approach is also highlighted. Overall, this study clearly demonstrates the potential of OBIA for lithological mapping applications, particularly in significantly vegetated and heterogeneous terrain.

  5. Application of the Financial Industry Business Ontology (FIBO) for development of a financial organization ontology

    Petrova, G. G.; Tuzovsky, A. F.; Aksenova, N. V.

    2017-01-01

    The article considers an approach to the formalized description and harmonization of the meaning of financial terms by means of semantic modeling. Ontologies for the semantic models are described with the help of special languages developed for the Semantic Web. Results of applying FIBO to the solution of different tasks in the Russian financial sector are given.

  6. Fast gene ontology based clustering for microarray experiments.

    Ovaska, Kristian; Laakso, Marko; Hautaniemi, Sampsa

    2008-11-21

    Analysis of a microarray experiment often results in a list of hundreds of disease-associated genes. In order to suggest common biological processes and functions for these genes, Gene Ontology annotations with statistical testing are widely used. However, these analyses can produce a very large number of significantly altered biological processes. Thus, it is often challenging to interpret GO results and identify novel testable biological hypotheses. We present fast software for advanced gene annotation using semantic similarity for Gene Ontology terms combined with clustering and heat map visualisation. The methodology allows rapid identification of genes sharing the same Gene Ontology cluster. Our R based semantic similarity open-source package has a speed advantage of over 2000-fold compared to existing implementations. From the resulting hierarchical clustering dendrogram genes sharing a GO term can be identified, and their differences in the gene expression patterns can be seen from the heat map. These methods facilitate advanced annotation of genes resulting from data analysis.
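    A much-simplified stand-in for graph-based GO semantic similarity is sketched below: the similarity of two terms is taken as the Jaccard overlap of their ancestor sets in a toy is-a graph. This is not the measure implemented in the package; it only illustrates the kind of computation being accelerated.

    ```python
    import networkx as nx

    # Toy is-a edges (child -> parent); not the real Gene Ontology.
    edges = [("GO:b", "GO:root"), ("GO:c", "GO:root"),
             ("GO:d", "GO:b"), ("GO:e", "GO:b"), ("GO:f", "GO:c")]
    go = nx.DiGraph(edges)

    def ancestors_plus_self(term):
        # Following child->parent edges, all reachable terms are ancestors.
        return nx.descendants(go, term) | {term}

    def similarity(t1, t2):
        """Jaccard overlap of ancestor sets: a simple graph-based similarity."""
        a1, a2 = ancestors_plus_self(t1), ancestors_plus_self(t2)
        return len(a1 & a2) / len(a1 | a2)

    print(similarity("GO:d", "GO:e"))   # siblings: share parent GO:b and the root
    print(similarity("GO:d", "GO:f"))   # share only the root
    ```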

  7. Stakeholder approach, Stakeholders mental model: A visualization test with cognitive mapping technique

    Garoui Nassreddine

    2012-04-01

    The aim of this paper is to determine the mental models of actors in the firm with respect to the stakeholder approach to corporate governance. Cognitive maps are used to visualize these models and to show the ways of thinking about, and conceptualizing, the stakeholder approach. The paper takes a corporate governance perspective and discusses the stakeholder model, applying a cognitive mapping technique.

  8. Mapping Partners Master Drug Dictionary to RxNorm using an NLP-based approach.

    Zhou, Li; Plasek, Joseph M; Mahoney, Lisa M; Chang, Frank Y; DiMaggio, Dana; Rocha, Roberto A

    2012-08-01

    To develop an automated method based on natural language processing (NLP) to facilitate the creation and maintenance of a mapping between RxNorm and a local medication terminology for interoperability and meaningful use purposes. We mapped 5961 terms from Partners Master Drug Dictionary (MDD) and 99 of the top prescribed medications to RxNorm. The mapping was conducted at both term and concept levels using an NLP tool, called MTERMS, followed by a manual review conducted by domain experts who created a gold standard mapping. The gold standard was used to assess the overall mapping between MDD and RxNorm and evaluate the performance of MTERMS. Overall, 74.7% of MDD terms and 82.8% of the top 99 terms had an exact semantic match to RxNorm. Compared to the gold standard, MTERMS achieved a precision of 99.8% and a recall of 73.9% when mapping all MDD terms, and a precision of 100% and a recall of 72.6% when mapping the top prescribed medications. The challenges and gaps in mapping MDD to RxNorm are mainly due to unique user or application requirements for representing drug concepts and the different modeling approaches inherent in the two terminologies. An automated approach based on NLP followed by human expert review is an efficient and feasible way for conducting dynamic mapping. Copyright © 2011 Elsevier Inc. All rights reserved.
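    The evaluation against the gold standard reduces to standard precision and recall over proposed term-to-concept mappings; a minimal sketch with invented terms and identifiers follows.

    ```python
    # Hypothetical automated mappings (local term -> RxNorm concept id) versus a
    # gold standard built by domain experts; names and ids are made up.
    automated = {"acetaminophen 325 mg tab": "RX1", "lisinopril 10 mg": "RX2",
                 "aspirin ec 81 mg": "RX3"}
    gold = {"acetaminophen 325 mg tab": "RX1", "lisinopril 10 mg": "RX2",
            "aspirin ec 81 mg": "RX9", "metformin 500 mg": "RX4"}

    true_pos = sum(1 for term, code in automated.items() if gold.get(term) == code)
    precision = true_pos / len(automated)   # correctness of proposed mappings
    recall = true_pos / len(gold)           # coverage of the gold standard
    print(f"precision={precision:.2f} recall={recall:.2f}")
    ```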

  9. Using a Similarity Matrix Approach to Evaluate the Accuracy of Rescaled Maps

    Peijun Sun

    2018-03-01

    Rescaled maps have been extensively utilized to provide data at the appropriate spatial resolution for use in various Earth science models. However, a simple and easy way to evaluate these rescaled maps has not been developed. We propose a similarity matrix approach using a contingency table to compute three measures: overall similarity (OS), omission error (OE), and commission error (CE) to evaluate the rescaled maps. The Majority Rule Based aggregation (MRB) method was employed to produce the upscaled maps to demonstrate this approach. In addition, previously created, coarser resolution land cover maps from other research projects were also available for comparison. The question of which is better, a map initially produced at coarse resolution or a fine resolution map rescaled to a coarse resolution, has not been quantitatively investigated. To address these issues, we selected study sites at three different extent levels. First, we selected twelve regions covering the continental USA, then we selected nine states (from the whole continental USA), and finally we selected nine Agriculture Statistical Districts (ASDs) (from within the nine selected states) as study sites. Crop/non-crop maps derived from the USDA Crop Data Layer (CDL) at 30 m as base maps were used for the upscaling and existing maps at 250 m and 1 km were utilized for the comparison. The results showed that a similarity matrix can effectively provide the map user with the information needed to assess the rescaling. Additionally, the upscaled maps can provide higher accuracy and better represent landscape pattern compared to the existing coarser maps. Therefore, we strongly recommend that an evaluation of the upscaled map and the existing coarser resolution map using a similarity matrix should be conducted before deciding which dataset to use for the modelling. Overall, extending our understanding on how to perform an evaluation of the rescaled map and investigation of the applicability
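    Majority Rule Based aggregation itself is straightforward to sketch: each coarse cell takes the most frequent class among the fine-resolution pixels it covers. The example below uses an invented 4 x 4 crop/non-crop grid and an aggregation factor of 2.

    ```python
    import numpy as np

    def majority_upscale(classified, factor):
        """Aggregate a categorical raster by taking the majority class per block."""
        rows, cols = classified.shape
        out = np.empty((rows // factor, cols // factor), dtype=classified.dtype)
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                block = classified[i*factor:(i+1)*factor, j*factor:(j+1)*factor]
                values, counts = np.unique(block, return_counts=True)
                out[i, j] = values[np.argmax(counts)]   # ties resolved by lowest value
        return out

    # Toy 4 x 4 crop (1) / non-crop (0) map upscaled by a factor of 2.
    fine = np.array([[1, 1, 0, 0],
                     [1, 0, 0, 0],
                     [1, 1, 1, 0],
                     [1, 1, 0, 0]])
    print(majority_upscale(fine, 2))
    ```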

  10. AERIAL TERRAIN MAPPING USING UNMANNED AERIAL VEHICLE APPROACH

    K. N. Tahar

    2012-08-01

    This paper looks into the latest achievement in low-cost Unmanned Aerial Vehicle (UAV) technology in its capacity to map semi-developed areas. The objectives of this study are to establish a new methodology or a new algorithm for image registration during the interior orientation process and to determine the accuracy of the photogrammetric products obtained by using UAV images. Recently, UAV technology has been used in several applications such as mapping, agriculture and surveillance. The aim of this study is to scrutinize the usage of UAVs to map semi-developed areas. The performance of the low-cost UAV mapping study was established on a study area with two image processing methods so that the results could be compared. A non-metric camera was attached at the bottom of the UAV and was used to capture images at both sites after it went through several calibration steps. Calibration processes were carried out to determine focal length, principal distance, radial lens distortion, tangential lens distortion and affinity. A new method of image registration for a non-metric camera is discussed in this paper as part of the new methodology of this study. This method used the UAV Global Positioning System (GPS) onboard to register the UAV image for the interior orientation process. Check points were established randomly at both sites using rapid static Global Positioning System. Ground control points are used for the exterior orientation process, and check points are used for accuracy assessment of the photogrammetric products. All acquired images were processed in photogrammetric software. Two methods of image registration were applied in this study, namely, GPS onboard registration and ground control point registration. Both registrations were processed using photogrammetric software and the results are discussed. Two results were produced in this study: the digital orthophoto and the digital terrain model. These results were analyzed by using the root

  11. An image encryption approach based on chaotic maps

    Zhang Linhua; Liao Xiaofeng; Wang Xuebing

    2005-01-01

    It is well known that images differ from texts in many aspects, such as high redundancy and correlation, local structure and the characteristics of amplitude-frequency. As a result, conventional encryption methods are not applicable to images. In this paper, we improve the properties of confusion and diffusion in terms of discrete exponential chaotic maps, and design a key scheme for resistance to statistical attack, differential attack and grey code attack. Experimental and theoretical results also show that our scheme is efficient and very secure.
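    The diffusion idea can be sketched with a chaotic keystream XORed against the pixel values. The sketch below uses the well-known logistic map as a stand-in for the paper's discrete exponential chaotic map, and omits the confusion (permutation) stage; the parameters act as the secret key and are illustrative.

    ```python
    import numpy as np

    def logistic_keystream(length, x0=0.3456, r=3.99):
        """Generate a byte keystream from the logistic map x -> r*x*(1-x)."""
        x, out = x0, np.empty(length, dtype=np.uint8)
        for i in range(length):
            x = r * x * (1.0 - x)
            out[i] = int(x * 256) % 256
        return out

    def xor_image(pixels, key=(0.3456, 3.99)):
        """Diffusion step only: XOR flattened pixels with the chaotic keystream."""
        flat = pixels.flatten()
        stream = logistic_keystream(flat.size, *key)
        return (flat ^ stream).reshape(pixels.shape)

    image = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)
    cipher = xor_image(image)
    assert np.array_equal(xor_image(cipher), image)   # same keystream decrypts
    print(cipher)
    ```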

  12. ONSET: Automated foundational ontology selection and explanation

    Khan, Z

    2012-10-01

    It has been shown that using a foundational ontology for domain ontology development is beneficial in theory and practice. However, developers have difficulty choosing the appropriate foundational ontology and justifying why. In order to solve...

  13. Semi Automatic Ontology Instantiation in the domain of Risk Management

    Makki, Jawad; Alquier, Anne-Marie; Prince, Violaine

    One of the challenging tasks in the context of Ontological Engineering is to automatically or semi-automatically support the process of Ontology Learning and Ontology Population from semi-structured documents (texts). In this paper we describe a Semi-Automatic Ontology Instantiation method from natural language text in the domain of Risk Management. This method is composed of three steps: 1) annotation with part-of-speech tags, 2) semantic relation instance extraction, 3) ontology instantiation. It is based on combined NLP techniques with human intervention between steps 2 and 3 for control and validation. Since it relies heavily on linguistic knowledge, it is not domain dependent, which is a good feature for portability between the different fields of risk management application. The proposed methodology uses the ontology of the PRIMA project (supported by the European Community) as a Generic Domain Ontology and populates it via an available corpus. A first validation of the approach is done through an experiment with Chemical Fact Sheets from the Environmental Protection Agency.

  14. CONCEPTION OF ONTOLOGY-BASED SECTOR EDUCATIONAL SPACE

    V. I. Khabarov

    2014-09-01

    Purpose: The aim of the research is to demonstrate the need for the Conception of Ontology-based Sector Educational Space. This Conception could become the basis for the integration of transport sector university information resources into an open virtual network information resource and global educational space. Its content will be presented by standardized ontology-based knowledge packages for educational programs in Russian and English. Methodology: Complex-based, ontological and content-based approaches and the scientific principles of interdisciplinarity and standardization of knowledge are suggested as the methodological basis of the research. Results: The Conception of Ontology-based Sector Educational Space (railway transport), the method of developing knowledge packages as ontologies in Russian and English, and the Russian-English Transport Glossary as a separate ontology are among the expected results of the project implementation. Practical implications: The Conception could become the basis for an open project to establish a common resource center for transport universities (railway transport). The Conception of ontology-based sector educational space (railway transport) could be adapted to the activity of universities in other economic sectors.

  15. Design of Vulnerability Ontologies and Application of Logical Inference for Security Information and Events Management

    Olga Vitalievna Polubelova

    2013-02-01

    In this paper, the use of the ontological approach, description logics and logical inference for the design of a data model of computer vulnerabilities and attacks is suggested. The challenge of constructing ontological models is discussed, and an example of its solution for vulnerabilities is shown. To analyze the benefits of the ontological approach, we compare it with the relational representation of vulnerabilities and attacks.

  16. Concept mapping and network analysis: an analytic approach to measure ties among constructs.

    Goldman, Alyssa W; Kane, Mary

    2014-12-01

    Group concept mapping is a mixed-methods approach that helps a group visually represent its ideas on a topic of interest through a series of related maps. The maps and additional graphics are useful for planning, evaluation and theory development. Group concept maps are typically described, interpreted and utilized through points, clusters and distances, and the implications of these features in understanding how constructs relate to one another. This paper focuses on the application of network analysis to group concept mapping to quantify the strength and directionality of relationships among clusters. The authors outline the steps of this analysis, and illustrate its practical use through an organizational strategic planning example. Additional benefits of this analysis to evaluation projects are also discussed, supporting the overall utility of this supplemental technique to the standard concept mapping methodology. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Integrating Ontological Knowledge and Textual Evidence in Estimating Gene and Gene Product Similarity

    Sanfilippo, Antonio P.; Posse, Christian; Gopalan, Banu; Tratz, Stephen C.; Gregory, Michelle L.

    2006-06-08

    With the rising influence of the Gene Ontology, new approaches have emerged where the similarity between genes or gene products is obtained by comparing Gene Ontology code annotations associated with them. So far, these approaches have solely relied on the knowledge encoded in the Gene Ontology and the gene annotations associated with the Gene Ontology database. The goal of this paper is to demonstrate that improvements to these approaches can be obtained by integrating textual evidence extracted from relevant biomedical literature.

  18. Towards Self-managed Pervasive Middleware using OWL/SWRL ontologies

    Zhang, Weishan; Hansen, Klaus Marius

    2008-01-01

    Self-management for pervasive middleware is important to realize the Ambient Intelligence vision. In this paper, we present an OWL/SWRL context-ontologies-based self-management approach for pervasive middleware, where OWL ontologies are used as the means for context modeling. We demonstrate the OWL/SWRL context-ontologies-based self-management approach with self-diagnosis in the Hydra middleware, using a device state machine and other dynamic context information, for example web service calls. The evaluations in terms of extensibility, performance and scalability show that this approach is effective...

  19. The Ontology for Biomedical Investigations.

    Bandrowski, Anita; Brinkman, Ryan; Brochhausen, Mathias; Brush, Matthew H; Bug, Bill; Chibucos, Marcus C; Clancy, Kevin; Courtot, Mélanie; Derom, Dirk; Dumontier, Michel; Fan, Liju; Fostel, Jennifer; Fragoso, Gilberto; Gibson, Frank; Gonzalez-Beltran, Alejandra; Haendel, Melissa A; He, Yongqun; Heiskanen, Mervi; Hernandez-Boussard, Tina; Jensen, Mark; Lin, Yu; Lister, Allyson L; Lord, Phillip; Malone, James; Manduchi, Elisabetta; McGee, Monnie; Morrison, Norman; Overton, James A; Parkinson, Helen; Peters, Bjoern; Rocca-Serra, Philippe; Ruttenberg, Alan; Sansone, Susanna-Assunta; Scheuermann, Richard H; Schober, Daniel; Smith, Barry; Soldatova, Larisa N; Stoeckert, Christian J; Taylor, Chris F; Torniai, Carlo; Turner, Jessica A; Vita, Randi; Whetzel, Patricia L; Zheng, Jie

    2016-01-01

    The Ontology for Biomedical Investigations (OBI) is an ontology that provides terms with precisely defined meanings to describe all aspects of how investigations in the biological and medical domains are conducted. OBI re-uses ontologies that provide a representation of biomedical knowledge from the Open Biological and Biomedical Ontologies (OBO) project and adds the ability to describe how this knowledge was derived. We here describe the state of OBI and several applications that are using it, such as adding semantic expressivity to existing databases, building data entry forms, and enabling interoperability between knowledge resources. OBI covers all phases of the investigation process, such as planning, execution and reporting. It represents information and material entities that participate in these processes, as well as roles and functions. Prior to OBI, it was not possible to use a single internally consistent resource that could be applied to multiple types of experiments for these applications. OBI has made this possible by creating terms for entities involved in biological and medical investigations and by importing parts of other biomedical ontologies such as GO, Chemical Entities of Biological Interest (ChEBI) and Phenotype Attribute and Trait Ontology (PATO) without altering their meaning. OBI is being used in a wide range of projects covering genomics, multi-omics, immunology, and catalogs of services. OBI has also spawned other ontologies (Information Artifact Ontology) and methods for importing parts of ontologies (Minimum information to reference an external ontology term (MIREOT)). The OBI project is an open cross-disciplinary collaborative effort, encompassing multiple research communities from around the globe. To date, OBI has created 2366 classes and 40 relations along with textual and formal definitions. The OBI Consortium maintains a web resource (http://obi-ontology.org) providing details on the people, policies, and issues being addressed

  20. Mass spectrometry imaging enriches biomarker discovery approaches with candidate mapping.

    Scott, Alison J; Jones, Jace W; Orschell, Christie M; MacVittie, Thomas J; Kane, Maureen A; Ernst, Robert K

    2014-01-01

    Integral to the characterization of radiation-induced tissue damage is the identification of unique biomarkers. Biomarker discovery is a challenging and complex endeavor requiring both sophisticated experimental design and accessible technology. The resources within the National Institute of Allergy and Infectious Diseases (NIAID)-sponsored Consortium, Medical Countermeasures Against Radiological Threats (MCART), allow for leveraging robust animal models with novel molecular imaging techniques. One such imaging technique, MALDI (matrix-assisted laser desorption ionization) mass spectrometry imaging (MSI), allows for the direct spatial visualization of lipids, proteins, small molecules, and drugs/drug metabolites, or biomarkers, in an unbiased manner. MALDI-MSI acquires mass spectra directly from an intact tissue slice in discrete locations across an x, y grid that are then rendered into a spatial distribution map composed of ion mass and intensity. The unique mass signals can be plotted to generate a spatial map of biomarkers that reflects pathology and molecular events. The crucial unanswered questions that can be addressed with MALDI-MSI include identification of biomarkers for radiation damage that reflect the response to radiation dose over time and the efficacy of therapeutic interventions. Techniques in MALDI-MSI also enable integration of biomarker identification among diverse animal models. Analysis of early, sublethally irradiated tissue injury samples from diverse mouse tissues (lung and ileum) shows membrane phospholipid signatures correlated with histological features of these unique tissues. This paper will discuss the application of MALDI-MSI for use in a larger biomarker discovery pipeline.

  1. The Human Phenotype Ontology in 2017

    Köhler, Sebastian; Vasilevsky, Nicole A.; Engelstad, Mark; Foster, Erin; McMurry, Julie

    2016-01-01

    Deep phenotyping has been defined as the precise and comprehensive analysis of phenotypic abnormalities in which the individual components of the phenotype are observed and described. The three components of the Human Phenotype Ontology (HPO; www.human-phenotype-ontology.org) project are the phenotype vocabulary, disease-phenotype annotations and the algorithms that operate on these. These components are being used for computational deep phenotyping and precision medicine as well as integration of clinical data into translational research. The HPO is being increasingly adopted as a standard for phenotypic abnormalities by diverse groups such as international rare disease organizations, registries, clinical labs, biomedical resources, and clinical software tools and will thereby contribute toward nascent efforts at global data exchange for identifying disease etiologies. This update article reviews the progress of the HPO project since the debut Nucleic Acids Research database article in 2014, including specific areas of expansion such as common (complex) disease, new algorithms for phenotype driven genomic discovery and diagnostics, integration of cross-species mapping efforts with the Mammalian Phenotype Ontology, an improved quality control pipeline, and the addition of patient-friendly terminology.

  2. Information Pre-Processing using Domain Meta-Ontology and Rule Learning System

    Ranganathan, Girish R.; Biletskiy, Yevgen

    Around the globe, extraordinary amounts of documents are being created by Enterprises and by users outside these Enterprises. The documents created in the Enterprises constitute the main focus of the present chapter. These documents are used for a large amount of machine processing. When these documents are used for machine processing, a lack of semantics of the information in them may cause misinterpretation of the information, thereby inhibiting the productiveness of computer-assisted analytical work. Hence, it would be profitable to the Enterprises if they used well-defined domain ontologies which would serve as rich sources of semantics for the information in the documents. These domain ontologies can be created manually, semi-automatically or fully automatically. The focus of this chapter is to propose an intermediate solution which will enable relatively easy creation of these domain ontologies. The process of extracting and capturing domain ontologies from these voluminous documents requires extensive involvement of domain experts and the application of ontology learning methods that are substantially labor intensive; therefore, some intermediate solutions which would assist in capturing domain ontologies must be developed. This chapter proposes such a solution, which involves building a meta-ontology that serves as an intermediate information source for the main domain ontology and as a rapid approach to conceptualizing a domain of interest from a huge amount of source documents. This meta-ontology can be populated with ontological concepts, attributes and relations from documents, and then refined in order to form a better domain ontology, either through automatic ontology learning methods or some other relevant ontology building approach.

  3. SSDOnt: An Ontology for Representing Single-Subject Design Studies.

    Berges, Idoia; Bermúdez, Jesus; Illarramendi, Arantza

    2018-02-01

    Single-Subject Design is used in several areas such as education and biomedicine. However, no suitable formal vocabulary exists for annotating the detailed configuration and the results of this type of research study with the appropriate granularity for looking for information about them. Therefore, the search for those study designs relies heavily on a syntactical search of the abstract, keywords or full text of the publications about the study, which entails some limitations. To present SSDOnt, a specific-purpose ontology for describing and annotating single-subject design studies, so that complex questions can be asked about them afterwards. The ontology was developed following the NeOn methodology. Once the requirements of the ontology were defined, a formal model was described in a Description Logic and later implemented in the ontology language OWL 2 DL. We show how the ontology provides a reference model with a suitable terminology for the annotation and searching of single-subject design studies and their main components, such as the phases, the intervention types, the outcomes and the results. Some mappings with terms of related ontologies have been established. We show as proof-of-concept that classes in the ontology can be easily extended to annotate more precise information about specific interventions and outcomes such as those related to autism. Moreover, we provide examples of some types of queries that can be posed to the ontology. SSDOnt has achieved the purpose of covering the descriptions of the domain of single-subject research studies. Schattauer GmbH.
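    The kind of structured query the ontology is meant to enable can be sketched with rdflib and SPARQL. The namespace, class and property names below are hypothetical stand-ins and do not reproduce SSDOnt's actual IRIs.

    ```python
    from rdflib import Graph, Literal, Namespace, RDF

    # Hypothetical namespace and terms, only to illustrate annotation and querying.
    SSD = Namespace("http://example.org/ssdont#")
    g = Graph()

    study = SSD.Study1
    g.add((study, RDF.type, SSD.SingleSubjectDesignStudy))
    g.add((study, SSD.hasPhase, SSD.BaselinePhase))
    g.add((study, SSD.hasPhase, SSD.InterventionPhase))
    g.add((study, SSD.hasOutcome, Literal("challenging behaviour reduced")))

    # Find studies that include an intervention phase, together with their outcomes.
    query = """
    PREFIX ssd: <http://example.org/ssdont#>
    SELECT ?study ?outcome WHERE {
        ?study a ssd:SingleSubjectDesignStudy ;
               ssd:hasPhase ssd:InterventionPhase ;
               ssd:hasOutcome ?outcome .
    }
    """
    for row in g.query(query):
        print(row.study, row.outcome)
    ```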

  4. Development and application of an interaction network ontology for literature mining of vaccine-associated gene-gene interactions.

    Hur, Junguk; Özgür, Arzucan; Xiang, Zuoshuang; He, Yongqun

    2015-01-01

    Literature mining of gene-gene interactions has been enhanced by ontology-based name classifications. However, in biomedical literature mining, interaction keywords have not been carefully studied and used beyond a collection of keywords. In this study, we report the development of a new Interaction Network Ontology (INO) that classifies >800 interaction keywords and incorporates interaction terms from the PSI Molecular Interactions (PSI-MI) and Gene Ontology (GO). Using INO-based literature mining results, a modified Fisher's exact test was established to analyze significantly over- and under-represented enriched gene-gene interaction types within a specific area. Such a strategy was applied to study the vaccine-mediated gene-gene interactions using all PubMed abstracts. The Vaccine Ontology (VO) and INO were used to support the retrieval of vaccine terms and interaction keywords from the literature. INO is aligned with the Basic Formal Ontology (BFO) and imports terms from 10 other existing ontologies. Current INO includes 540 terms. In terms of interaction-related terms, INO imports and aligns PSI-MI and GO interaction terms and includes over 100 newly generated ontology terms with 'INO_' prefix. A new annotation property, 'has literature mining keywords', was generated to allow the listing of different keywords mapping to the interaction types in INO. Using all PubMed documents published as of 12/31/2013, approximately 266,000 vaccine-associated documents were identified, and a total of 6,116 gene-pairs were associated with at least one INO term. Out of 78 INO interaction terms associated with at least five gene-pairs of the vaccine-associated sub-network, 14 terms were significantly over-represented (i.e., more frequently used) and 17 under-represented based on our modified Fisher's exact test. These over-represented and under-represented terms share some common top-level terms but are distinct at the bottom levels of the INO hierarchy. The analysis of these
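    The over-/under-representation test can be sketched with a standard Fisher's exact test on a 2 x 2 contingency table, used here as a stand-in for the study's modified variant; all counts except the total of 6,116 vaccine-associated gene pairs are invented.

    ```python
    from scipy.stats import fisher_exact

    # Hypothetical counts for one INO interaction type: how often it annotates
    # gene pairs inside the vaccine-associated sub-network versus the background
    # of all mined gene pairs.
    in_subnetwork_with_term = 40
    in_subnetwork_without_term = 6076      # 6,116 vaccine gene pairs in total
    background_with_term = 900
    background_without_term = 199100

    table = [[in_subnetwork_with_term, in_subnetwork_without_term],
             [background_with_term, background_without_term]]

    # An odds ratio > 1 with a small p-value indicates over-representation.
    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    print(f"odds ratio={odds_ratio:.2f}, p={p_value:.2e}")
    ```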

  5. HuPSON: the human physiology simulation ontology.

    Gündel, Michaela; Younesi, Erfan; Malhotra, Ashutosh; Wang, Jiali; Li, Hui; Zhang, Bijun; de Bono, Bernard; Mevissen, Heinz-Theodor; Hofmann-Apitius, Martin

    2013-11-22

    Large biomedical simulation initiatives, such as the Virtual Physiological Human (VPH), are substantially dependent on controlled vocabularies to facilitate the exchange of information, of data and of models. Hindering these initiatives is a lack of a comprehensive ontology that covers the essential concepts of the simulation domain. We propose a first version of a newly constructed ontology, HuPSON, as a basis for shared semantics and interoperability of simulations, of models, of algorithms and of other resources in this domain. The ontology is based on the Basic Formal Ontology, and adheres to the MIREOT principles; the constructed ontology has been evaluated via structural features, competency questions and use case scenarios. The ontology is freely available at: http://www.scai.fraunhofer.de/en/business-research-areas/bioinformatics/downloads.html (owl files) and http://bishop.scai.fraunhofer.de/scaiview/ (browser). HuPSON provides a framework for a) annotating simulation experiments, b) retrieving relevant information that is required for modelling, c) enabling interoperability of algorithmic approaches used in biomedical simulation, d) comparing simulation results and e) linking knowledge-based approaches to simulation-based approaches. It is meant to foster a more rapid uptake of semantic technologies in the modelling and simulation domain, with particular focus on the VPH domain.

  6. Adding a little reality to building ontologies for biology.

    Phillip Lord

    Full Text Available BACKGROUND: Many areas of biology are open to mathematical and computational modelling. The application of discrete, logical formalisms defines the field of biomedical ontologies. Ontologies have been put to many uses in bioinformatics. The most widespread is for description of entities about which data have been collected, allowing integration and analysis across multiple resources. There are now over 60 ontologies in active use, increasingly developed as large, international collaborations. There are, however, many opinions on how ontologies should be authored; that is, what is appropriate for representation. Recently, a common opinion has been the "realist" approach that places restrictions upon the style of modelling considered to be appropriate. METHODOLOGY/PRINCIPAL FINDINGS: Here, we use a number of case studies for describing the results of biological experiments. We investigate the ways in which these could be represented using both realist and non-realist approaches; we consider the limitations and advantages of each of these models. CONCLUSIONS/SIGNIFICANCE: From our analysis, we conclude that while realist principles may enable straightforward modelling for some topics, there are crucial aspects of science and the phenomena it studies that do not fit into this approach; realism appears to be over-simplistic, which, perversely, results in overly complex ontological models. We suggest that it is impossible to avoid compromise in ontology modelling; a clearer understanding of these compromises will better enable appropriate modelling, fulfilling the many needs for discrete mathematical models within computational biology.

  7. The foundational ontology library ROMULUS

    Khan, ZC

    2013-09-01

    Full Text Available We present here a basic step in that direction with the Repository of Ontologies for MULtiple USes, ROMULUS, which is the first online library of machine-processable, modularised, aligned, and logic-based merged foundational ontologies. In addition...

  8. Tracking Changes during Ontology Evolution

    Noy, Natalya F.; Kunnatur, Sandhya; Klein, Michel; Musen, Mark A.

    2004-01-01

    As ontology development becomes a collaborative process, developers face the problem of maintaining versions of ontologies akin to maintaining versions of software code or versions of documents in large projects. Traditional versioning systems enable users to compare versions, examine changes, and

  9. Application of neuroanatomical ontologies for neuroimaging data annotation

    Jessica A Turner

    2010-06-01

    Full Text Available The annotation of functional neuroimaging results for data sharing and reuse is particularly challenging, due to the diversity of terminologies of neuroanatomical structures and cortical parcellation schemes. To address this challenge, we extended the Foundational Model of Anatomy Ontology (FMA) to include cytoarchitectural Brodmann area labels and a morphological cortical labeling scheme (e.g., the part of Brodmann area 6 in the left precentral gyrus). This representation was also used to augment the neuroanatomical axis of RadLex, the ontology for clinical imaging. The resulting neuroanatomical ontology contains explicit relationships indicating which brain regions are “part of” which other regions, across cytoarchitectural and morphological labeling schemas. We annotated a large functional neuroimaging dataset with terms from the ontology and applied a reasoning engine to analyze this dataset in conjunction with the ontology, achieving successful inferences from the most specific level (e.g., how many subjects showed activation in a sub-part of the middle frontal gyrus?) to more general levels (e.g., how many activations were found in areas connected via a known white matter tract?). In summary, we have produced a neuroanatomical ontology that harmonizes several different terminologies of neuroanatomical structures and cortical parcellation schemes. This neuroanatomical ontology is publicly available as a view of FMA at the Bioportal website at http://rest.bioontology.org/bioportal/ontologies/download/10005. The ontological encoding of anatomic knowledge can be exploited by computer reasoning engines to make inferences about neuroanatomical relationships described in imaging datasets using different terminologies. This approach could ultimately enable knowledge discovery from large, distributed fMRI studies or medical record mining.
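
    The kind of "part of" reasoning described here can be illustrated with a tiny sketch (not the FMA or the authors' reasoner): annotations made at a specific region are retrieved by a query against a more general region via the transitive part-of hierarchy. Region names below are illustrative only.

      # Toy part-of hierarchy and a transitive containment query.
      PART_OF = {  # child -> parent ("is part of")
          "BA6 in left precentral gyrus": "left precentral gyrus",
          "left precentral gyrus": "left frontal lobe",
          "left middle frontal gyrus": "left frontal lobe",
      }

      def within(region, target):
          """True if `region` equals `target` or is transitively part of it."""
          while region is not None:
              if region == target:
                  return True
              region = PART_OF.get(region)
          return False

      annotations = {"subj01": "BA6 in left precentral gyrus",
                     "subj02": "left middle frontal gyrus"}
      hits = [s for s, r in annotations.items() if within(r, "left frontal lobe")]
      print(hits)  # both annotations fall inside the left frontal lobe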

  10. Ontology-Based e-Assessment for Accounting Education

    Litherland, Kate; Carmichael, Patrick; Martínez-García, Agustina

    2013-01-01

    This summary reports on a pilot of a novel, ontology-based e-assessment system in accounting. The system, OeLe, uses emerging semantic technologies to offer an online assessment environment capable of marking students' free text answers to questions of a conceptual nature. It does this by matching their response with a "concept map" or…

  11. An ontology of and roadmap for mHealth research.

    Cameron, Joshua D; Ramaprasad, Arkalgud; Syn, Thant

    2017-04-01

    Mobile health or mHealth research has been growing exponentially in recent years. However, the research on mHealth has been ad-hoc and selective, without a clear definition of the mHealth domain. Without a roadmap for research, we may not realize the full potential of mHealth. In this paper, we present an ontological framework to define the mHealth domain and illuminate a roadmap. We present an ontology of mHealth. The ontology is developed by systematically deconstructing the domain into its primary dimensions and elements. We map the extant research on mHealth in 2014 onto the ontology and highlight the bright, light, and blind/blank spots, which represent the emphases of mHealth research. The emphases of mHealth research in 2014 are very uneven. There are a few bright spots and many light spots. The research predominantly focuses on individuals' use of mobile devices and applications to capture or obtain health-related data, mostly to improve quality of care through mobile intervention. We argue that the emphases can be balanced in the roadmap for mHealth research. The ontological mapping plays an integral role in developing and maintaining the roadmap, which can be updated periodically to continuously assess and guide mHealth research. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Defining functional distances over Gene Ontology

    del Pozo Angela

    2008-01-01

    Full Text Available Abstract Background A fundamental problem when trying to define the functional relationships between proteins is the difficulty in quantifying functional similarities, even when well-structured ontologies exist regarding the activity of proteins (i.e., the Gene Ontology, GO). However, functional metrics can overcome the problems of comparing and evaluating functional assignments and predictions. As a reference of proximity, previous approaches to comparing GO terms considered linkage in the ontology, weighted by a probability distribution that balances the non-uniform 'richness' of different parts of the Directed Acyclic Graph. Here, we have followed a different approach to quantify functional similarities between GO terms. Results We propose a new method to derive 'functional distances' between GO terms that is based on the simultaneous occurrence of terms in the same set of Interpro entries, instead of relying on the structure of the GO. The coincidence of GO terms reveals natural biological links between the GO functions and defines a distance model Df which fulfils the properties of a metric space. The distances obtained in this way can be represented as a hierarchical 'Functional Tree'. Conclusion The method proposed provides a new definition of distance that enables the similarity between GO terms to be quantified. Additionally, the 'Functional Tree' defines groups with biological meaning, enhancing its utility for protein function comparison and prediction. Finally, this approach could be used for function-based protein searches in databases, and for analysing the gene clusters produced by DNA array experiments.
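
    The flavour of a co-occurrence-based distance can be shown with a short sketch: a Jaccard distance over the sets of Interpro entries each GO term appears in. This is illustrative only; the paper's Df is defined by the authors and need not coincide with it, and the annotation data below are invented.

      # Co-occurrence-based distance between GO terms (toy data).
      from itertools import combinations

      go_to_interpro = {
          "GO:0003824": {"IPR000001", "IPR000002", "IPR000003"},
          "GO:0016787": {"IPR000002", "IPR000003", "IPR000004"},
          "GO:0005515": {"IPR000005"},
      }

      def cooccurrence_distance(a, b):
          ea, eb = go_to_interpro[a], go_to_interpro[b]
          return 1.0 - len(ea & eb) / len(ea | eb)   # Jaccard distance in [0, 1]

      for a, b in combinations(go_to_interpro, 2):
          print(a, b, round(cooccurrence_distance(a, b), 2))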

  13. Proposed actions are no actions: re-modeling an ontology design pattern with a realist top-level ontology.

    Seddig-Raufie, Djamila; Jansen, Ludger; Schober, Daniel; Boeker, Martin; Grewe, Niels; Schulz, Stefan

    2012-09-21

    Ontology Design Patterns (ODPs) are representational artifacts devised to offer solutions for recurring ontology design problems. They promise to enhance the ontology building process in terms of flexibility, re-usability and expansion, and to make the result of ontology engineering more predictable. In this paper, we analyze ODP repositories and investigate their relation with upper-level ontologies. In particular, we compare the BioTop upper ontology to the Action ODP from the NeOn ODP repository. In view of the differences in the respective approaches, we investigate whether the Action ODP can be embedded into BioTop. We demonstrate that this requires re-interpreting the meaning of classes of the NeOn Action ODP in the light of the precepts of realist ontologies. As a result, the re-design required clarifying the ontological commitment of the ODP classes by assigning them to top-level categories. Thus, ambiguous definitions are avoided. Classes of real entities are clearly distinguished from classes of information artifacts. The proposed approach avoids the commitment to the existence of unclear future entities which underlies the NeOn Action ODP. Our re-design is parsimonious in the sense that existing BioTop content proved to be largely sufficient to define the different types of actions and plans. The proposed model demonstrates that an expressive upper-level ontology provides enough resources and expressivity to represent even complex ODPs, here shown with the different flavors of Action as proposed in the NeOn ODP. The advantage of ODP inclusion into a top-level ontology is the given predetermined dependency of each class, an existing backbone structure and well-defined relations. Our comparison shows that the use of some ODPs is more likely to cause problems for ontology developers, rather than to guide them. Besides the structural properties, the explanation of classification results was particularly hard to grasp for 'self-sufficient' ODPs as

  14. Conjecture Mapping: An Approach to Systematic Educational Design Research

    Sandoval, William

    2014-01-01

    Design research is strongly associated with the learning sciences community, and in the 2 decades since its conception it has become broadly accepted. Yet within and without the learning sciences there remains confusion about how to do design research, with most scholarship on the approach describing what it is rather than how to do it. This…

  15. Mind Map Marketing: A Creative Approach in Developing Marketing Skills

    Eriksson, Lars Torsten; Hauer, Amie M.

    2004-01-01

    In this conceptual article, the authors describe an alternative course structure that joins learning key marketing concepts to creative problem solving. The authors describe an approach using a convergent-divergent-convergent (CDC) process: key concepts are first derived from case material to be organized in a marketing matrix, which is then used…

  16. Applied Ontology Engineering in Cloud Services, Networks and Management Systems

    Serrano Orozco, J Martín

    2012-01-01

    Metadata standards in today’s ICT sector are proliferating at unprecedented levels, while automated information management systems collect and process exponentially increasing quantities of data. With interoperability and knowledge exchange identified as a core challenge in the sector, this book examines the role ontology engineering can play in providing solutions to the problems of information interoperability and linked data. At the same time as introducing basic concepts of ontology engineering, the book discusses methodological approaches to formal representation of data and information models, thus facilitating information interoperability between heterogeneous, complex and distributed communication systems. In doing so, the text advocates the advantages of using ontology engineering in telecommunications systems. In addition, it offers a wealth of guidance and best-practice techniques for instances in which ontology engineering is applied in cloud services, computer networks and management systems. ...

  17. Ontological Issues and the Possible Development of Cultural Psychology.

    Pérez-Campos, Gilberto

    2017-12-01

    Ontological issues have a bad reputation within mainstream psychology. This paper, however, is an attempt to argue that ontological reflection may play an important role in the development of cultural psychology. A cross-reading of two recent papers on the subject (Mammen & Mironenko, Integrative Psychological and Behavioral Science, 49(4), 681-713, 2015; Simão, Integrative Psychological and Behavioral Science, 50, 568-585, 2016), aimed at characterizing their respective approaches to ontological issues, sets the stage for a presentation of Cornelius Castoriadis' ontological reflections. On this basis, a dialogue is initiated with E.E. Boesch's Symbolic Activity Theory that could contribute to a more refined understanding of human psychological functioning in its full complexity.

  18. Semantator: annotating clinical narratives with semantic web ontologies.

    Song, Dezhao; Chute, Christopher G; Tao, Cui

    2012-01-01

    To facilitate clinical research, clinical data needs to be stored in a machine-processable and understandable way. Manually annotating clinical data is time-consuming. Automatic approaches (e.g., Natural Language Processing systems) have been adopted to convert such data into structured formats; however, the quality of such automatically extracted data may not always be satisfying. In this paper, we propose Semantator, a semi-automatic tool for document annotation with Semantic Web ontologies. With a loaded free-text document and an ontology, Semantator supports the creation/deletion of ontology instances for any document fragment, linking/disconnecting instances with the properties in the ontology, and also enables automatic annotation by connecting to the NCBO annotator and cTAKES. By representing annotations in Semantic Web standards, Semantator supports reasoning based upon the underlying semantics of the owl:disjointWith and owl:equivalentClass predicates. We present discussions based on user experiences of using Semantator.
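
    The underlying idea of representing an annotation as an ontology instance can be sketched with rdflib; the namespace, class and property names below are hypothetical stand-ins, not Semantator's actual schema.

      # Represent an annotation of a text span as an RDF instance typed by an
      # ontology class (illustrative vocabulary only).
      from rdflib import Graph, Namespace, Literal, RDF, XSD

      EX = Namespace("http://example.org/clinical#")
      g = Graph()
      g.bind("ex", EX)

      ann = EX["annotation_001"]
      g.add((ann, RDF.type, EX.Diagnosis))                      # instance of an ontology class
      g.add((ann, EX.coversText, Literal("type 2 diabetes")))   # annotated fragment
      g.add((ann, EX.charStart, Literal(128, datatype=XSD.integer)))
      g.add((ann, EX.charEnd, Literal(143, datatype=XSD.integer)))

      print(g.serialize(format="turtle"))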

  19. Exploring teacher's perceptions of concept mapping as a teaching strategy in science: An action research approach

    Marks Krpan, Catherine Anne

    In order to promote science literacy in the classroom, students need opportunities in which they can personalize their understanding of the concepts they are learning. Current literature supports the use of concept maps in enabling students to make personal connections in their learning of science. Because they involve creating explicit connections between concepts, concept maps can assist students in developing metacognitive strategies and assist educators in identifying misconceptions in students' thinking. The literature also notes that concept maps can improve student achievement and recall. Much of the current literature focuses primarily on concept mapping at the secondary and university levels, with limited focus on the elementary panel. The research rarely considers teachers' thoughts and ideas about the concept mapping process. In order to effectively explore concept mapping from the perspective of elementary teachers, I felt that an action research approach would be appropriate. Action research enabled educators to debate issues about concept mapping and test out ideas in their classrooms. It also afforded the participants opportunities to explore their own thinking, reflect on their personal journeys as educators and play an active role in their professional development. In an effort to explore concept mapping from the perspective of elementary educators, an action research group of 5 educators and myself was established and met regularly from September 1999 until June 2000. All of the educators taught in the Toronto area. These teachers were interested in exploring how concept mapping could be used as a learning tool in their science classrooms. In summary, this study explores the journey of five educators and myself as we engaged in collaborative action research. This study sets out to: (1) Explore how educators believe concept mapping can facilitate teaching and student learning in the science classroom. (2) Explore how educators implement concept

  20. CLASSIFICATION ALGORITHMS FOR BIG DATA ANALYSIS, A MAP REDUCE APPROACH

    V. A. Ayma

    2015-03-01

    Full Text Available For many years, the scientific community has been concerned with how to increase the accuracy of different classification methods, and major achievements have been made so far. Besides this issue, the increasing amount of data that is being generated every day by remote sensors raises more challenges to be overcome. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), which is an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA’s machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.
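
    The map/reduce pattern the tool applies can be sketched in a few lines; this is not the ICP/Hadoop/WEKA implementation, just the shape of it: a trained classifier is broadcast to workers, each worker maps a chunk of samples to labels, and the reduce step concatenates the results.

      import numpy as np
      from multiprocessing import Pool
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X_train, y_train = rng.normal(size=(200, 4)), rng.integers(0, 2, 200)
      clf = SVC(kernel="rbf").fit(X_train, y_train)          # train once ("driver")

      X_big = rng.normal(size=(10_000, 4))                   # data to classify
      chunks = np.array_split(X_big, 8)                      # input splits

      def map_classify(chunk):                               # map task
          return clf.predict(chunk)

      if __name__ == "__main__":
          with Pool(4) as pool:
              parts = pool.map(map_classify, chunks)
          labels = np.concatenate(parts)                     # reduce: merge results
          print(labels.shape, np.bincount(labels))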

  1. A National Approach to Quantify and Map Biodiversity ...

    Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with economics to help understand the effects of human policies and actions and their subsequent impacts on both ecosystem function and human welfare. The degradation of natural ecosystems and climate variation impact the environment and society by affecting ecological integrity and ecosystems’ capacity to provide critical services (i.e., the contributions of ecosystems to human well-being). These challenges will require complex management decisions that can often involve significant trade-offs between societal desires and environmental needs. Evaluating trade-offs in terms of ecosystem services and human well-being provides an intuitive and comprehensive way to assess the broad implications of our decisions and to help shape policies that enhance environmental and social sustainability. In answer to this challenge, the U.S. government has created a partnership among the U.S. Environmental Protection Agency, other Federal agencies, academic institutions, and non-governmental organizations to develop the EnviroAtlas, an online Decision Support Tool that allows users (e.g., planners, policy-makers, resource managers, NGOs, private indu

  2. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    Dhindsa, Harkirat S.; Makarimi-Kasim; Anderson, O. Roger

    2011-01-01

    This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and of a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent that a constructivist learning environment (CLE) was created in their classes. The sample…

  3. Ontology modularization to improve semantic medical image annotation.

    Wennerberg, Pinar; Schulz, Klaus; Buitelaar, Paul

    2011-02-01

    Searching for medical images and patient reports is a significant challenge in a clinical setting. The contents of such documents are often not described in sufficient detail, thus making it difficult to utilize the inherent wealth of information contained within them. Semantic image annotation addresses this problem by describing the contents of images and reports using medical ontologies. Medical images and patient reports are then linked to each other through common annotations. Subsequently, search algorithms can more effectively find related sets of documents on the basis of these semantic descriptions. A prerequisite to realizing such a semantic search engine is that the data contained within should have been previously annotated with concepts from medical ontologies. One major challenge in this regard is the size and complexity of medical ontologies as annotation sources. Manual annotation is particularly time-consuming and labor-intensive in a clinical environment. In this article we propose an approach to reducing the size of clinical ontologies for more efficient manual image and text annotation. More precisely, our goal is to identify smaller fragments of a large anatomy ontology that are relevant for annotating medical images from patients suffering from lymphoma. Our work is in the area of ontology modularization, which is a recent and active field of research. We describe our approach, methods and data set in detail and we discuss our results. Copyright © 2010 Elsevier Inc. All rights reserved.

  4. The Cell Ontology 2016: enhanced content, modularization, and ontology interoperability.

    Diehl, Alexander D; Meehan, Terrence F; Bradford, Yvonne M; Brush, Matthew H; Dahdul, Wasila M; Dougall, David S; He, Yongqun; Osumi-Sutherland, David; Ruttenberg, Alan; Sarntivijai, Sirarat; Van Slyke, Ceri E; Vasilevsky, Nicole A; Haendel, Melissa A; Blake, Judith A; Mungall, Christopher J

    2016-07-04

    The Cell Ontology (CL) is an OBO Foundry candidate ontology covering the domain of canonical, natural biological cell types. Since its inception in 2005, the CL has undergone multiple rounds of revision and expansion, most notably in its representation of hematopoietic cells. For in vivo cells, the CL focuses on vertebrates but provides general classes that can be used for other metazoans, which can be subtyped in species-specific ontologies. Recent work on the CL has focused on extending the representation of various cell types, and developing new modules in the CL itself, and in related ontologies in coordination with the CL. For example, the Kidney and Urinary Pathway Ontology was used as a template to populate the CL with additional cell types. In addition, subtypes of the class 'cell in vitro' have received improved definitions and labels to provide for modularity with the representation of cells in the Cell Line Ontology and Reagent Ontology. Recent changes in the ontology development methodology for CL include a switch from OBO to OWL for the primary encoding of the ontology, and an increasing reliance on logical definitions for improved reasoning. The CL is now mandated as a metadata standard for large functional genomics and transcriptomics projects, and is used extensively for annotation, querying, and analyses of cell type specific data in sequencing consortia such as FANTOM5 and ENCODE, as well as for the NIAID ImmPort database and the Cell Image Library. The CL is also a vital component used in the modular construction of other biomedical ontologies-for example, the Gene Ontology and the cross-species anatomy ontology, Uberon, use CL to support the consistent representation of cell types across different levels of anatomical granularity, such as tissues and organs. The ongoing improvements to the CL make it a valuable resource to both the OBO Foundry community and the wider scientific community, and we continue to experience increased interest in the

  5. Mapping between Classical Risk Management and Game Theoretical Approaches

    Rajbhandari , Lisa; Snekkenes , Einar ,

    2011-01-01

    Part 2: Work in Progress; International audience; In a typical classical risk assessment approach, the probabilities are usually guessed and not much guidance is provided on how to get the probabilities right. When coming up with probabilities, people are generally not well calibrated. History may not always be a very good teacher. Hence, in this paper, we explain how game theory can be integrated into classical risk management. Game theory puts emphasis on collecting representative data on h...

  6. Automatic Generation of Tests from Domain and Multimedia Ontologies

    Papasalouros, Andreas; Kotis, Konstantinos; Kanaris, Konstantinos

    2011-01-01

    The aim of this article is to present an approach for generating tests in an automatic way. Although other methods have been already reported in the literature, the proposed approach is based on ontologies, representing both domain and multimedia knowledge. The article also reports on a prototype implementation of this approach, which…

  7. Logic and Ontology

    Newton C. A. da Costa

    2002-12-01

    Full Text Available In view of the present state of development of non-classical logic, especially of paraconsistent logic, a new stand regarding the relations between logic and ontology is defended. In a parody of a dictum of Quine, my stand may be summarized as follows: to be is to be the value of a variable of a specific language with a given underlying logic. Yet my stand differs from Quine's because, among other reasons, I accept some first-order heterodox logics as genuine alternatives to classical logic. I also discuss some questions of non-classical logic to substantiate my argument, and suggest that my position complements and extends some ideas advanced by L. Apostel.

  8. Effective information flow through efficient supply chain management - Value stream mapping approach Case Outokumpu Tornio Works

    Juvonen, Piia

    2012-01-01

    ABSTRACT Juvonen, Piia Suvi Päivikki 2012. Effective information flow through efficient supply chain management - Value stream mapping approach - Case Outokumpu Tornio Works. Master's Thesis. Kemi-Tornio University of Applied Sciences. Business and Culture. Pages 63. Appendices 2. The general aim of this thesis is to explore effective information flow through efficient supply chain management by following one of the lean management principles, value stream mapping. The specific research...

  9. Gene Ontology Consortium: going forward.

    2015-01-01

    The Gene Ontology (GO; http://www.geneontology.org) is a community-based bioinformatics resource that supplies information about gene product function using ontologies to represent biological knowledge. Here we describe improvements and expansions to several branches of the ontology, as well as updates that have allowed us to more efficiently disseminate the GO and capture feedback from the research community. The Gene Ontology Consortium (GOC) has expanded areas of the ontology such as cilia-related terms, cell-cycle terms and multicellular organism processes. We have also implemented new tools for generating ontology terms based on a set of logical rules making use of templates, and we have made efforts to increase our use of logical definitions. The GOC has a new and improved web site summarizing new developments and documentation, serving as a portal to GO data. Users can perform GO enrichment analysis, and search the GO for terms, annotations to gene products, and associated metadata across multiple species using the all-new AmiGO 2 browser. We encourage and welcome the input of the research community in all biological areas in our continued effort to improve the Gene Ontology. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
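
    The GO enrichment analysis mentioned here typically rests on a hypergeometric (one-sided Fisher) test; the short sketch below shows that statistic with illustrative counts and is not the GOC's own implementation.

      from scipy.stats import hypergeom

      N = 18000   # background: annotated genes in the genome
      K = 250     # background genes annotated with the GO term of interest
      n = 300     # genes in the study set (e.g., differentially expressed)
      k = 15      # study genes carrying the term

      # P(X >= k): probability of seeing at least k annotated genes by chance
      p_enrich = hypergeom.sf(k - 1, N, K, n)
      print(f"enrichment p-value = {p_enrich:.3g}")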

  10. Analytic mappings: a new approach in particle production by accelerated observers

    Sanchez, N.

    1982-01-01

    This is a summary of the author's recent results about the physical consequences of analytic mappings in space-time. Classically, the mapping defines an accelerated frame. At the quantum level it gives rise to particle production. Statistically, the real singularities of the mapping have associated temperatures. This concerns a new approach in Q.F.T. as formulated in accelerated frames. It has been considered as a first step in the understanding of the deep connection that could exist between the structure (geometry and topology) of space-time and thermodynamics, mainly motivated by the works of Hawking since 1975. (Auth.)

  11. A Self-Adaptive Evolutionary Approach to the Evolution of Aesthetic Maps for a RTS Game

    Lara-Cabrera, Raúl; Cotta, Carlos; Fernández-Leiva, Antonio J.

    2014-01-01

    Procedural content generation (PCG) is a research field on the rise, with numerous papers devoted to this topic. This paper presents a PCG method based on a self-adaptive evolution strategy for the automatic generation of maps for the real-time strategy (RTS) game PlanetWars. These maps are generated in order to fulfill the aesthetic preferences of the user, as implied by her assessment of a collection of maps used as a training set. A topological approach is used for the characterization of th...
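
    The general mechanism of a self-adaptive evolution strategy can be sketched as follows; the "map" genome and the aesthetic fitness below are placeholders, not the PlanetWars representation or the authors' measure.

      # Toy (1, lambda) evolution strategy with log-normal self-adaptation of
      # the mutation step size (each individual carries its own sigma).
      import numpy as np

      rng = np.random.default_rng(1)
      dim, lam, tau = 10, 20, 1.0 / np.sqrt(10)

      def aesthetic_fitness(genome):            # placeholder objective
          return -np.sum((genome - 0.5) ** 2)   # prefer "balanced" parameter values

      x, sigma = rng.random(dim), 0.3
      for gen in range(100):
          sigmas = sigma * np.exp(tau * rng.normal(size=lam))          # self-adaptation
          offspring = x + sigmas[:, None] * rng.normal(size=(lam, dim))
          fitness = np.array([aesthetic_fitness(o) for o in offspring])
          best = int(np.argmax(fitness))
          x, sigma = offspring[best], sigmas[best]                     # comma selection
      print("best fitness:", aesthetic_fitness(x))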

  12. Rapid Construction of Fe-Co-Ni Composition-Phase Map by Combinatorial Materials Chip Approach.

    Xing, Hui; Zhao, Bingbing; Wang, Yujie; Zhang, Xiaoyi; Ren, Yang; Yan, Ningning; Gao, Tieren; Li, Jindong; Zhang, Lanting; Wang, Hong

    2018-03-12

    One hundred nanometer thick Fe-Co-Ni material chips were prepared and isothermally annealed at 500, 600, and 700 °C, respectively. Pixel-by-pixel composition and structural mapping was performed by microbeam X-ray diffraction at a synchrotron light source. Diffraction images were recorded at a rate of 1 pattern/s. The XRD patterns were automatically processed, phase-identified, and categorized by a hierarchical clustering algorithm to construct the composition-phase map. The resulting maps are consistent with the corresponding isothermal sections reported in the ASM Alloy Phase Diagram Database, verifying the effectiveness of the present approach for phase diagram construction.
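
    The clustering step alone can be illustrated with SciPy's hierarchical clustering; the synthetic Gaussian-peak patterns below stand in for the synchrotron data, and this is a sketch of the general technique rather than the authors' pipeline.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(42)
      two_theta = np.linspace(20, 80, 600)

      def pattern(peaks):  # toy XRD pattern: Gaussian peaks plus noise
          y = sum(np.exp(-0.5 * ((two_theta - p) / 0.3) ** 2) for p in peaks)
          return y + 0.02 * rng.random(two_theta.size)

      # 30 "measurement points", drawn from two hypothetical phases
      patterns = np.array([pattern([31, 44, 65]) for _ in range(15)] +
                          [pattern([35, 50, 74]) for _ in range(15)])

      dist = pdist(patterns, metric="cosine")          # pairwise pattern distances
      tree = linkage(dist, method="average")           # agglomerative clustering
      labels = fcluster(tree, t=2, criterion="maxclust")
      print(labels)                                    # phase label per measurement point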

  13. A taxonomy of behaviour change methods: an Intervention Mapping approach.

    Kok, Gerjo; Gottlieb, Nell H; Peters, Gjalt-Jorn Y; Mullen, Patricia Dolan; Parcel, Guy S; Ruiter, Robert A C; Fernández, María E; Markham, Christine; Bartholomew, L Kay

    2016-09-01

    In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters for effectiveness of methods, and explicate the related distinction between theory-based methods and practical applications and the probability that poor translation of methods may lead to erroneous conclusions as to method-effectiveness. Third, we recommend a minimal set of intervention characteristics that may be reported when intervention descriptions and evaluations are published. Specifying these characteristics can greatly enhance the quality of our meta-analyses and other literature syntheses. In conclusion, the dynamics of behaviour change are such that any taxonomy of methods of behaviour change needs to acknowledge the importance of, and provide instruments for dealing with, three conditions for effectiveness for behaviour change methods. For a behaviour change method to be effective: (1) it must target a determinant that predicts behaviour; (2) it must be able to change that determinant; (3) it must be translated into a practical application in a way that preserves the parameters for effectiveness and fits with the target population, culture, and context. Thus, taxonomies of methods of behaviour change must distinguish the specific determinants that are targeted, practical, specific applications, and the theory-based methods they embody. In addition, taxonomies should acknowledge that the lists of behaviour change methods will be used by, and should be used by, intervention developers. Ideally, the taxonomy should be readily usable for this goal; but alternatively, it should be

  14. A New role of ontologies and advanced scientific visualization in big data analytics

    Chuprina, Svetlana

    2016-01-01

    Uniform access to, and contextual semantic search over, structured, semi-structured and unstructured information resources, together with their ontology-based analysis across text-free Big Data query implementations, is a main feature of the approach under discussion. To increase the semantic power of query-result analysis, an ontology-based implementation of multiplatform adaptive tools for scientific visualization is demonstrated. The ontologies are used not for integration of heterogeneous resources...

  15. Witnessing stressful events induces glutamatergic synapse pathway alterations and gene set enrichment of positive EPSP regulation within the VTA of adult mice: An ontology based approach

    Brewer, Jacob S.

    It is well known that exposure to severe stress increases the risk for developing mood disorders. Currently, the neurobiological and genetic mechanisms underlying the functional effects of psychological stress are poorly understood. A major obstacle to the study of psychological stress is the inability of current animal models of stress to distinguish between physical and psychological stressors. A novel paradigm recently developed by Warren et al. is able to tease apart the effects of physical and psychological stress in adult mice by allowing these mice to "witness" the social defeat of another mouse, thus removing confounding variables associated with physical stressors. Using this 'witness' model of stress and RNA-Seq technology, the current study aims to study the genetic effects of psychological stress. After witnessing the social defeat of another mouse, VTA tissue was extracted, sequenced, and analyzed for differential expression. Since genes often work together in complex networks, a pathway and gene ontology (GO) analysis was performed using data from the differential expression analysis. The pathway and GO analyses revealed a perturbation of the glutamatergic synapse pathway and an enrichment of positive excitatory post-synaptic potential regulation. This is consistent with the excitatory synapse theory of depression. Together these findings demonstrate a dysregulation of the mesolimbic reward pathway at the gene level as a result of psychological stress, potentially contributing to depressive-like behaviors.

  16. PDON: Parkinson's disease ontology for representation and modeling of the Parkinson's disease knowledge domain.

    Younesi, Erfan; Malhotra, Ashutosh; Gündel, Michaela; Scordis, Phil; Kodamullil, Alpha Tom; Page, Matt; Müller, Bernd; Springstubbe, Stephan; Wüllner, Ullrich; Scheller, Dieter; Hofmann-Apitius, Martin

    2015-09-22

    Despite the unprecedented and increasing amount of data, relatively little progress has been made in molecular characterization of mechanisms underlying Parkinson's disease. In the area of Parkinson's research, there is a pressing need to integrate various pieces of information into a meaningful context of presumed disease mechanism(s). Disease ontologies provide a novel means for organizing, integrating, and standardizing the knowledge domains specific to disease in a compact, formalized and computer-readable form and serve as a reference for knowledge exchange or systems modeling of disease mechanism. The Parkinson's disease ontology was built according to the life cycle of ontology building. Structural, functional, and expert evaluation of the ontology was performed to ensure the quality and usability of the ontology. A novelty metric has been introduced to measure the gain of new knowledge using the ontology. Finally, a cause-and-effect model was built around PINK1 and two gene expression studies from the Gene Expression Omnibus database were re-annotated to demonstrate the usability of the ontology. The Parkinson's disease ontology with a subclass-based taxonomic hierarchy covers the broad spectrum of major biomedical concepts from molecular to clinical features of the disease, and also reflects different views on disease features held by molecular biologists, clinicians and drug developers. The current version of the ontology contains 632 concepts, which are organized under nine views. The structural evaluation showed the balanced dispersion of concept classes throughout the ontology. The functional evaluation demonstrated that the ontology-driven literature search could gain novel knowledge not present in the reference Parkinson's knowledge map. The ontology was able to answer specific questions related to Parkinson's when evaluated by experts. Finally, the added value of the Parkinson's disease ontology is demonstrated by ontology-driven modeling of PINK1

  17. There is no quantum ontology without classical ontology

    Fink, Helmut [Institut fuer Theoretische Physik, Univ. Erlangen-Nuernberg (Germany)

    2011-07-01

    The relation between quantum physics and classical physics is still under debate. In his recent book "Rational Reconstructions of Modern Physics", Peter Mittelstaedt explores a route from classical to quantum mechanics by reduction and elimination of (some of) the ontological hypotheses underlying classical mechanics. While, according to Mittelstaedt, classical mechanics describes a fictitious world that does not exist in reality, he claims to achieve a universal quantum ontology that can be improved by incorporating unsharp properties and equipped with Planck's constant without any need to refer to classical concepts. In this talk, we argue that quantum ontology in Mittelstaedt's sense is not enough. Quantum ontology can never be universal as long as the difference between potential and real properties is not represented adequately. Quantum properties are potential, not (yet) real, be they sharp or unsharp. Hence, preparation and measurement presuppose classical concepts, even in quantum theory. We end up with a classical-quantum sandwich ontology, which is still less extravagant than Bohmian or many-worlds ontologies are.

  18. In search of a primitive ontology for relativistic quantum field theory

    Lam, Vincent [University of Lausanne, CH-1015 Lausanne (Switzerland)

    2014-07-01

    There is a recently much discussed approach to the ontology of quantum mechanics according to which the theory is ultimately about entities in 3-dimensional space and their temporal evolution. Such an ontology postulating from the start matter localized in usual physical space or spacetime, by contrast to an abstract high-dimensional space such as the configuration space of wave function realism, is called primitive ontology in the recent literature on the topic and finds its roots in Bell's notion of local beables. The main motivation for a primitive ontology lies in its explanatory power: the primitive ontology allows for a direct account of the behaviour and properties of familiar macroscopic objects. In this context, it is natural to look for a primitive ontology for relativistic quantum field theory (RQFT). The aim of this talk is to critically discuss this interpretative move within RQFT, in particular with respect to the foundational issue of the existence of unitarily inequivalent representations. Indeed the proposed primitive ontologies for RQFT rely either on a Fock space representation or a wave functional representation, which are strictly speaking only unambiguously available for free systems in flat spacetime. As a consequence, it is argued that these primitive ontologies constitute only effective ontologies and are hardly satisfying as a fundamental ontology for RQFT.

  19. Benchmarking the Applicability of Ontology in Geographic Object-Based Image Analysis

    Sachit Rajbhandari

    2017-11-01

    Full Text Available In Geographic Object-based Image Analysis (GEOBIA), identification of image objects is normally achieved using rule-based classification techniques supported by appropriate domain knowledge. However, GEOBIA currently lacks a systematic method to formalise the domain knowledge required for image object identification. Ontology provides a representation vocabulary for characterising domain-specific classes. This study proposes an ontological framework that conceptualises domain knowledge in order to support the application of rule-based classifications. The proposed ontological framework is tested with a landslide case study. The Web Ontology Language (OWL) is used to construct an ontology in the landslide domain. The segmented image objects with extracted features are incorporated into the ontology as instances. The classification rules are written in the Semantic Web Rule Language (SWRL) and executed using a semantic reasoner to assign instances to appropriate landslide classes. Machine learning techniques are used to predict new threshold values for feature attributes in the rules. Our framework is compared with published work on landslide detection where ontology was not used for the image classification. Our results demonstrate that a classification derived from the ontological framework accords with non-ontological methods. This study benchmarks the ontological method, providing an alternative approach for image classification in the case study of landslides.
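
    The shape of such a threshold rule can be sketched in plain Python (not SWRL/OWL, and with hypothetical feature names and thresholds): an image object with steep slope, low NDVI and a large brightness change is classified as a candidate landslide.

      def classify_object(obj):
          # Stand-in for one rule of the kind the framework encodes in SWRL.
          if (obj["slope_deg"] > 15.0 and
                  obj["ndvi"] < 0.2 and
                  obj["brightness_change"] > 0.3):
              return "Landslide"
          return "NotLandslide"

      segments = [
          {"id": 1, "slope_deg": 22.0, "ndvi": 0.10, "brightness_change": 0.45},
          {"id": 2, "slope_deg": 5.0,  "ndvi": 0.55, "brightness_change": 0.05},
      ]
      for seg in segments:
          print(seg["id"], classify_object(seg))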

  20. Streamlined approach to mapping the magnetic induction of skyrmionic materials

    Chess, Jordan J.; Montoya, Sergio A.; Harvey, Tyler R.; Ophus, Colin; Couture, Simon; Lomakin, Vitaliy; Fullerton, Eric E.; McMorran, Benjamin J.

    2017-01-01

    Highlights: • A method to reconstruct the phase of electrons after passing through a sample, requiring only a single defocused image, is presented. • Restrictions as to when it is appropriate to apply this method are described. • The relative error associated with this method is compared to conventional transport of intensity equation analysis. - Abstract: Recently, Lorentz transmission electron microscopy (LTEM) has helped researchers advance the emerging field of magnetic skyrmions. These magnetic quasi-particles, composed of topologically non-trivial magnetization textures, have a large potential for application as information carriers in low-power memory and logic devices. LTEM is one of a very few techniques for direct, real-space imaging of magnetic features at the nanoscale. For Fresnel-contrast LTEM, the transport of intensity equation (TIE) is the tool of choice for quantitative reconstruction of the local magnetic induction through the sample thickness. Typically, this analysis requires collection of at least three images. Here, we show that for uniform, thin, magnetic films, which includes many skyrmionic samples, the magnetic induction can be quantitatively determined from a single defocused image using a simplified TIE approach.
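
    For reference, the relation underlying this method is the standard transport of intensity equation; under the uniform-intensity assumption stated in the abstract it reduces (up to sign convention) to a Poisson equation for the phase that can be driven by a single defocused image. The finite-difference step shown last is an illustrative assumption about how the longitudinal derivative is approximated, not a quotation from the paper:

      \nabla_\perp \cdot \big( I \, \nabla_\perp \phi \big) = -k \, \frac{\partial I}{\partial z},
      \qquad
      I \approx I_0 \;\Longrightarrow\;
      \nabla_\perp^{2} \phi \approx -\frac{k}{I_0} \, \frac{\partial I}{\partial z}
      \approx -\frac{k}{I_0} \, \frac{I(\Delta f) - I(0)}{\Delta f}

    where k = 2\pi/\lambda is the electron wavenumber, \phi the phase, and \Delta f the defocus.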

  1. Streamlined approach to mapping the magnetic induction of skyrmionic materials

    Chess, Jordan J., E-mail: jchess@uoregon.edu [Department of Physics, University of Oregon, Eugene, OR 97403 (United States); Montoya, Sergio A. [Center for Memory and Recording Research, University of California, San Diego, CA 92093 (United States); Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA 92093 (United States); Harvey, Tyler R. [Department of Physics, University of Oregon, Eugene, OR 97403 (United States); Ophus, Colin [National Center for Electron Microscopy, Molecular Foundry, Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Couture, Simon; Lomakin, Vitaliy; Fullerton, Eric E. [Center for Memory and Recording Research, University of California, San Diego, CA 92093 (United States); Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA 92093 (United States); McMorran, Benjamin J. [Department of Physics, University of Oregon, Eugene, OR 97403 (United States)

    2017-06-15

    Highlights: • A method to reconstruct the phase of electrons after passing through a sample, requiring only a single defocused image, is presented. • Restrictions as to when it is appropriate to apply this method are described. • The relative error associated with this method is compared to conventional transport of intensity equation analysis. - Abstract: Recently, Lorentz transmission electron microscopy (LTEM) has helped researchers advance the emerging field of magnetic skyrmions. These magnetic quasi-particles, composed of topologically non-trivial magnetization textures, have a large potential for application as information carriers in low-power memory and logic devices. LTEM is one of a very few techniques for direct, real-space imaging of magnetic features at the nanoscale. For Fresnel-contrast LTEM, the transport of intensity equation (TIE) is the tool of choice for quantitative reconstruction of the local magnetic induction through the sample thickness. Typically, this analysis requires collection of at least three images. Here, we show that for uniform, thin, magnetic films, which includes many skyrmionic samples, the magnetic induction can be quantitatively determined from a single defocused image using a simplified TIE approach.

  2. Phenotype ontologies and cross-species analysis for translational research.

    Peter N Robinson

    2014-04-01

    Full Text Available The use of model organisms as tools for the investigation of human genetic variation has significantly and rapidly advanced our understanding of the aetiologies underlying hereditary traits. However, while equivalences in the DNA sequence of two species may be readily inferred through evolutionary models, the identification of equivalence in the phenotypic consequences resulting from comparable genetic variation is far from straightforward, limiting the value of the modelling paradigm. In this review, we provide an overview of the emerging statistical and computational approaches to objectively identify phenotypic equivalence between human and model organisms with examples from the vertebrate models, mouse and zebrafish. Firstly, we discuss enrichment approaches, which deem the most frequent phenotype among the orthologues of a set of genes associated with a common human phenotype as the orthologous phenotype, or phenolog, in the model species. Secondly, we introduce and discuss computational reasoning approaches to identify phenotypic equivalences made possible through the development of intra- and interspecies ontologies. Finally, we consider the particular challenges involved in modelling neuropsychiatric disorders, which illustrate many of the remaining difficulties in developing comprehensive and unequivocal interspecies phenotype mappings.

  3. development of ontological knowledge representation

    Preferred Customer

    ABSTRACT. This paper presents the development of an ontological knowledge organization and ... intelligence in order to facilitate knowledge sharing and reuse of acquired knowledge (15). Soon, ...

  4. Semantic Similarity between Web Documents Using Ontology

    Chahal, Poonam; Singh Tomer, Manjeet; Kumar, Suresh

    2018-06-01

    The World Wide Web is the source of information available in the structure of interlinked web pages. However, the procedure of extracting significant information with the assistance of a search engine is critical. This is because web information is written mainly in natural language and is addressed to individual human readers. Several efforts have been made in computing semantic similarity between documents using words, concepts and concept relationships, but the available outcomes are still not up to user requirements. This paper proposes a novel technique for the computation of semantic similarity between documents that takes into account not only the concepts available in the documents but also the relationships between those concepts. In our approach, documents are processed by building an ontology of each document using a base ontology and a dictionary containing concept records. Each such record is made up of the probable words which represent a given concept. Finally, the document ontologies are compared to find their semantic similarity, taking into account the relationships among concepts. Relevant concepts and relations between the concepts have been explored by capturing author and user intention. The proposed semantic analysis technique provides improved results as compared to existing techniques.
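
    The flavour of combining concept overlap with relation overlap can be sketched as below; this is an illustrative measure, not the authors' actual similarity computation, and the weights and toy documents are assumptions.

      # Combine Jaccard overlap of concept sets with Jaccard overlap of
      # (subject, relation, object) triples extracted from each document.
      def jaccard(a, b):
          return len(a & b) / len(a | b) if (a or b) else 0.0

      def doc_similarity(doc1, doc2, w_concepts=0.5, w_relations=0.5):
          return (w_concepts * jaccard(doc1["concepts"], doc2["concepts"]) +
                  w_relations * jaccard(doc1["relations"], doc2["relations"]))

      d1 = {"concepts": {"bank", "loan", "interest"},
            "relations": {("bank", "offers", "loan"), ("loan", "has", "interest")}}
      d2 = {"concepts": {"bank", "mortgage", "interest"},
            "relations": {("bank", "offers", "mortgage"), ("mortgage", "has", "interest")}}
      print(round(doc_similarity(d1, d2), 3))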

  5. Semantic Similarity between Web Documents Using Ontology

    Chahal, Poonam; Singh Tomer, Manjeet; Kumar, Suresh

    2018-03-01

    The World Wide Web is the source of information available in the structure of interlinked web pages. However, the procedure of extracting significant information with the assistance of a search engine is critical. This is because web information is written mainly in natural language and is addressed to individual human readers. Several efforts have been made in computing semantic similarity between documents using words, concepts and concept relationships, but the available outcomes are still not up to user requirements. This paper proposes a novel technique for the computation of semantic similarity between documents that takes into account not only the concepts available in the documents but also the relationships between those concepts. In our approach, documents are processed by building an ontology of each document using a base ontology and a dictionary containing concept records. Each such record is made up of the probable words which represent a given concept. Finally, the document ontologies are compared to find their semantic similarity, taking into account the relationships among concepts. Relevant concepts and relations between the concepts have been explored by capturing author and user intention. The proposed semantic analysis technique provides improved results as compared to existing techniques.

  6. An Ontology to Improve Transparency in Case Definition and Increase Case Finding of Infectious Intestinal Disease: Database Study in English General Practice.

    de Lusignan, Simon; Shinneman, Stacy; Yonova, Ivelina; van Vlymen, Jeremy; Elliot, Alex J; Bolton, Frederick; Smith, Gillian E; O'Brien, Sarah

    2017-09-28

    Infectious intestinal disease (IID) has considerable health impact; there are 2 billion cases worldwide resulting in 1 million deaths and 78.7 million disability-adjusted life years lost. Reported IID incidence rates vary and this is partly because terms such as "diarrheal disease" and "acute infectious gastroenteritis" are used interchangeably. Ontologies provide a method of transparently comparing case definitions and disease incidence rates. This study sought to show how differences in case definition in part account for variation in incidence estimates for IID and how an ontological approach provides greater transparency to IID case finding. We compared three IID case definitions: (1) Royal College of General Practitioners Research and Surveillance Centre (RCGP RSC) definition based on mapping to the Ninth International Classification of Disease (ICD-9), (2) newer ICD-10 definition, and (3) ontological case definition. We calculated incidence rates and examined the contribution of four supporting concepts related to IID: symptoms, investigations, process of care (eg, notification to public health authorities), and therapies. We created a formal ontology using ontology Web language. The ontological approach identified 5712 more cases of IID than the ICD-10 definition and 4482 more than the RCGP RSC definition from an initial cohort of 1,120,490. Weekly incidence using the ontological definition was 17.93/100,000 (95% CI 15.63-20.41), whereas for the ICD-10 definition the rate was 8.13/100,000 (95% CI 6.70-9.87), and for the RSC definition the rate was 10.24/100,000 (95% CI 8.55-12.12). Codes from the four supporting concepts were generally consistent across our three IID case definitions: 37.38% (3905/10,448) (95% CI 36.16-38.5) for the ontological definition, 38.33% (2287/5966) (95% CI 36.79-39.93) for the RSC definition, and 40.82% (1933/4736) (95% CI 39.03-42.66) for the ICD-10 definition. The proportion of laboratory results associated with a positive test

  7. An Odometry-free Approach for Simultaneous Localization and Online Hybrid Map Building

    Wei Hong Chin

    2016-11-01

    Full Text Available In this paper, a new approach is proposed for simultaneous mobile robot localization and hybrid map building without using any odometry hardware. The proposed method, termed Genetic Bayesian ARAM, comprises two main components: (1) a steady-state genetic algorithm (SSGA) for self-localization and occupancy grid map building; and (2) a Bayesian Adaptive Resonance Associative Memory (ARAM) for online topological map building. The model of the explored environment is formed as a hybrid representation, both topological and grid-based, and it is incrementally constructed during the exploration process. During occupancy map building, the robot's estimated self-position is updated by the SSGA. At the same time, the estimated self-position is transmitted to the Bayesian ARAM for topological map building and localization. The effectiveness of our proposed approach is validated on a number of standardized benchmark datasets and by real experimental results on a mobile robot. The benchmark datasets are used to verify that the proposed method is capable of generating topological maps under different environmental conditions. The real-robot experiment verifies that the proposed method can be implemented in the real world.

  8. Force scanning: a rapid, high-resolution approach for spatial mechanical property mapping

    Darling, E M

    2011-01-01

    Atomic force microscopy (AFM) can be used to co-localize mechanical properties and topographical features through property mapping techniques. The most common approach for testing biological materials at the microscale and nanoscale is force mapping, which involves taking individual force curves at discrete sites across a region of interest. The limitations of force mapping include long testing times and low resolution. While newer AFM methodologies, like modulated scanning and torsional oscillation, circumvent this problem, their adoption for biological materials has been limited. This could be due to their need for specialized software algorithms and/or hardware. The objective of this study is to develop a novel force scanning technique using AFM to rapidly capture high-resolution topographical images of soft biological materials while simultaneously quantifying their mechanical properties. Force scanning is a straightforward methodology applicable to a wide range of materials and testing environments, requiring no special modification to standard AFMs. Essentially, if a contact-mode image can be acquired, then force scanning can be used to produce a spatial modulus map. The current study first validates this technique using agarose gels, comparing results to ones achieved by the standard force mapping approach. Biologically relevant demonstrations are then presented for high-resolution modulus mapping of individual cells, cell-cell interfaces, and articular cartilage tissue.

  9. Evolving BioAssay Ontology (BAO): modularization, integration and applications.

    Abeyruwan, Saminda; Vempati, Uma D; Küçük-McGinty, Hande; Visser, Ubbo; Koleti, Amar; Mir, Ahsan; Sakurai, Kunie; Chung, Caty; Bittker, Joshua A; Clemons, Paul A; Brudz, Steve; Siripala, Anosha; Morales, Arturo J; Romacker, Martin; Twomey, David; Bureeva, Svetlana; Lemmon, Vance; Schürer, Stephan C

    2014-01-01

    The lack of established standards to describe and annotate biological assays and screening outcomes in the domain of drug and chemical probe discovery is a severe limitation to utilize public and proprietary drug screening data to their maximum potential. We have created the BioAssay Ontology (BAO) project (http://bioassayontology.org) to develop common reference metadata terms and definitions required for describing relevant information of low- and high-throughput drug and probe screening assays and results. The main objectives of BAO are to enable effective integration, aggregation, retrieval, and analyses of drug screening data. Since we first released BAO on the BioPortal in 2010, we have considerably expanded and enhanced BAO and we have applied the ontology in several internal and external collaborative projects, for example the BioAssay Research Database (BARD). We describe the evolution of BAO with a design that enables modeling complex assays including profile and panel assays such as those in the Library of Integrated Network-based Cellular Signatures (LINCS). One of the critical questions in evolving BAO is the following: how can we provide a way to efficiently reuse and share specific parts of our ontologies among various research projects without violating the integrity of the ontology and without creating redundancies? This paper provides a comprehensive answer to this question with a description of a methodology for ontology modularization using a layered architecture. Our modularization approach defines several distinct BAO components and separates internal from external modules and domain-level from structural components. This approach facilitates the generation/extraction of derived ontologies (or perspectives) that can suit particular use cases or software applications. We describe the evolution of BAO related to its formal structures, engineering approaches, and content to enable modeling of complex assays and integration with other ontologies and

  10. A highly efficient approach to protein interactome mapping based on collaborative filtering framework.

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-09

    The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for gaining deep insights into both fundamental cell biology processes and the pathology of diseases. Carefully designed small-scale experiments are not only very expensive but also inefficient for identifying numerous interactomes, despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data into an interactome weight matrix, from which the feature vectors of the involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among the involved proteins, which is then used to carry out the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly.
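
    A minimal sketch of the collaborative-filtering idea described above, under simplifying assumptions of my own: the interactome is a 0/1 weight matrix, similarity between protein feature vectors is plain cosine (the paper's rescaled cosine coefficient is a refinement not reproduced here), and unobserved pairs are scored by similarity-weighted aggregation over the k most similar proteins.

        import numpy as np

        def predict_interactions(W, k=5):
            # W: n x n binary interactome weight matrix; returns neighbourhood-based scores
            norms = np.linalg.norm(W, axis=1, keepdims=True) + 1e-12
            S = (W @ W.T) / (norms * norms.T)      # cosine similarity between protein rows
            np.fill_diagonal(S, 0.0)
            scores = np.zeros_like(W, dtype=float)
            for i in range(W.shape[0]):
                nbrs = np.argsort(S[i])[-k:]       # k most similar proteins
                w = S[i, nbrs]
                scores[i] = w @ W[nbrs] / (w.sum() + 1e-12)
            return scores                          # higher score = more likely interaction

        W = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 0],
                      [1, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)
        print(predict_interactions(W, k=2).round(2))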

  11. A multi-temporal analysis approach for land cover mapping in support of nuclear incident response

    Sah, Shagan; van Aardt, Jan A. N.; McKeown, Donald M.; Messinger, David W.

    2012-06-01

    Remote sensing can be used to rapidly generate land use maps for assisting emergency response personnel with resource deployment decisions and impact assessments. In this study we focus on constructing accurate land cover maps to map the impacted area in the case of a nuclear material release. The proposed methodology involves integration of results from two different approaches to increase classification accuracy. The data used included RapidEye scenes over the Nine Mile Point Nuclear Power Station (Oswego, NY). The first step was building a coarse-scale land cover map from freely available, high temporal resolution, MODIS data using a time-series approach. In the case of a nuclear accident, high spatial resolution commercial satellites such as RapidEye or IKONOS can acquire images of the affected area. Land use maps from the two image sources were integrated using a probability-based approach. Classification results were obtained for four land classes - forest, urban, water and vegetation - using Euclidean and Mahalanobis distances as metrics. Despite the coarse resolution of MODIS pixels, acceptable accuracies were obtained using time series features. The overall accuracies using the fusion-based approach were in the neighborhood of 80% when compared with GIS data sets from New York State. The classifications were improved using this fused approach, with a few supplementary advantages such as correction for cloud cover and independence from time of year. We concluded that this method can generate highly accurate land cover maps using coarse spatial resolution time series satellite imagery and a single-date, high spatial resolution, multi-spectral image.
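
    A small sketch of the distance-based class assignment mentioned above: a pixel's feature vector is assigned to the land-cover class with the smallest Mahalanobis distance to the class mean. The two-band class statistics and the pixel values are invented for illustration and are not the study's.

        import numpy as np

        def mahalanobis_classify(pixel, class_stats):
            # Assign the pixel to the class with the smallest squared Mahalanobis distance
            best, best_d = None, np.inf
            for name, (mean, cov) in class_stats.items():
                diff = pixel - mean
                d2 = diff @ np.linalg.inv(cov) @ diff
                if d2 < best_d:
                    best, best_d = name, d2
            return best

        stats = {
            "forest": (np.array([0.05, 0.40]), np.array([[0.010, 0.0], [0.0, 0.020]])),
            "water":  (np.array([0.02, 0.03]), np.array([[0.005, 0.0], [0.0, 0.005]])),
        }
        print(mahalanobis_classify(np.array([0.04, 0.35]), stats))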

  12. An Approach of Dynamic Object Removing for Indoor Mapping Based on UGV SLAM

    Jian Tang

    2015-07-01

    Full Text Available The study of indoor mapping for Location Based Services (LBS) has become more and more popular in recent years. LiDAR-SLAM-based mapping appears to be a promising indoor mapping solution. However, dynamic objects such as pedestrians and indoor vehicles exist in the raw LiDAR range data and have to be removed for mapping purposes. In this paper, a new approach to dynamic object removal called Likelihood Grid Voting (LGV) is presented. It is a model-free method that takes full advantage of the high scanning rate of a LiDAR moving at relatively low speed in an indoor environment. In this method, a counting grid records how often each map position is occupied by laser scans; positions with low counter values are recognized as dynamic objects and the corresponding points are removed from the map. This work is part of the algorithms in our self-developed Unmanned Ground Vehicle (UGV) Simultaneous Localization and Mapping (SLAM) system, NAVIS. Field tests were carried out in an indoor parking area with NAVIS to evaluate the effectiveness of the proposed method. The results show that small objects such as pedestrians can be detected and removed quickly, while large objects such as cars can be detected and partially removed.
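
    A minimal sketch of the counting-grid idea behind LGV, under assumptions of my own: each scan is rasterized to a boolean occupancy grid, cell counters accumulate over scans, and cells occupied in less than a chosen fraction of scans are treated as dynamic and dropped. The threshold is illustrative.

        import numpy as np

        def filter_dynamic(scans, grid_shape, keep_ratio=0.6):
            # scans: list of boolean occupancy grids (one per LiDAR scan, same shape)
            counts = np.zeros(grid_shape, dtype=int)
            for occ in scans:
                counts += occ.astype(int)
            # Cells seen occupied in at least keep_ratio of the scans are kept as static
            return counts >= keep_ratio * len(scans)

        base = np.zeros((4, 4), dtype=bool)
        base[1, 1] = True                    # a static wall cell, present in every scan
        moving = base.copy()
        moving[2, 3] = True                  # a transient object, present in one scan only
        print(filter_dynamic([moving, base, base, base, base], (4, 4)).astype(int))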

  13. An Effective NoSQL-Based Vector Map Tile Management Approach

    Lin Wan

    2016-11-01

    Full Text Available Within a digital map service environment, the rapid growth of Spatial Big-Data is driving new requirements for effective mechanisms for massive online vector map tile processing. The emergence of Not Only SQL (NoSQL) databases has resulted in a new data storage and management model for scalable spatial data deployments and fast tracking. They better suit the scenario of high-volume, low-latency network map services than traditional standalone high-performance computer (HPC) or relational databases. In this paper, we propose a flexible storage framework that provides feasible methods for tiled map data parallel clipping and retrieval operations within a distributed NoSQL database environment. We illustrate the parallel vector tile generation and querying algorithms with the MapReduce programming model. Three different processing approaches, including local caching, distributed file storage, and the NoSQL-based method, are compared by analyzing the concurrent load and calculation time. An online geological vector tile map service prototype was developed to embed our processing framework in the China Geological Survey Information Grid. Experimental results show that our NoSQL-based parallel tile management framework can support applications that process huge volumes of vector tile data and improve performance of the tiled map service.
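
    A minimal sketch of a key-value tile scheme of the kind such a framework relies on: each clipped vector tile is stored under a layer/zoom/column/row key. A Python dict stands in for the distributed NoSQL backend; all names are illustrative and do not reflect the paper's implementation.

        import json

        class TileStore:
            # Key-value tile store sketch; a dict stands in for the NoSQL backend
            def __init__(self):
                self.kv = {}

            @staticmethod
            def key(layer, z, x, y):
                return f"{layer}/{z}/{x}/{y}"      # row key: layer name plus tile address

            def put_tile(self, layer, z, x, y, features):
                self.kv[self.key(layer, z, x, y)] = json.dumps(features)

            def get_tile(self, layer, z, x, y):
                blob = self.kv.get(self.key(layer, z, x, y))
                return json.loads(blob) if blob else None

        store = TileStore()
        store.put_tile("geology", 12, 3372, 1552, [{"type": "Feature", "geometry": None}])
        print(store.get_tile("geology", 12, 3372, 1552))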

  14. A regularized, model-based approach to phase-based conductivity mapping using MRI.

    Ropella, Kathleen M; Noll, Douglas C

    2017-11-01

    To develop a novel regularized, model-based approach to phase-based conductivity mapping that uses structural information to improve the accuracy of conductivity maps. The inverse of the three-dimensional Laplacian operator is used to model the relationship between measured phase maps and the object conductivity in a penalized weighted least-squares optimization problem. Spatial masks based on structural information are incorporated into the problem to preserve data near boundaries. The proposed Inverse Laplacian method was compared against a restricted Gaussian filter in simulation, phantom, and human experiments. The Inverse Laplacian method resulted in lower reconstruction bias and error due to noise in simulations than the Gaussian filter. The Inverse Laplacian method also produced conductivity maps closer to the measured values in a phantom and with reduced noise in the human brain, as compared to the Gaussian filter. The Inverse Laplacian method calculates conductivity maps with less noise and more accurate values near boundaries. Improving the accuracy of conductivity maps is integral for advancing the applications of conductivity mapping. Magn Reson Med 78:2011-2021, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
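
    Schematically, the reconstruction described here can be read as a generic penalized weighted least-squares problem. In the expression below (notation mine, not the paper's), A stands for the discretized inverse-Laplacian forward model mapping conductivity to phase, phi for the measured phase map, W for the weighting built from the structural masks, and R for the regularizer weighted by beta.

        \hat{\sigma} = \arg\min_{\sigma} \left\| \mathbf{W}^{1/2} \left( \mathbf{A}\,\sigma - \boldsymbol{\phi} \right) \right\|_2^2 + \beta \, R(\sigma)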

  15. Crude oil and its distillation: an experimental approach in High School using conceptual maps

    Dionísio Borsato

    2006-02-01

    Full Text Available Conceptual maps are representations of ideas organized in the form of bidimensional diagrams. In the present work the theme of oil fractional distillation was explored, and the conceptual maps were elaborated both before and after the activities by 43 students from the 1st and 3rd High School grades of a public school in Londrina – PR. The study was conducted theoretically and in practice, with a daily life approach. The use of the motivational theme and the opening text as previous organizers, enabled the establishment of a cognitive link between the students’ previous knowledge and the new concepts. Differences between the maps were verified before and after the activities as well as among the work groups. The students, stimulated by the technique, created better structured maps.

  16. Protein complex prediction in large ontology attributed protein-protein interaction networks.

    Zhang, Yijia; Lin, Hongfei; Yang, Zhihao; Wang, Jian; Li, Yanpeng; Xu, Bo

    2013-01-01

    Protein complexes are important for unraveling the secrets of cellular organization and function. Many computational approaches have been developed to predict protein complexes in protein-protein interaction (PPI) networks. However, most existing approaches focus mainly on the topological structure of PPI networks, and largely ignore the gene ontology (GO) annotation information. In this paper, we constructed ontology attributed PPI networks with PPI data and GO resource. After constructing ontology attributed networks, we proposed a novel approach called CSO (clustering based on network structure and ontology attribute similarity). Structural information and GO attribute information are complementary in ontology attributed networks. CSO can effectively take advantage of the correlation between frequent GO annotation sets and the dense subgraph for protein complex prediction. Our proposed CSO approach was applied to four different yeast PPI data sets and predicted many well-known protein complexes. The experimental results showed that CSO was valuable in predicting protein complexes and achieved state-of-the-art performance.

  17. Product line based ontology reuse in context-aware e-business environment

    Zhang, Weishan; Kunz, Thomas

    2006-01-01

    Improving the reusability of ontology is recognized as increasingly important due to the prevalence of OWL research and applications. But there exists no convincing methodology and tool support in this direction yet. In this paper, we apply ideas from the research and practice with software product lines in order to explore this issue. The ontology is developed and managed according to the commonalities and variabilities underlying a specific problem domain. Meta-ontology is used in order to improve the reusability, evolve-ability and customizability of ontology. Another advantage is being able to generate the needed ontology with the created meta-ontology implemented with XVCL (XML based Variant Configuration Language) technology. We demonstrate our product line based reuse approach with an example B2C application.

  18. Building a Chemical Ontology using Methontology and the Ontology Design Environment

    Fernández López, Mariano; Gómez-Pérez, A.; Pazos Sierra, Alejandro; Pazos Sierra, Juan

    1999-01-01

    METHONTOLOGY provides guidelines for specifying ontologies at the knowledge level, as a specification of a conceptualization. ODE enables ontology construction, covering the entire life cycle and automatically implementing ontologies.

  19. Physical Mapping of Bread Wheat Chromosome 5A: An Integrated Approach

    Delfina Barabaschi

    2015-11-01

    Full Text Available The huge size, redundancy, and highly repetitive nature of the bread wheat [Triticum aestivum (L.)] genome make it among the most difficult species to be sequenced. To overcome these limitations, a strategy based on the separation of individual chromosomes or chromosome arms and the subsequent production of physical maps was established within the frame of the International Wheat Genome Sequencing Consortium (IWGSC). A total of 95,812 bacterial artificial chromosome (BAC) clones of short-arm chromosome 5A (5AS) and long-arm chromosome 5A (5AL) arm-specific BAC libraries were fingerprinted and assembled into contigs by complementary analytical approaches based on the FingerPrinted Contig (FPC) and Linear Topological Contig (LTC) tools. Combined anchoring approaches based on polymerase chain reaction (PCR) marker screening, microarray, and sequence homology searches applied to several genomic tools (i.e., genetic maps, deletion bin map, neighbor maps, BAC end sequences (BESs), genome zipper, and chromosome survey sequences) allowed the development of a high-quality physical map with an anchored physical coverage of 75% for 5AS and 53% for 5AL, with high portions (64 and 48%, respectively) of contigs ordered along the chromosome. In the genomes of grasses such as Brachypodium [Brachypodium distachyon (L.) Beauv.], rice (Oryza sativa L.), and sorghum [Sorghum bicolor (L.) Moench], homologs of genes on wheat chromosome 5A were separated into syntenic blocks on different chromosomes as a result of translocations and inversions during evolution. The physical map presented here represents an essential resource for fine genetic mapping and map-based cloning of agronomically relevant traits and a reference for the 5A sequencing projects.

  20. Deciphering the genomic architecture of the stickleback brain with a novel multilocus gene-mapping approach.

    Li, Zitong; Guo, Baocheng; Yang, Jing; Herczeg, Gábor; Gonda, Abigél; Balázs, Gergely; Shikano, Takahito; Calboli, Federico C F; Merilä, Juha

    2017-03-01

    Quantitative traits important to organismal function and fitness, such as brain size, are presumably controlled by many small-effect loci. Deciphering the genetic architecture of such traits with traditional quantitative trait locus (QTL) mapping methods is challenging. Here, we investigated the genetic architecture of brain size (and the size of five different brain parts) in nine-spined sticklebacks (Pungitius pungitius) with the aid of novel multilocus QTL-mapping approaches based on a de-biased LASSO method. Apart from having more statistical power to detect QTL and reduced rate of false positives than conventional QTL-mapping approaches, the developed methods can handle large marker panels and provide estimates of genomic heritability. Single-locus analyses of an F2 interpopulation cross with 239 individuals and 15,198 fully informative single nucleotide polymorphisms (SNPs) uncovered 79 QTL associated with variation in stickleback brain size traits. Many of these loci were in strong linkage disequilibrium (LD) with each other, and consequently, a multilocus mapping of individual SNPs, accounting for LD structure in the data, recovered only four significant QTL. However, a multilocus mapping of SNPs grouped by linkage group (LG) identified 14 LGs (1-6 depending on the trait) that influence variation in brain traits. For instance, 17.6% of the variation in relative brain size was explainable by cumulative effects of SNPs distributed over six LGs, whereas 42% of the variation was accounted for by all 21 LGs. Hence, the results suggest that variation in stickleback brain traits is influenced by many small-effect loci. Apart from suggesting moderately heritable (h² ≈ 0.15-0.42) multifactorial genetic architecture of brain traits, the results highlight the challenges in identifying the loci contributing to variation in quantitative traits. Nevertheless, the results demonstrate that the novel QTL-mapping approach developed here has distinctive advantages
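
    A hedged sketch of multilocus mapping as L1-penalized regression of a trait on a SNP matrix, using an ordinary cross-validated LASSO from scikit-learn on simulated data; the de-biased LASSO used in the study additionally corrects the shrunken estimates for inference, which is not reproduced here.

        import numpy as np
        from sklearn.linear_model import LassoCV

        rng = np.random.default_rng(0)
        n, p = 239, 500                                        # individuals x SNP markers (simulated)
        X = rng.binomial(2, 0.5, size=(n, p)).astype(float)    # genotype dosages 0/1/2
        beta = np.zeros(p)
        beta[[10, 200, 350]] = [0.8, -0.5, 0.6]                # a few true small-effect loci
        y = X @ beta + rng.normal(scale=1.0, size=n)           # simulated brain-size-like trait

        model = LassoCV(cv=5).fit(X, y)                        # L1 penalty shrinks most effects to zero
        print("candidate QTL markers:", np.flatnonzero(model.coef_ != 0)[:10])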

  1. Data mining approach to bipolar cognitive map development and decision analysis

    Zhang, Wen-Ran

    2002-03-01

    A data mining approach to cognitive mapping is presented based on bipolar logic, bipolar relations, and bipolar clustering. It is shown that a correlation network derived from a database can be converted to a bipolar cognitive map (or bipolar relation). A transitive, symmetric, and reflexive bipolar relation (equilibrium relation) can be used to identify focal links in decision analysis. It can also be used to cluster a set of events or itemsets into three different clusters: coalition sets, conflict sets, and harmony sets. The coalition sets are positively correlated events or itemsets; each conflict set is a negatively correlated set of two coalition subsets; and a harmony set consists of events that are both negatively and positively correlated. A cognitive map and the clusters can then be used for online decision analysis. This approach combines knowledge discovery with the views of decision makers and provides an effective means for online analytical processing (OLAP) and online analytical mining (OLAM).

  2. A New Approach to High-accuracy Road Orthophoto Mapping Based on Wavelet Transform

    Ming Yang

    2011-12-01

    Full Text Available Existing orthophoto maps based on satellite and aerial photography are not precise enough for road marking. This paper proposes a new approach to high-accuracy orthophoto mapping. The approach uses an inverse perspective transformation to process the image information and generate orthophoto fragments. An offline interpolation algorithm is used to process the location information: it combines dead reckoning with EKF localization and uses the result to transform the fragments into the global coordinate system. Finally, a wavelet transform divides the image into two frequency bands and a weighted median algorithm processes each band separately. Experimental results show that the map produced with this method has high accuracy.

  3. Automatic Tamil lyric generation based on ontological interpretation ...

    This system proposes an n-gram based approach to automatic Tamil lyric generation, by the ontological semantic interpretation of the input scene. The approach is based on identifying the semantics conveyed in the scenario, thereby making the system understand the situation and generate lyrics accordingly. The heart of ...

  4. "Force," ontology, and language

    Brookes, David T.; Etkina, Eugenia

    2009-06-01

    We introduce a linguistic framework through which one can interpret systematically students’ understanding of and reasoning about force and motion. Some researchers have suggested that students have robust misconceptions or alternative frameworks grounded in everyday experience. Others have pointed out the inconsistency of students’ responses and presented a phenomenological explanation for what is observed, namely, knowledge in pieces. We wish to present a view that builds on and unifies aspects of this prior research. Our argument is that many students’ difficulties with force and motion are primarily due to a combination of linguistic and ontological difficulties. It is possible that students are primarily engaged in trying to define and categorize the meaning of the term “force” as spoken about by physicists. We found that this process of negotiation of meaning is remarkably similar to that engaged in by physicists in history. In this paper we will describe a study of the historical record that reveals an analogous process of meaning negotiation, spanning multiple centuries. Using methods from cognitive linguistics and systemic functional grammar, we will present an analysis of the force and motion literature, focusing on prior studies with interview data. We will then discuss the implications of our findings for physics instruction.

  5. Identification of protein features encoded by alternative exons using Exon Ontology.

    Tranchevent, Léon-Charles; Aubé, Fabien; Dulaurier, Louis; Benoit-Pilven, Clara; Rey, Amandine; Poret, Arnaud; Chautard, Emilie; Mortada, Hussein; Desmet, François-Olivier; Chakrama, Fatima Zahra; Moreno-Garcia, Maira Alejandra; Goillot, Evelyne; Janczarski, Stéphane; Mortreux, Franck; Bourgeois, Cyril F; Auboeuf, Didier

    2017-06-01

    Transcriptomic genome-wide analyses demonstrate massive variation of alternative splicing in many physiological and pathological situations. One major challenge is now to establish the biological contribution of alternative splicing variation in physiological- or pathological-associated cellular phenotypes. Toward this end, we developed a computational approach, named "Exon Ontology," based on terms corresponding to well-characterized protein features organized in an ontology tree. Exon Ontology is conceptually similar to Gene Ontology-based approaches but focuses on exon-encoded protein features instead of gene level functional annotations. Exon Ontology describes the protein features encoded by a selected list of exons and looks for potential Exon Ontology term enrichment. By applying this strategy to exons that are differentially spliced between epithelial and mesenchymal cells and after extensive experimental validation, we demonstrate that Exon Ontology provides support to discover specific protein features regulated by alternative splicing. We also show that Exon Ontology helps to unravel biological processes that depend on suites of coregulated alternative exons, as we uncovered a role of epithelial cell-enriched splicing factors in the AKT signaling pathway and of mesenchymal cell-enriched splicing factors in driving splicing events impacting on autophagy. Freely available on the web, Exon Ontology is the first computational resource that allows getting a quick insight into the protein features encoded by alternative exons and investigating whether coregulated exons contain the same biological information. © 2017 Tranchevent et al.; Published by Cold Spring Harbor Laboratory Press.

  6. Determination of contact maps in proteins: A combination of structural and chemical approaches

    Wołek, Karol; Cieplak, Marek, E-mail: mc@ifpan.edu.pl [Institute of Physics, Polish Academy of Science, Al. Lotników 32/46, 02-668 Warsaw (Poland); Gómez-Sicilia, Àngel [Instituto Cajal, Consejo Superior de Investigaciones Cientificas (CSIC), Av. Doctor Arce, 37, 28002 Madrid (Spain); Instituto Madrileño de Estudios Avanzados en Nanociencia (IMDEA-Nanociencia), C/Faraday 9, 28049 Cantoblanco (Madrid) (Spain)

    2015-12-28

    Contact map selection is a crucial step in structure-based molecular dynamics modelling of proteins. The map can be determined in many different ways. We focus on the methods in which residues are represented as clusters of effective spheres. One contact map, denoted as overlap (OV), is based on the overlap of such spheres. Another contact map, named Contacts of Structural Units (CSU), involves the geometry in a different way and, in addition, brings chemical considerations into account. We develop a variant of the CSU approach in which we also incorporate Coulombic effects such as formation of the ionic bridges and destabilization of possible links through repulsion. In this way, the most essential and well defined contacts are identified. The resulting residue-residue contact map, dubbed repulsive CSU (rCSU), is more sound in its physico-chemical justification than CSU. It also provides a clear prescription for validity of an inter-residual contact: the number of attractive atomic contacts should be larger than the number of repulsive ones — a feature that is not present in CSU. However, both of these maps do not correlate well with the experimental data on protein stretching. Thus, we propose to use rCSU together with the OV map. We find that the combined map, denoted as OV+rCSU, performs better than OV. In most situations, OV and OV+rCSU yield comparable folding properties but for some proteins rCSU provides contacts which improve folding in a substantial way. We discuss the likely residue-specificity of the rCSU contacts. Finally, we make comparisons to the recently proposed shadow contact map, which is derived from different principles.
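
    The validity rule quoted above, that an inter-residue contact requires more attractive than repulsive atomic contacts, can be written as a one-line check; the atom-pair lists in the example are hypothetical.

        def rcsu_contact(attractive_pairs, repulsive_pairs):
            # Keep the residue-residue contact only if attractive atomic contacts outnumber repulsive ones
            return len(attractive_pairs) > len(repulsive_pairs)

        print(rcsu_contact(attractive_pairs=[("CB", "CG"), ("OD1", "NZ")],
                           repulsive_pairs=[("OD1", "OE1")]))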

  7. A geomorphic approach to 100-year floodplain mapping for the Conterminous United States

    Jafarzadegan, Keighobad; Merwade, Venkatesh; Saksena, Siddharth

    2018-06-01

    Floodplain mapping using hydrodynamic models is difficult in data-scarce regions. Additionally, using hydrodynamic models to map floodplains over large stream networks can be computationally challenging. Some of these limitations of floodplain mapping using hydrodynamic modeling can be overcome by developing computationally efficient statistical methods to identify floodplains in large and ungauged watersheds using publicly available data. This paper proposes a geomorphic model to generate probabilistic 100-year floodplain maps for the Conterminous United States (CONUS). The proposed model first categorizes the watersheds in the CONUS into three classes based on the height of the water surface corresponding to the 100-year flood above the streambed. Next, the probability that any watershed in the CONUS belongs to one of these three classes is computed through supervised classification using watershed characteristics related to topography, hydrography, land use and climate. The result of this classification is then fed into a probabilistic threshold binary classifier (PTBC) to generate the probabilistic 100-year floodplain maps. The supervised classification algorithm is trained using the 100-year Flood Insurance Rate Maps (FIRMs) from the U.S. Federal Emergency Management Agency (FEMA). FEMA FIRMs are also used to validate the performance of the proposed model in areas not included in the training. Additionally, HEC-RAS model generated flood inundation extents are used to validate the model performance at fifteen sites that lack FEMA maps. Validation results show that the probabilistic 100-year floodplain maps generated by the proposed model match well with both FEMA and HEC-RAS generated maps. On average, the error of predicted flood extents is around 14% across the CONUS. The high accuracy of the validation results shows the reliability of the geomorphic model as an alternative approach for fast and cost-effective delineation of 100-year floodplains for the CONUS.
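
    A hedged sketch of the two-stage logic described above: a supervised classifier yields per-class probabilities for a watershed, and a probabilistic threshold binary classifier turns a terrain-index grid into a floodplain probability map. The random forest, the height-above-drainage index and the thresholds are assumptions for illustration, not the paper's calibrated configuration.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        # Stage 1: classify watersheds into three 100-year water-surface-height classes
        X = np.random.rand(300, 6)                      # watershed features (simulated)
        y = np.random.randint(0, 3, 300)                # class labels, as if derived from FEMA training areas
        clf = RandomForestClassifier(n_estimators=200).fit(X, y)
        class_probs = clf.predict_proba(np.random.rand(1, 6))[0]   # probabilities for a new watershed

        # Stage 2: probabilistic threshold binary classifier on a terrain-index grid
        hand = np.random.rand(50, 50) * 20              # e.g. height above nearest drainage, in metres
        thresholds = np.array([2.0, 5.0, 10.0])         # illustrative flooding threshold per class
        floodplain_prob = sum(p * (hand <= t) for p, t in zip(class_probs, thresholds))
        print(floodplain_prob.round(2).max())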

  8. Mapping quantitative trait loci in a selectively genotyped outbred population using a mixture model approach

    Johnson, David L.; Jansen, Ritsert C.; Arendonk, Johan A.M. van

    1999-01-01

    A mixture model approach is employed for the mapping of quantitative trait loci (QTL) for the situation where individuals, in an outbred population, are selectively genotyped. Maximum likelihood estimation of model parameters is obtained from an Expectation-Maximization (EM) algorithm facilitated by

  9. Mapping community vulnerability to poaching: A whole-of-society approach

    Schmitz, Peter

    2017-01-01

    Full Text Available In Cartography and GIScience. Mapping community vulnerability to poaching: A whole-of-society approach, by Peter M.U. Schmitz, Duarte Gonçalves, and Merin Jacob (CSIR Built Environment, Pretoria, South Africa).

  10. Improving Students' Creative Thinking and Achievement through the Implementation of Multiple Intelligence Approach with Mind Mapping

    Widiana, I. Wayan; Jampel, I. Nyoman

    2016-01-01

    This classroom action research aimed to improve the students' creative thinking and achievement in learning science. It conducted through the implementation of multiple intelligences with mind mapping approach and describing the students' responses. The subjects of this research were the fifth grade students of SD 8 Tianyar Barat, Kubu, and…

  11. A Soft OR Approach to Fostering Systems Thinking: SODA Maps plus Joint Analytical Process

    Wang, Shouhong; Wang, Hai

    2016-01-01

    Higher order thinking skills are important for managers. Systems thinking is an important type of higher order thinking in business education. This article investigates a soft Operations Research approach to teaching and learning systems thinking. It outlines the integrative use of Strategic Options Development and Analysis maps for visualizing…

  12. A novel matrix approach for controlling the invariant densities of chaotic maps

    Rogers, Alan; Shorten, Robert; Heffernan, Daniel M.

    2008-01-01

    Recent work on positive matrices has resulted in a new matrix method for generating chaotic maps with arbitrary piecewise constant invariant densities, sometimes known as the inverse Frobenius-Perron problem (IFPP). In this paper, we give an extensive introduction to the IFPP, describing existing methods for solving it, and we describe our new matrix approach for solving the IFPP
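
    For reference, an invariant density rho* of a one-dimensional piecewise-differentiable map f satisfies the standard Frobenius-Perron fixed-point equation below; the inverse problem discussed in the abstract asks for a map f given a target rho*. The notation is generic and not taken from the paper.

        \rho^{*}(y) = (P_f \rho^{*})(y) = \sum_{x \in f^{-1}(y)} \frac{\rho^{*}(x)}{\lvert f'(x) \rvert}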

  13. Quantifying Spatial Variation in Ecosystem Services Demand : A Global Mapping Approach

    Wolff, S.; Schulp, C. J E; Kastner, T.; Verburg, P. H.

    2017-01-01

    Understanding the spatial-temporal variability in ecosystem services (ES) demand can help anticipate externalities of land use change. This study presents new operational approaches to quantify and map demand for three non-commodity ES on a global scale: animal pollination, wild medicinal plants and

  14. The role of architecture and ontology for interoperability.

    Blobel, Bernd; González, Carolina; Oemig, Frank; Lopéz, Diego; Nykänen, Pirkko; Ruotsalainen, Pekka

    2010-01-01

    Turning from organization-centric to process-controlled or even personalized approaches, advanced healthcare settings have to meet special interoperability challenges. eHealth and pHealth solutions must assure interoperability between actors cooperating to achieve common business objectives. The interoperability chain thereby includes not only individually tailored technical systems, but also sensors and actuators. To enable the corresponding pervasive computing and even autonomic computing, individualized systems have to be based on an architecture framework covering many domains, scientifically managed by specialized disciplines using their specific ontologies in a formalized way. Therefore, interoperability has to advance from a communication-protocol approach to an architecture-centric approach that masters ontology coordination challenges.

  15. An Ontology for a TripTych Formal Software Development

    Bjørner, Dines

    2003-01-01

    An ontology, i.e., a formalised set of strongly interrelated definitions, is given for an approach to software development that spans domain engineering, requirements engineering and software design - and which is otherwise based on a judicious use of both informal and formal, mathematics-based techniques.

  16. Supporting ontology-based keyword search over medical databases.

    Kementsietsidis, Anastasios; Lim, Lipyeow; Wang, Min

    2008-11-06

    The proliferation of medical terms poses a number of challenges in the sharing of medical information among different stakeholders. Ontologies are commonly used to establish relationships between different terms, yet their role in querying has not been investigated in detail. In this paper, we study the problem of supporting ontology-based keyword search queries on a database of electronic medical records. We present several approaches to support this type of queries, study the advantages and limitations of each approach, and summarize the lessons learned as best practices.

  17. Single-molecule approach to bacterial genomic comparisons via optical mapping.

    Zhou, Shiguo [Univ. Wisc.-Madison; Kile, A. [Univ. Wisc.-Madison; Bechner, M. [Univ. Wisc.-Madison; Kvikstad, E. [Univ. Wisc.-Madison; Deng, W. [Univ. Wisc.-Madison; Wei, J. [Univ. Wisc.-Madison; Severin, J. [Univ. Wisc.-Madison; Runnheim, R. [Univ. Wisc.-Madison; Churas, C. [Univ. Wisc.-Madison; Forrest, D. [Univ. Wisc.-Madison; Dimalanta, E. [Univ. Wisc.-Madison; Lamers, C. [Univ. Wisc.-Madison; Burland, V. [Univ. Wisc.-Madison; Blattner, F. R. [Univ. Wisc.-Madison; Schwartz, David C. [Univ. Wisc.-Madison

    2004-01-01

    Modern comparative genomics has been established, in part, by the sequencing and annotation of a broad range of microbial species. To gain further insights, new sequencing efforts are now dealing with the variety of strains or isolates that gives a species definition and range; however, this number vastly outstrips our ability to sequence them. Given the availability of a large number of microbial species, new whole genome approaches must be developed to fully leverage this information at the level of strain diversity that maximize discovery. Here, we describe how optical mapping, a single-molecule system, was used to identify and annotate chromosomal alterations between bacterial strains represented by several species. Since whole-genome optical maps are ordered restriction maps, sequenced strains of Shigella flexneri serotype 2a (2457T and 301), Yersinia pestis (CO 92 and KIM), and Escherichia coli were aligned as maps to identify regions of homology and to further characterize them as possible insertions, deletions, inversions, or translocations. Importantly, an unsequenced Shigella flexneri strain (serotype Y strain AMC[328Y]) was optically mapped and aligned with two sequenced ones to reveal one novel locus implicated in serotype conversion and several other loci containing insertion sequence elements or phage-related gene insertions. Our results suggest that genomic rearrangements and chromosomal breakpoints are readily identified and annotated against a prototypic sequenced strain by using the tools of optical mapping.

  18. Supporting collaboration in interdisciplinary research of water–energy–food nexus by means of ontology engineering

    Terukazu Kumazawa

    2017-06-01

    The introduction of an ontology engineering approach will enable us to share a common language and a common theoretical basis, but new methods based on ontology engineering still need to be developed. For example, structuring knowledge according to each researcher's perspective, and producing simple figures accompanied by a reasoned argument in the background, are directions for tool development.

  19. Al-Quran ontology based on knowledge themes | Ta'a | Journal of ...

    Islamic knowledge is gathered through understanding the Al-Quran. This requires an ontology that can capture the knowledge and present it in a machine-readable structure. However, current ontology approaches are irrelevant and inaccurate in producing true concepts of Al-Quran knowledge, because they use traditional ...

  20. An Ontology to Support the Classification of Learning Material in an Organizational Learning Environment: An Evaluation

    Valaski, Joselaine; Reinehr, Sheila; Malucelli, Andreia

    2017-01-01

    Purpose: The purpose of this research was to evaluate whether ontology integrated in an organizational learning environment may support the automatic learning material classification in a specific knowledge area. Design/methodology/approach: An ontology for recommending learning material was integrated in the organizational learning environment…

  1. Engineering geological mapping in Wallonia (Belgium) : present state and recent computerized approach

    Delvoie, S.; Radu, J.-P.; Ruthy, I.; Charlier, R.

    2012-04-01

    An engineering geological map can be defined as a geological map with a generalized representation of all the components of a geological environment which are strongly required for spatial planning, design, construction and maintenance of civil engineering structures. In Wallonia (Belgium), 24 engineering geological maps were developed between the 1970s and the 1990s at 1/5,000 or 1/10,000 scale, covering some areas of the most industrialized and urbanized cities (Liège, Charleroi and Mons). They were based on soil and subsoil data points (borings, drillings, penetration tests, geophysical tests, outcrops, etc.). Some of the displayed data present the depth (with isoheights) or the thickness (with isopachs) of the different subsoil layers up to about 50 m depth. Information about the geomechanical properties of each subsoil layer, useful for engineers and urban planners, is also synthesized. However, these maps were built up only on paper and progressively needed to be updated with new soil and subsoil data. The Public Service of Wallonia and the University of Liège have recently initiated a study to evaluate the feasibility of developing engineering geological mapping with a computerized approach. Numerous and varied data about soil and subsoil are stored in a georelational database (the geotechnical database, using Access, Microsoft®). All the data are geographically referenced. The database is linked to a GIS project (using ArcGIS, ESRI®). Together, the database and the GIS project constitute a powerful tool for spatial data management and analysis. This approach involves a methodology using interpolation methods to update the previous maps and to extend the coverage to new areas. The location (x, y, z) of each subsoil layer is then computed from the data points. The geomechanical data of these layers are synthesized in an explanatory booklet accompanying the maps.

  2. Kernel Methods for Mining Instance Data in Ontologies

    Bloehdorn, Stephan; Sure, York

    The amount of ontologies and metadata available on the Web is constantly growing. The successful application of machine learning techniques for learning ontologies from textual data, i.e. mining for the Semantic Web, contributes to this trend. However, no principled approaches exist so far for mining from the Semantic Web. We investigate how machine learning algorithms can be made amenable to directly taking advantage of the rich knowledge expressed in ontologies and associated instance data. Kernel methods have been successfully employed in various learning tasks and provide a clean framework for interfacing between non-vectorial data and machine learning algorithms. In this spirit, we express the problem of mining instances in ontologies as the problem of defining valid corresponding kernels. We present a principled framework for designing such kernels by means of decomposing the kernel computation into specialized kernels for selected characteristics of an ontology, which can be flexibly assembled and tuned. Initial experiments on real-world Semantic Web data show promising results and demonstrate the usefulness of our approach.
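
    A minimal sketch of the kernel-assembly idea: one specialized kernel per ontology characteristic (here an invented class-overlap kernel and a linear kernel on a numeric data property), combined as a non-negative weighted sum, which is again a valid kernel.

        import numpy as np

        def set_kernel(sets):
            # Kernel on instances described by sets of ontology classes: k(a, b) = |A intersect B|
            n = len(sets)
            K = np.zeros((n, n))
            for i in range(n):
                for j in range(n):
                    K[i, j] = len(sets[i] & sets[j])
            return K

        def combine(kernels, weights):
            # A non-negative weighted sum of kernels is itself a valid kernel
            return sum(w * K for w, K in zip(weights, kernels))

        classes = [{"Person", "Researcher"}, {"Person", "Student"}]   # asserted classes per instance
        ages = np.array([[34.0], [27.0]])                             # a numeric data property
        K_class = set_kernel(classes)
        K_age = ages @ ages.T                                         # linear kernel on the property
        print(combine([K_class, K_age], [1.0, 0.01]))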

  3. A sensor and video based ontology for activity recognition in smart environments.

    Mitchell, D; Morrow, Philip J; Nugent, Chris D

    2014-01-01

    Activity recognition is used in a wide range of applications including healthcare and security. In a smart environment, activity recognition can be used to monitor and support the activities of a user. A range of methods has been used in activity recognition, including sensor-based approaches, vision-based approaches and ontological approaches. This paper presents a novel approach to activity recognition in a smart home environment which combines sensor and video data through an ontological framework. The ontology describes the relationships and interactions between activities, the user, objects, sensors and video data.

  4. OntoCAT--simple ontology search and integration in Java, R and REST/JavaScript.

    Adamusiak, Tomasz; Burdett, Tony; Kurbatova, Natalja; Joeri van der Velde, K; Abeygunawardena, Niran; Antonakaki, Despoina; Kapushesky, Misha; Parkinson, Helen; Swertz, Morris A

    2011-05-29

    Ontologies have become an essential asset in the bioinformatics toolbox and a number of ontology access resources are now available, for example, the EBI Ontology Lookup Service (OLS) and the NCBO BioPortal. However, these resources differ substantially in mode, ease of access, and ontology content. This makes it relatively difficult to access each ontology source separately, map their contents to research data, and much of this effort is being replicated across different research groups. OntoCAT provides a seamless programming interface to query heterogeneous ontology resources including OLS and BioPortal, as well as user-specified local OWL and OBO files. Each resource is wrapped behind easy to learn Java, Bioconductor/R and REST web service commands enabling reuse and integration of ontology software efforts despite variation in technologies. It is also available as a stand-alone MOLGENIS database and a Google App Engine application. OntoCAT provides a robust, configurable solution for accessing ontology terms specified locally and from remote services, is available as a stand-alone tool and has been tested thoroughly in the ArrayExpress, MOLGENIS, EFO and Gen2Phen phenotype use cases. http://www.ontocat.org.

  5. Approaches in Characterizing Genetic Structure and Mapping in a Rice Multiparental Population.

    Raghavan, Chitra; Mauleon, Ramil; Lacorte, Vanica; Jubay, Monalisa; Zaw, Hein; Bonifacio, Justine; Singh, Rakesh Kumar; Huang, B Emma; Leung, Hei

    2017-06-07

    Multi-parent Advanced Generation Intercross (MAGIC) populations are fast becoming mainstream tools for research and breeding, along with the technology and tools for analysis. This paper demonstrates the analysis of a rice MAGIC population from data filtering to imputation and processing of genetic data to characterizing genomic structure, and finally quantitative trait loci (QTL) mapping. In this study, 1316 S6:8 indica MAGIC (MI) lines and the eight founders were sequenced using Genotyping by Sequencing (GBS). As the GBS approach often includes missing data, the first step was to impute the missing SNPs. The observable number of recombinations in the population was then explored. Based on this case study, a general outline of procedures for a MAGIC analysis workflow is provided, as well as for QTL mapping of agronomic traits and biotic and abiotic stress, using the results from both association and interval mapping approaches. QTL for agronomic traits (yield, flowering time, and plant height), physical (grain length and grain width) and cooking properties (amylose content) of the rice grain, abiotic stress (submergence tolerance), and biotic stress (brown spot disease) were mapped. Through presenting this extensive analysis in the MI population in rice, we highlight important considerations when choosing analytical approaches. The methods and results reported in this paper will provide a guide to future genetic analysis methods applied to multi-parent populations. Copyright © 2017 Raghavan et al.

  6. Approaches in Characterizing Genetic Structure and Mapping in a Rice Multiparental Population

    Chitra Raghavan

    2017-06-01

    Full Text Available Multi-parent Advanced Generation Intercross (MAGIC) populations are fast becoming mainstream tools for research and breeding, along with the technology and tools for analysis. This paper demonstrates the analysis of a rice MAGIC population from data filtering to imputation and processing of genetic data to characterizing genomic structure, and finally quantitative trait loci (QTL) mapping. In this study, 1316 S6:8 indica MAGIC (MI) lines and the eight founders were sequenced using Genotyping by Sequencing (GBS). As the GBS approach often includes missing data, the first step was to impute the missing SNPs. The observable number of recombinations in the population was then explored. Based on this case study, a general outline of procedures for a MAGIC analysis workflow is provided, as well as for QTL mapping of agronomic traits and biotic and abiotic stress, using the results from both association and interval mapping approaches. QTL for agronomic traits (yield, flowering time, and plant height), physical (grain length and grain width) and cooking properties (amylose content) of the rice grain, abiotic stress (submergence tolerance), and biotic stress (brown spot disease) were mapped. Through presenting this extensive analysis in the MI population in rice, we highlight important considerations when choosing analytical approaches. The methods and results reported in this paper will provide a guide to future genetic analysis methods applied to multi-parent populations.

  7. An automated approach for mapping persistent ice and snow cover over high latitude regions

    Selkowitz, David J.; Forster, Richard R.

    2016-01-01

    We developed an automated approach for mapping persistent ice and snow cover (glaciers and perennial snowfields) from Landsat TM and ETM+ data across a variety of topography, glacier types, and climatic conditions at high latitudes (above ~65°N). Our approach exploits all available Landsat scenes acquired during the late summer (1 August–15 September) over a multi-year period and employs an automated cloud masking algorithm optimized for snow and ice covered mountainous environments. Pixels from individual Landsat scenes were classified as snow/ice covered or snow/ice free based on the Normalized Difference Snow Index (NDSI), and pixels consistently identified as snow/ice covered over a five-year period were classified as persistent ice and snow cover. The same NDSI and ratio of snow/ice-covered days to total days thresholds applied consistently across eight study regions resulted in persistent ice and snow cover maps that agreed closely in most areas with glacier area mapped for the Randolph Glacier Inventory (RGI), with a mean accuracy (agreement with the RGI) of 0.96, a mean precision (user’s accuracy of the snow/ice cover class) of 0.92, a mean recall (producer’s accuracy of the snow/ice cover class) of 0.86, and a mean F-score (a measure that considers both precision and recall) of 0.88. We also compared results from our approach to glacier area mapped from high spatial resolution imagery at four study regions and found similar results. Accuracy was lowest in regions with substantial areas of debris-covered glacier ice, suggesting that manual editing would still be required in these regions to achieve reasonable results. The similarity of our results to those from the RGI as well as glacier area mapped from high spatial resolution imagery suggests it should be possible to apply this approach across large regions to produce updated 30-m resolution maps of persistent ice and snow cover. In the short term, automated PISC maps can be used to rapidly
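
    A minimal sketch of the per-pixel logic described above, with illustrative thresholds (the 0.4 NDSI cut-off and the 0.9 persistence ratio are common choices, not necessarily the study's calibrated values): compute NDSI per scene from green and shortwave-infrared reflectance, flag snow/ice per scene, and keep pixels flagged in a sufficient fraction of cloud-free observations.

        import numpy as np

        def persistent_snow_ice(green, swir, valid, ndsi_thresh=0.4, persist_ratio=0.9):
            # green, swir: (scenes, rows, cols) reflectance stacks; valid: True where cloud-free
            ndsi = (green - swir) / (green + swir + 1e-6)
            snow = (ndsi > ndsi_thresh) & valid             # per-scene snow/ice flag
            n_valid = valid.sum(axis=0)
            frac = snow.sum(axis=0) / np.maximum(n_valid, 1)
            return (n_valid > 0) & (frac >= persist_ratio)  # persistent ice and snow mask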

  8. Use of the CIM Ontology

    Neumann, Scott; Britton, Jay; Devos, Arnold N.; Widergren, Steven E.

    2006-02-08

    There are many uses for the Common Information Model (CIM), an ontology that is being standardized through Technical Committee 57 of the International Electrotechnical Commission (IEC TC57). The most common uses to date have included application modeling, information exchanges, information management and systems integration. As one should expect, there are many issues that become apparent when the CIM ontology is applied to any one use. Some of these issues are shortcomings within the current draft of the CIM, and others are a consequence of the different ways in which the CIM can be applied using different technologies. As the CIM ontology will and should evolve, there are several dangers that need to be recognized. One is overall consistency and impact upon applications when extending the CIM for a specific need. Another is that a tight coupling of the CIM to specific technologies could limit the value of the CIM in the longer term as an ontology, which becomes a larger issue over time as new technologies emerge. The integration of systems is one specific area of interest for application of the CIM ontology. This is an area dominated by the use of XML for the definition of messages. While this is certainly true when using Enterprise Application Integration (EAI) products, it is even more true with the movement towards the use of Web Services (WS), Service-Oriented Architectures (SOA) and Enterprise Service Buses (ESB) for integration. This general IT industry trend is consistent with trends seen within the IEC TC57 scope of power system management and associated information exchange. The challenge for TC57 is how to best leverage the CIM ontology using the various XML technologies and standards for integration. This paper will provide examples of how the CIM ontology is used and describe some specific issues that should be addressed within the CIM in order to increase its usefulness as an ontology. It will also describe some of the issues and challenges that will

  9. [Implementation of ontology-based clinical decision support system for management of interactions between antihypertensive drugs and diet].

    Park, Jeong Eun; Kim, Hwa Sun; Chang, Min Jung; Hong, Hae Sook

    2014-06-01

    The influence of dietary composition on blood pressure is an important subject in healthcare. Interactions between antihypertensive drugs and diet (IBADD) is the most important factor in the management of hypertension. It is therefore essential to support healthcare providers' decision making role in active and continuous interaction control in hypertension management. The aim of this study was to implement an ontology-based clinical decision support system (CDSS) for IBADD management (IBADDM). We considered the concepts of antihypertensive drugs and foods, and focused on the interchangeability between the database and the CDSS when providing tailored information. An ontology-based CDSS for IBADDM was implemented in eight phases: (1) determining the domain and scope of ontology, (2) reviewing existing ontology, (3) extracting and defining the concepts, (4) assigning relationships between concepts, (5) creating a conceptual map with CmapTools, (6) selecting upper ontology, (7) formally representing the ontology with Protégé (ver.4.3), (8) implementing an ontology-based CDSS as a JAVA prototype application. We extracted 5,926 concepts, 15 properties, and formally represented them using Protégé. An ontology-based CDSS for IBADDM was implemented and the evaluation score was 4.60 out of 5. We endeavored to map functions of a CDSS and implement an ontology-based CDSS for IBADDM.

  10. Spatial cyberinfrastructures, ontologies, and the humanities.

    Sieber, Renee E; Wellen, Christopher C; Jin, Yuan

    2011-04-05

    We report on research into building a cyberinfrastructure for Chinese biographical and geographic data. Our cyberinfrastructure contains (i) the McGill-Harvard-Yenching Library Ming Qing Women's Writings database (MQWW), the only online database on historical Chinese women's writings, (ii) the China Biographical Database, the authority for Chinese historical people, and (iii) the China Historical Geographical Information System, one of the first historical geographic information systems. Key to this integration is that linked databases retain separate identities as bases of knowledge, while they possess sufficient semantic interoperability to allow for multidatabase concepts and to support cross-database queries on an ad hoc basis. Computational ontologies create underlying semantics for database access. This paper focuses on the spatial component in a humanities cyberinfrastructure, which includes issues of conflicting data, heterogeneous data models, disambiguation, and geographic scale. First, we describe the methodology for integrating the databases. Then we detail the system architecture, which includes a tier of ontologies and schema. We describe the user interface and applications that allow for cross-database queries. For instance, users should be able to analyze the data, examine hypotheses on spatial and temporal relationships, and generate historical maps with datasets from MQWW for research, teaching, and publication on Chinese women writers, their familial relations, publishing venues, and the literary and social communities. Last, we discuss the social side of cyberinfrastructure development, as people are considered to be as critical as the technical components for its success.

  11. Spatial cyberinfrastructures, ontologies, and the humanities

    Sieber, Renee E.; Wellen, Christopher C.; Jin, Yuan

    2011-01-01

    We report on research into building a cyberinfrastructure for Chinese biographical and geographic data. Our cyberinfrastructure contains (i) the McGill-Harvard-Yenching Library Ming Qing Women's Writings database (MQWW), the only online database on historical Chinese women's writings, (ii) the China Biographical Database, the authority for Chinese historical people, and (iii) the China Historical Geographical Information System, one of the first historical geographic information systems. Key to this integration is that linked databases retain separate identities as bases of knowledge, while they possess sufficient semantic interoperability to allow for multidatabase concepts and to support cross-database queries on an ad hoc basis. Computational ontologies create underlying semantics for database access. This paper focuses on the spatial component in a humanities cyberinfrastructure, which includes issues of conflicting data, heterogeneous data models, disambiguation, and geographic scale. First, we describe the methodology for integrating the databases. Then we detail the system architecture, which includes a tier of ontologies and schema. We describe the user interface and applications that allow for cross-database queries. For instance, users should be able to analyze the data, examine hypotheses on spatial and temporal relationships, and generate historical maps with datasets from MQWW for research, teaching, and publication on Chinese women writers, their familial relations, publishing venues, and the literary and social communities. Last, we discuss the social side of cyberinfrastructure development, as people are considered to be as critical as the technical components for its success. PMID:21444819

  12. Ontology patterns for complex topographic feature types

    Varanka, Dalia E.

    2011-01-01

    Complex feature types are defined as integrated relations between basic features for a shared meaning or concept. The shared semantic concept is difficult to define in commonly used geographic information systems (GIS) and remote sensing technologies. The role of spatial relations between complex feature parts was recognized in early GIS literature, but had limited representation in the feature or coverage data models of GIS. Spatial relations are more explicitly specified in semantic technology. In this paper, semantics for topographic feature ontology design patterns (ODP) are developed as data models for the representation of complex features. In the context of topographic processes, component assemblages are supported by resource systems and are found on local landscapes. The topographic ontology is organized across six thematic modules that can account for basic feature types, resource systems, and landscape types. Types of complex feature attributes include location, generative processes and physical description. Node/edge networks model standard spatial relations and relations specific to topographic science to represent complex features. To demonstrate these concepts, data from The National Map of the U. S. Geological Survey was converted and assembled into ODP.

  13. Ontologies and Formation Spaces for Conceptual ReDesign of Systems

    J. Bíla

    2005-01-01

    Full Text Available This paper discusses ontologies, methods for developing them and languages for representing them. A special ontology for computational support of the Conceptual ReDesign Process (CRDP) is introduced with a simple illustrative example of an application. The ontology, denoted as Global context (GLB), combines features of general semantic networks and features of the UML language. The ontology is task-oriented and domain-oriented, and contains three basic strata – GLBExpl (stratum of Explanation), GLBFAct (stratum of Fields of Activities) and GLBEnv (stratum of Environment) – with their sub-strata. The ontology has been developed to represent functions of systems and their components in CRDP. The main difference between this ontology and ontologies which have been developed to identify functions (where the semantic details must be as deep as possible) is in the style of the description of the functions. In the proposed ontology, Formation Spaces were used as lower semantic categories whose semantic depth is variable and depends on the actual solution approach of a specialised Conceptual Designer.

  14. Ontology construction and application in practice case study of health tourism in Thailand.

    Chantrapornchai, Chantana; Choksuchat, Chidchanok

    2016-01-01

    An ontology is one of the key components of the semantic web. It contains the core knowledge for an effective search. However, building an ontology requires carefully collected knowledge which is very domain-sensitive. In this work, we present the practice of ontology construction for a case study of health tourism in Thailand. The whole process follows the METHONTOLOGY approach, which consists of the phases: information gathering, corpus study, ontology engineering, evaluation, publishing, and application construction. Different sources of data, such as structured web documents (HTML) and other documents, are acquired in the information gathering process. Tourism corpora from various tourism texts and standards are explored. The ontology is evaluated in two ways: automatic reasoning using Pellet and RacerPro, and questionnaires completed by experts in the relevant domains (tourism domain experts and ontology experts). The usability of the ontology is demonstrated via the semantic web application and via example axioms. The developed ontology is the first health tourism ontology in Thailand with a published application.

  15. A Hybrid Color Mapping Approach to Fusing MODIS and Landsat Images for Forward Prediction

    Chiman Kwan; Bence Budavari; Feng Gao; Xiaolin Zhu

    2018-01-01

    We present a new, simple, and efficient approach to fusing MODIS and Landsat images. It is well known that MODIS images have high temporal resolution and low spatial resolution, whereas Landsat images are just the opposite. Similar to earlier approaches, our goal is to fuse MODIS and Landsat images to yield high spatial and high temporal resolution images. Our approach consists of two steps. First, a mapping is established between two MODIS images, where one is at an earlier time, t1, and the...

  16. Inferring ontology graph structures using OWL reasoning

    Rodriguez-Garcia, Miguel Angel

    2018-01-05

    Ontologies are representations of a conceptualization of a domain. Traditionally, ontologies in biology were represented as directed acyclic graphs (DAG) which represent the backbone taxonomy and additional relations between classes. These graphs are widely exploited for data analysis in the form of ontology enrichment or computation of semantic similarity. More recently, ontologies are developed in a formal language such as the Web Ontology Language (OWL) and consist of a set of axioms through which classes are defined or constrained. While the taxonomy of an ontology can be inferred directly from the axioms of an ontology as one of the standard OWL reasoning tasks, creating general graph structures from OWL ontologies that exploit the ontologies' semantic content remains a challenge. We developed a method to transform ontologies into graphs using an automated reasoner while taking into account all relations between classes. Searching for (existential) patterns in the deductive closure of ontologies, we can identify relations between classes that are implied but not asserted and generate graph structures that encode for a large part of the ontologies' semantic content. We demonstrate the advantages of our method by applying it to inference of protein-protein interactions through semantic similarity over the Gene Ontology and demonstrate that performance is increased when graph structures are inferred using deductive inference according to our method. Our software and experiment results are available at http://github.com/bio-ontology-research-group/Onto2Graph . Onto2Graph is a method to generate graph structures from OWL ontologies using automated reasoning. The resulting graphs can be used for improved ontology visualization and ontology-based data analysis.

  17. Inferring ontology graph structures using OWL reasoning.

    Rodríguez-García, Miguel Ángel; Hoehndorf, Robert

    2018-01-05

    Ontologies are representations of a conceptualization of a domain. Traditionally, ontologies in biology were represented as directed acyclic graphs (DAG) which represent the backbone taxonomy and additional relations between classes. These graphs are widely exploited for data analysis in the form of ontology enrichment or computation of semantic similarity. More recently, ontologies are developed in a formal language such as the Web Ontology Language (OWL) and consist of a set of axioms through which classes are defined or constrained. While the taxonomy of an ontology can be inferred directly from the axioms of an ontology as one of the standard OWL reasoning tasks, creating general graph structures from OWL ontologies that exploit the ontologies' semantic content remains a challenge. We developed a method to transform ontologies into graphs using an automated reasoner while taking into account all relations between classes. Searching for (existential) patterns in the deductive closure of ontologies, we can identify relations between classes that are implied but not asserted and generate graph structures that encode for a large part of the ontologies' semantic content. We demonstrate the advantages of our method by applying it to inference of protein-protein interactions through semantic similarity over the Gene Ontology and demonstrate that performance is increased when graph structures are inferred using deductive inference according to our method. Our software and experiment results are available at http://github.com/bio-ontology-research-group/Onto2Graph . Onto2Graph is a method to generate graph structures from OWL ontologies using automated reasoning. The resulting graphs can be used for improved ontology visualization and ontology-based data analysis.
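
    The transformation described above can be illustrated with a small, hedged sketch. The authors' Onto2Graph tool (linked above) is a Java program, so the snippet below is only an approximation in Python, assuming owlready2 (with a Java reasoner available), networkx, and a local file example.owl; it reads subclass axioms and existential restrictions after reasoning and turns them into labelled graph edges.

```python
# Approximate sketch (not the authors' Onto2Graph): build a relation graph from
# an OWL ontology using owlready2's bundled reasoner and networkx.
import networkx as nx
from owlready2 import get_ontology, sync_reasoner, Restriction, ThingClass

onto = get_ontology("file://example.owl").load()   # hypothetical local ontology file

with onto:
    sync_reasoner()                                # materialize the inferred class hierarchy

g = nx.MultiDiGraph()
for cls in onto.classes():
    for parent in cls.is_a:
        if isinstance(parent, ThingClass):
            # plain (asserted or inferred) subclass edge
            g.add_edge(cls.name, parent.name, relation="is_a")
        elif isinstance(parent, Restriction):
            filler = getattr(parent, "value", None)
            if isinstance(filler, ThingClass):
                # restriction such as 'part_of some X' becomes a labelled edge
                g.add_edge(cls.name, filler.name, relation=parent.property.name)

print(g.number_of_nodes(), g.number_of_edges())
```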

  18. Toward semantic interoperability with linked foundational ontologies in ROMULUS

    Khan, ZC

    2013-06-01

    Full Text Available A purpose of a foundational ontology is to solve interoperability issues among ontologies. Many foundational ontologies have been developed, reintroducing the ontology interoperability problem. We address this with the new online foundational...

  19. Clustering of the Self-Organizing Map based Approach in Induction Machine Rotor Faults Diagnostics

    Ahmed TOUMI

    2009-12-01

    Full Text Available Self-Organizing Maps (SOM) are an excellent method of analyzing multidimensional data. SOM-based classification is attractive due to its unsupervised learning and topology-preserving properties. In this paper, the performance of self-organizing methods is investigated in induction motor rotor fault detection and severity evaluation. The SOM is based on motor current signature analysis (MCSA). An agglomerative hierarchical algorithm using Ward's method is applied to automatically divide the map into interesting, interpretable groups of map units that correspond to clusters in the input data. The results obtained with this approach make it possible to detect a rotor bar fault directly from the visualization results. The system is also able to estimate the extent of rotor faults.
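
    As an illustration of the pipeline summarized above (SOM training followed by Ward clustering of the map units), here is a hedged Python sketch; it is not the authors' implementation, and the MiniSom library, map size, cluster count and random MCSA feature matrix are all assumptions made for the example.

```python
# Minimal sketch of a SOM-plus-Ward pipeline on placeholder MCSA feature vectors.
import numpy as np
from minisom import MiniSom                      # third-party SOM implementation
from scipy.cluster.hierarchy import linkage, fcluster

features = np.random.rand(500, 8)                # placeholder MCSA feature vectors

som = MiniSom(10, 10, features.shape[1], sigma=1.5, learning_rate=0.5, random_seed=0)
som.train_random(features, 5000)                 # unsupervised, topology-preserving training

# Agglomerative (Ward) clustering of the map prototypes, i.e. the codebook vectors.
codebook = som.get_weights().reshape(-1, features.shape[1])
clusters = fcluster(linkage(codebook, method="ward"), t=3, criterion="maxclust")

# Each sample inherits the cluster of its best-matching map unit.
bmu_index = [np.ravel_multi_index(som.winner(x), (10, 10)) for x in features]
labels = clusters[bmu_index]
print(np.bincount(labels))
```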

  20. Complex Topographic Feature Ontology Patterns

    Varanka, Dalia E.; Jerris, Thomas J.

    2015-01-01

    Semantic ontologies are examined as effective data models for the representation of complex topographic feature types. Complex feature types are viewed as integrated relations between basic features for a basic purpose. In the context of topographic science, such component assemblages are supported by resource systems and found on the local landscape. Ontologies are organized within six thematic modules of a domain ontology called Topography that includes within its sphere basic feature types, resource systems, and landscape types. Context is constructed not only as a spatial and temporal setting, but a setting also based on environmental processes. Types of spatial relations that exist between components include location, generative processes, and description. An example is offered in a complex feature type ‘mine.’ The identification and extraction of complex feature types are an area for future research.
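
    The node/edge modelling of a complex feature can be pictured with a toy sketch; the component features and relation names below are invented for illustration and are not taken from the Topography ontology itself.

```python
# Toy node/edge representation of the complex feature type 'mine' mentioned above:
# component features linked by (invented) spatial and topographic relations.
import networkx as nx

mine = nx.MultiDiGraph(name="mine")
mine.add_edge("mine", "shaft", relation="hasPart")
mine.add_edge("mine", "tailings_pond", relation="hasPart")
mine.add_edge("mine", "access_road", relation="hasPart")
mine.add_edge("shaft", "ore_body", relation="connectedTo")
mine.add_edge("tailings_pond", "shaft", relation="adjacentTo")
mine.add_edge("mine", "mountain_landscape", relation="locatedOn")

for u, v, data in mine.edges(data=True):
    print(f"{u} --{data['relation']}--> {v}")
```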

  1. Geographic Ontologies, Gazetteers and Multilingualism

    Robert Laurini

    2015-01-01

    Full Text Available Different languages imply different visions of space, so terminologies differ across geographic ontologies. In addition to their geometric shapes, geographic features have names, which often differ between languages. Moreover, the role of gazetteers, as dictionaries of place names (toponyms), is to maintain relations between place names and locations. The scope of geographic information retrieval is to search for geographic information not against a database but against the whole Internet; since the Internet stores information in many languages, it is of paramount importance not to be restricted to a single language. In this paper, our first step is to clarify the links between geographic objects as computer representations of geographic features, ontologies and gazetteers designed in various languages. Then, we propose some inference rules for matching not only types, but also relations in geographic ontologies with the assistance of gazetteers.
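
    One way to picture a gazetteer-assisted matching rule of this kind is sketched below; the rule (match two type labels when the toponym sets the gazetteers assign to them overlap strongly), the Jaccard threshold and the toy gazetteers are illustrative assumptions, not the paper's actual rules.

```python
# Illustrative gazetteer-assisted matching rule: two type labels from ontologies
# in different languages are proposed as equivalent when the toponym sets the
# gazetteers assign to them overlap strongly (Jaccard similarity above a threshold).
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def propose_type_matches(gazetteer_fr, gazetteer_en, threshold=0.6):
    """Each gazetteer maps a type label to the toponyms classified under it."""
    matches = []
    for type_fr, places_fr in gazetteer_fr.items():
        for type_en, places_en in gazetteer_en.items():
            score = jaccard(places_fr, places_en)
            if score >= threshold:
                matches.append((type_fr, type_en, round(score, 2)))
    return matches

# Hypothetical toy gazetteers keyed by type label.
fr = {"fleuve": {"Rhône", "Loire", "Seine"}, "ville": {"Lyon", "Paris"}}
en = {"river": {"Rhône", "Loire", "Seine", "Thames"}, "city": {"Lyon", "Paris", "London"}}
print(propose_type_matches(fr, en))   # [('fleuve', 'river', 0.75), ('ville', 'city', 0.67)]
```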

  2. Ontology Matching with Semantic Verification.

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.
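
    To make the lexical component concrete, the sketch below computes a purely string-based alignment between two label lists; it is far simpler than ASMOV (no structural or extensional matching, no WordNet/UMLS lookup, no semantic verification), and the cutoff and labels are invented.

```python
# A sketch of a purely lexical matcher between two ontologies' class labels,
# in the spirit of (but far simpler than) ASMOV's lexical component.
from difflib import SequenceMatcher

def label_similarity(a, b):
    a, b = a.lower().replace("_", " "), b.lower().replace("_", " ")
    return SequenceMatcher(None, a, b).ratio()

def lexical_alignment(labels_1, labels_2, cutoff=0.8):
    alignment = []
    for l1 in labels_1:
        best = max(labels_2, key=lambda l2: label_similarity(l1, l2))
        score = label_similarity(l1, best)
        if score >= cutoff:
            alignment.append((l1, best, round(score, 2)))
    return alignment

print(lexical_alignment(["Heart_Disease", "Blood_Vessel"],
                        ["heart disease", "vessel of blood", "artery"]))
```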

  3. Cyber indicators of compromise: a domain ontology for security information and event management

    Rowell, Marsha D.

    2017-03-01

    heuristics, mapping, and detection. CybOX is aimed at supporting a broad range of important cyber security domains to include [31]: • Digital...

  4. ADVANCED EARTH OBSERVATION APPROACH FOR MULTISCALE FOREST ECOSYSTEM SERVICES MODELING AND MAPPING (MIMOSE)

    G. Chirici

    2014-04-01

    Full Text Available In the last decade, ecosystem services (ES) have been proposed as a method for quantifying the multifunctional role of forest ecosystems. Their spatial distribution over large areas is frequently limited by a lack of information, because field data collection with traditional methods requires much effort in terms of time and cost. In this contribution we propose a methodology (namely, MultIscale Mapping Of ecoSystem servicEs - MIMOSE) based on the integration of remotely sensed images and field observations to produce a wall-to-wall geodatabase of forest parcels accompanied by several attributes useful as a basis for future trade-off analysis of different ES. Here, we present the application of the MIMOSE approach to a study area of 443,758 hectares coincident with the administrative Molise Region in Central Italy. The procedure is based on a local high-resolution forest types map integrated with information on the main forest management approaches. Through the non-parametric k-Nearest Neighbors technique, we produced a growing stock volume map integrating a local forest inventory with multispectral satellite IRS LISS III imagery. From the growing stock volume map we derived a forest age map for even-aged forest types. This information was then used to automatically create a vector forest parcel map by multidimensional image segmentation, whose parcels were finally populated with a number of attributes useful for ES spatial estimation. The contribution briefly introduces the MIMOSE methodology and presents the preliminary results we achieved, which constitute the basis for a future implementation of ES modeling.
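
    The k-Nearest Neighbors step can be sketched as follows, assuming scikit-learn and using random placeholders for the inventory plots and the multispectral bands; it only illustrates the kind of k-NN imputation described, not the MIMOSE implementation.

```python
# Sketch of k-NN imputation of growing stock volume: field inventory plots serve
# as the reference set, and volume is predicted pixel-by-pixel from spectral bands.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
plot_spectra = rng.uniform(0, 1, size=(200, 4))    # 4 spectral bands at inventory plots
plot_volume = rng.uniform(50, 400, size=200)       # measured growing stock (m3/ha)

knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
knn.fit(plot_spectra, plot_volume)

image = rng.uniform(0, 1, size=(100, 100, 4))      # a 100x100 pixel multispectral tile
volume_map = knn.predict(image.reshape(-1, 4)).reshape(100, 100)
print(volume_map.mean())
```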

  5. Resident Space Object Characterization and Behavior Understanding via Machine Learning and Ontology-based Bayesian Networks

    Furfaro, R.; Linares, R.; Gaylor, D.; Jah, M.; Walls, R.

    2016-09-01

    In this paper, we present an end-to-end approach that employs machine learning techniques and Ontology-based Bayesian Networks (BN) to characterize the behavior of resident space objects. State-of-the-Art machine learning architectures (e.g. Extreme Learning Machines, Convolutional Deep Networks) are trained on physical models to learn the Resident Space Object (RSO) features in the vectorized energy and momentum states and parameters. The mapping from measurements to vectorized energy and momentum states and parameters enables behavior characterization via clustering in the features space and subsequent RSO classification. Additionally, Space Object Behavioral Ontologies (SOBO) are employed to define and capture the domain knowledge-base (KB) and BNs are constructed from the SOBO in a semi-automatic fashion to execute probabilistic reasoning over conclusions drawn from trained classifiers and/or directly from processed data. Such an approach enables integrating machine learning classifiers and probabilistic reasoning to support higher-level decision making for space domain awareness applications. The innovation here is to use these methods (which have enjoyed great success in other domains) in synergy so that it enables a "from data to discovery" paradigm by facilitating the linkage and fusion of large and disparate sources of information via a Big Data Science and Analytics framework.

  6. Ontology Based Quality Evaluation for Spatial Data

    Yılmaz, C.; Cömert, Ç.

    2015-08-01

    Many institutions will be providing data to the National Spatial Data Infrastructure (NSDI). The current technical background of the NSDI is based on syntactic web services, which are expected to be replaced by semantic web services. The quality of the data provided is important for the decision-making process and the accuracy of transactions; therefore, the data quality needs to be tested. This topic has been neglected in Turkey. Data quality control for the NSDI may be done by private or public "data accreditation" institutions. A methodology is required for data quality evaluation. There are studies on data quality, including ISO standards, academic studies and software to evaluate spatial data quality. The ISO 19157 standard defines the data quality elements. Proprietary software such as 1Spatial's 1Validate and ESRI's Data Reviewer offers quality evaluation based on its own classification of rules. Commonly, rule-based approaches are used for geospatial data quality checks. In this study, we look for the technical components to devise and implement a rule-based approach with ontologies using free and open source software in a semantic web context. The semantic web uses ontologies to deliver well-defined web resources and make them accessible to end-users and processes. We have created an ontology conforming to the geospatial data and defined some sample rules to show how to test data with respect to data quality elements, including attribute, topo-semantic and geometrical consistency, using free and open source software. To test data against the rules, sample GeoSPARQL queries are created and associated with specifications.
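
    A minimal sketch of a rule-based check in this spirit is shown below, assuming rdflib; the vocabulary (ex:Road, ex:laneCount) and the rule are invented, and a real deployment of topological GeoSPARQL functions would need a GeoSPARQL-aware triple store rather than plain rdflib.

```python
# Sketch of a rule-based attribute-consistency check over RDF-encoded spatial data.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/roads#")
g = Graph()
g.add((EX.road1, RDF.type, EX.Road))
g.add((EX.road1, EX.laneCount, Literal(0)))      # violates the rule below
g.add((EX.road2, RDF.type, EX.Road))
g.add((EX.road2, EX.laneCount, Literal(2)))

# Rule: every road must declare a positive lane count.
violations = g.query("""
    PREFIX ex: <http://example.org/roads#>
    SELECT ?road ?lanes WHERE {
        ?road a ex:Road ; ex:laneCount ?lanes .
        FILTER (?lanes < 1)
    }
""")
for road, lanes in violations:
    print(f"quality violation: {road} has laneCount {lanes}")
```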

  7. A practical and automated approach to large area forest disturbance mapping with remote sensing.

    Mutlu Ozdogan

    Full Text Available In this paper, I describe a set of procedures that automate forest disturbance mapping using a pair of Landsat images. The approach is built on the traditional pair-wise change detection method, but is designed to extract training data without user interaction and uses a robust classification algorithm capable of handling incorrectly labeled training data. The steps in this procedure include: (i) creating masks for water, non-forested areas, clouds, and cloud shadows; (ii) identifying training pixels whose value is above or below a threshold defined by the number of standard deviations from the mean value of the histograms generated from local windows in the short-wave infrared (SWIR) difference image; (iii) filtering the original training data through a number of classification algorithms using an n-fold cross validation to eliminate mislabeled training samples; and finally, (iv) mapping forest disturbance using a supervised classification algorithm. When applied to 17 Landsat footprints across the U.S. at five-year intervals between 1985 and 2010, the proposed approach produced forest disturbance maps with 80 to 95% overall accuracy, comparable to those obtained from traditional approaches to forest change detection. The primary sources of misclassification errors included inaccurate identification of forests (errors of commission), issues related to the land/water mask, and clouds and cloud shadows missed during image screening. The approach requires images from the peak growing season, at least for the deciduous forest sites, and cannot readily distinguish forest harvest from natural disturbances or other types of land cover change. The accuracy of detecting forest disturbance diminishes with the number of years between the images that make up the image pair. Nevertheless, the relatively high accuracies, little or no user input needed for processing, speed of map production, and simplicity of the approach make the new method especially practical for
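
    Step (ii) above can be sketched with numpy and scipy as follows; the window size, the k multiplier and the random images are placeholders, and the real procedure works on histograms of local windows of the Landsat SWIR difference rather than on synthetic data.

```python
# Sketch of step (ii): candidate training pixels are those whose SWIR difference
# lies far from (or close to) the local mean, measured in local standard deviations.
import numpy as np
from scipy.ndimage import uniform_filter

def candidate_training_pixels(swir_t1, swir_t2, window=101, k=2.0):
    diff = swir_t2.astype(float) - swir_t1.astype(float)
    local_mean = uniform_filter(diff, size=window)
    local_var = uniform_filter(diff ** 2, size=window) - local_mean ** 2
    local_std = np.sqrt(np.clip(local_var, 0, None))
    changed = np.abs(diff - local_mean) > k * local_std      # both tails: candidate disturbance
    stable = np.abs(diff - local_mean) < 0.5 * local_std     # near the local mean: candidate no-change
    return changed, stable

swir_t1 = np.random.rand(500, 500)   # placeholder SWIR band at time 1
swir_t2 = np.random.rand(500, 500)   # placeholder SWIR band at time 2
changed, stable = candidate_training_pixels(swir_t1, swir_t2)
print(changed.sum(), stable.sum())
```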

  8. A novel intra-operative, high-resolution atrial mapping approach.

    Yaksh, Ameeta; van der Does, Lisette J M E; Kik, Charles; Knops, Paul; Oei, Frans B S; van de Woestijne, Pieter C; Bekkers, Jos A; Bogers, Ad J J C; Allessie, Maurits A; de Groot, Natasja M S

    2015-12-01

    A new technique is demonstrated for extensive high-resolution intra-operative atrial mapping that will facilitate the localization of atrial fibrillation (AF) sources and identification of the substrate perpetuating AF. Prior to the start of extracorporeal circulation, an 8 × 24-electrode array (2-mm inter-electrode distance) is placed successively on all right and left epicardial atrial sites, including Bachmann's bundle, for recording of unipolar electrograms during sinus rhythm and (induced) AF. AF is induced by high-frequency pacing at the right atrial free wall. A pacemaker wire stitched to the right atrium serves as a reference signal. The indifferent pole is connected to a steel wire fixed to subcutaneous tissue. Electrograms are recorded by a computerized mapping system and, after amplification (gain 1000), filtering (bandwidth 0.5-400 Hz), sampling (1 kHz) and analogue-to-digital conversion (16 bits), automatically stored on hard disk. During the mapping procedure, real-time visualization secures electrogram quality. Analysis will be performed offline. This technique was performed in 168 patients aged 18 years and older, with coronary and/or structural heart disease, with or without AF, electively scheduled for cardiac surgery and with a ventricular ejection fraction above 40%. The mean duration of the entire mapping procedure, including preparation time, was 9 ± 2 min. Complications related to the mapping procedure during or after cardiac surgery were not observed. We introduce the first epicardial atrial mapping approach with a high resolution of ≥1728 recording sites, which can be performed in a procedure time of only 9 ± 2 min. This mapping technique can potentially identify areas responsible for initiation and persistence of AF and can hopefully individualize both diagnosis and therapy of AF.
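
    The signal-conditioning figures quoted above (0.5-400 Hz bandwidth at a 1 kHz sampling rate) can be reproduced on a synthetic electrogram with scipy; the sketch below is an illustration only and is unrelated to the mapping system's actual acquisition software.

```python
# Sketch of 0.5-400 Hz band-pass filtering at a 1 kHz sampling rate on a toy signal.
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 1000.0                                   # sampling rate, Hz
t = np.arange(0, 2.0, 1 / fs)
electrogram = np.sin(2 * np.pi * 7 * t) + 0.2 * np.random.randn(t.size)  # toy electrogram

sos = butter(4, [0.5, 400], btype="bandpass", fs=fs, output="sos")
filtered = sosfiltfilt(sos, electrogram)      # zero-phase 0.5-400 Hz band-pass
print(filtered.shape)
```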

  9. Analysis of multiplex gene expression maps obtained by voxelation.

    An, Li; Xie, Hongbo; Chin, Mark H; Obradovic, Zoran; Smith, Desmond J; Megalooikonomou, Vasileios

    2009-04-29

    Gene expression signatures in the mammalian brain hold the key to understanding neural development and neurological disease. Researchers have previously used voxelation in combination with microarrays for acquisition of genome-wide atlases of expression patterns in the mouse brain. On the other hand, some work has been performed on studying gene functions, without taking into account the location information of a gene's expression in a mouse brain. In this paper, we present an approach for identifying the relation between gene expression maps obtained by voxelation and gene functions. To analyze the dataset, we chose typical genes as queries and aimed at discovering similar gene groups. Gene similarity was determined by using the wavelet features extracted from the left and right hemispheres averaged gene expression maps, and by the Euclidean distance between each pair of feature vectors. We also performed a multiple clustering approach on the gene expression maps, combined with hierarchical clustering. Among each group of similar genes and clusters, the gene function similarity was measured by calculating the average gene function distances in the gene ontology structure. By applying our methodology to find similar genes to certain target genes we were able to improve our understanding of gene expression patterns and gene functions. By applying the clustering analysis method, we obtained significant clusters, which have both very similar gene expression maps and very similar gene functions respectively to their corresponding gene ontologies. The cellular component ontology resulted in prominent clusters expressed in cortex and corpus callosum. The molecular function ontology gave prominent clusters in cortex, corpus callosum and hypothalamus. The biological process ontology resulted in clusters in cortex, hypothalamus and choroid plexus. Clusters from all three ontologies combined were most prominently expressed in cortex and corpus callosum. The experimental
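
    The feature-extraction and clustering pipeline can be sketched as follows, assuming PyWavelets and SciPy; the map size, wavelet, decomposition level and cluster count are placeholder choices, not those used in the study.

```python
# Sketch of the described pipeline: 2-D wavelet features per expression map,
# Euclidean distances between feature vectors, then hierarchical clustering.
import numpy as np
import pywt
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

def wavelet_features(expression_map, wavelet="db1", level=2):
    coeffs = pywt.wavedec2(expression_map, wavelet=wavelet, level=level)
    arr, _ = pywt.coeffs_to_array(coeffs)     # pack all coefficients into one array
    return arr.ravel()

maps = np.random.rand(50, 32, 32)             # 50 toy voxelated expression maps
features = np.array([wavelet_features(m) for m in maps])

distances = pdist(features, metric="euclidean")           # pairwise gene (dis)similarity
clusters = fcluster(linkage(distances, method="average"), t=5, criterion="maxclust")
print(np.bincount(clusters))
```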

  10. Analysis of multiplex gene expression maps obtained by voxelation

    Smith Desmond J

    2009-04-01

    Full Text Available Abstract Background Gene expression signatures in the mammalian brain hold the key to understanding neural development and neurological disease. Researchers have previously used voxelation in combination with microarrays for acquisition of genome-wide atlases of expression patterns in the mouse brain. On the other hand, some work has been performed on studying gene functions, without taking into account the location information of a gene's expression in a mouse brain. In this paper, we present an approach for identifying the relation between gene expression maps obtained by voxelation and gene functions. Results To analyze the dataset, we chose typical genes as queries and aimed at discovering similar gene groups. Gene similarity was determined by using the wavelet features extracted from the left and right hemispheres averaged gene expression maps, and by the Euclidean distance between each pair of feature vectors. We also performed a multiple clustering approach on the gene expression maps, combined with hierarchical clustering. Among each group of similar genes and clusters, the gene function similarity was measured by calculating the average gene function distances in the gene ontology structure. By applying our methodology to find similar genes to certain target genes we were able to improve our understanding of gene expression patterns and gene functions. By applying the clustering analysis method, we obtained significant clusters, which have both very similar gene expression maps and very similar gene functions respectively to their corresponding gene ontologies. The cellular component ontology resulted in prominent clusters expressed in cortex and corpus callosum. The molecular function ontology gave prominent clusters in cortex, corpus callosum and hypothalamus. The biological process ontology resulted in clusters in cortex, hypothalamus and choroid plexus. Clusters from all three ontologies combined were most prominently expressed in

  11. Inferring ontology graph structures using OWL reasoning

    Rodriguez-Garcia, Miguel Angel; Hoehndorf, Robert

    2018-01-01

    ' semantic content remains a challenge. We developed a method to transform ontologies into graphs using an automated reasoner while taking into account all relations between classes. Searching for (existential) patterns in the deductive closure of ontologies

  12. Ontologies, Knowledge Bases and Knowledge Management

    Chalupsky, Hans

    2002-01-01

    ...) an application called Strategy Development Assistant (SDA) that uses that ontology. The JFACC ontology served as a basis for knowledge sharing among several applications in the domain of air campaign planning...

  13. Technique for designing a domain ontology

    Palagin, A. V.; Petrenko, N. G.; Malakhov, K. S.

    2018-01-01

    The article describes a technique for designing a domain ontology, shows a flowchart of the design algorithm, and considers an example of constructing a fragment of an ontology for the subject area of Computer Science.

  14. Platonic wholes and quantum ontology

    Woszczek, Marek

    2015-01-01

    The subject of the book is a reconsideration of the internalistic model of composition of the Platonic type, more radical than traditional, post-Aristotelian externalistic compositionism, and its application in the field of the ontology of quantum theory. At the centre of quantum ontology is nonseparability. Quantum wholes are atemporal wholes governed by internalistic logic and they are primitive, global physical entities, requiring an extreme relativization of the fundamental notions of mechanics. That ensures quantum theory to be fully consistent with the relativistic causal structure, with

  15. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity as real-time and streaming data in a variety of formats. These characteristics give rise to challenges in their modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework using the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement. PMID:26305223
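
    The multi-key idea can be illustrated with a plain-Python sketch that runs two toy algorithms through one map/shuffle/reduce pass by prefixing intermediate keys with an algorithm id; it shows only the data flow, not the Hadoop-based MRPack implementation or its skew mitigation.

```python
# Plain-Python sketch of the multi-key idea: several related algorithms share a
# single map/shuffle/reduce pass via (algorithm id, key) intermediate keys.
from collections import defaultdict

ALGORITHMS = {
    "wordcount": lambda rec: [(w, 1) for w in rec.split()],
    "charcount": lambda rec: [(c, 1) for c in rec if not c.isspace()],
}

def map_phase(records):
    for rec in records:
        for algo, mapper in ALGORITHMS.items():
            for key, value in mapper(rec):
                yield (algo, key), value          # multi-key: (algorithm id, key)

def reduce_phase(pairs):
    grouped = defaultdict(list)
    for multi_key, value in pairs:
        grouped[multi_key].append(value)
    return {mk: sum(vals) for mk, vals in grouped.items()}

records = ["ontology mapping approach", "ontology matching"]
results = reduce_phase(map_phase(records))
print(results[("wordcount", "ontology")])         # 2
```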

  16. Mobile Ground-Based Radar Sensor for Localization and Mapping: An Evaluation of two Approaches

    Damien Vivet

    2013-08-01

    Full Text Available This paper is concerned with robotic applications using a ground-based radar sensor for simultaneous localization and mapping problems. In mobile robotics, radar technology is interesting because of its long range and the robustness of radar waves to atmospheric conditions, making these sensors well-suited for extended outdoor robotic applications. Two localization and mapping approaches using data obtained from a 360° field of view microwave radar sensor are presented and compared. The first method is a trajectory-oriented simultaneous localization and mapping technique, which makes no landmark assumptions and avoids the data association problem. The estimation of the ego-motion makes use of the Fourier-Mellin transform for registering radar images in a sequence, from which the rotation and translation of the sensor motion can be estimated. The second approach exploits the consequences of using a rotating range sensor in high-speed robotics: in such a situation, movement combinations create distortions in the collected data. Velocimetry is achieved here by explicitly analysing these measurement distortions. As a result, the trajectory of the vehicle and then the radar map of outdoor environments can be obtained. The evaluation of experimental results obtained by the two methods is presented on real-world data from a vehicle moving at 30 km/h over a 2.5 km course.
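
    The Fourier-Mellin registration step of the first method can be sketched with scikit-image and scipy as below; the synthetic images, the chosen radius and the sign conventions are assumptions, so a real radar pipeline would need calibration against known motions.

```python
# Sketch of Fourier-Mellin style registration between two consecutive scans:
# rotation from phase correlation of log-polar magnitude spectra, then translation
# from a second phase correlation after de-rotation. Synthetic images only.
import numpy as np
from scipy.ndimage import rotate
from skimage.registration import phase_cross_correlation
from skimage.transform import warp_polar

def estimate_motion(img0, img1):
    # Magnitude spectra are translation-invariant; a rotation of the image becomes
    # a shift along the angular axis of their (log-)polar transform.
    spec0 = np.abs(np.fft.fftshift(np.fft.fft2(img0)))
    spec1 = np.abs(np.fft.fftshift(np.fft.fft2(img1)))
    radius = min(img0.shape) // 2
    polar0 = warp_polar(spec0, scaling="log", radius=radius)
    polar1 = warp_polar(spec1, scaling="log", radius=radius)
    shifts, _, _ = phase_cross_correlation(polar0, polar1)
    angle = shifts[0] * 360.0 / polar0.shape[0]
    # De-rotate, then recover the translation directly.
    translation, _, _ = phase_cross_correlation(img0, rotate(img1, -angle, reshape=False))
    return angle, translation

img0 = np.random.rand(256, 256)
img1 = rotate(img0, 5.0, reshape=False)
print(estimate_motion(img0, img1))
```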

  17. Turkers in Africa: A Crowdsourcing Approach to Improving Agricultural Landcover Maps

    Estes, L. D.; Caylor, K. K.; Choi, J.

    2012-12-01

    In the coming decades a substantial portion of Africa is expected to be transformed to agriculture. The scale of this conversion may match or exceed that which occurred in the Brazilian Cerrado and Argentinian Pampa in recent years. Tracking the rate and extent of this conversion will depend on having an accurate baseline of the current extent of croplands. Continent-wide baseline data do exist, but the accuracy of these relatively coarse resolution, remotely sensed assessments is suspect in many regions. To develop more accurate maps of the distribution and nature of African croplands, we develop a distributed "crowdsourcing" approach that harnesses human eyeballs and image interpretation capabilities. Our initial goal is to assess the accuracy of existing agricultural land cover maps, but ultimately we aim to generate "wall-to-wall" cropland maps that can be revisited and updated to track agricultural transformation. Our approach utilizes the freely available, high-resolution satellite imagery provided by Google Earth, combined with Amazon.com's Mechanical Turk platform, an online service that provides a large, global pool of workers (known as "Turkers") who perform "Human Intelligence Tasks" (HITs) for a fee. Using open-source R and python software, we select a random sample of 1 km2 cells from a grid placed over our study area, stratified by field density classes drawn from one of the coarse-scale land cover maps, and send these in batches to Mechanical Turk for processing. Each Turker is required to conduct an initial training session, on the basis of which they are assigned an accuracy score that determines whether the Turker is allowed to proceed with mapping tasks. Completed mapping tasks are automatically retrieved and processed on our server, and subject to two further quality control measures. The first of these is a measure of the spatial accuracy of Turker mapped areas compared to a "gold standard" maps from selected locations that are randomly
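
    The sampling step can be sketched with pandas; the grid DataFrame, the density classes and the per-class sample size are invented for the example, and posting the resulting cells as Mechanical Turk HITs is omitted.

```python
# Sketch of drawing a stratified random sample of 1 km^2 grid cells by
# field-density class before sending them out as mapping tasks.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
grid = pd.DataFrame({
    "cell_id": np.arange(10000),
    "density_class": rng.choice(["none", "low", "medium", "high"], size=10000),
})

samples_per_class = 50
batch = (grid.groupby("density_class", group_keys=False)
             .apply(lambda df: df.sample(n=min(samples_per_class, len(df)), random_state=1)))
print(batch["density_class"].value_counts())
```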

  18. Toward an Ontology of Simulated Social Interaction

    Seibt, Johanna

    2017-01-01

    The paper develops a general conceptual framework for the ontological classification of human-robot interaction. After arguing against fictionalist interpretations of human-robot interactions, I present five notions of simulation or partial realization, formally defined in terms of relationships... and asymmetric modes of realizing Á, called the ‘simulatory expansion’ of interaction type Á. Simulatory expansions of social interactions can be used to map out different kinds and degrees of sociality in human-human and human-robot interaction, relative to current notions of sociality in philosophy, anthropology, and linguistics. The classificatory framework developed (SISI) thus represents the field of possible simulated social interactions. SISI can be used to clarify which conceptual and empirical grounds we can draw on to evaluate capacities and affordances of robots for social interaction...

  19. Global land cover mapping at 30 m resolution: A POK-based operational approach

    Chen, Jun; Chen, Jin; Liao, Anping; Cao, Xin; Chen, Lijun; Chen, Xuehong; He, Chaoying; Han, Gang; Peng, Shu; Lu, Miao; Zhang, Weiwei; Tong, Xiaohua; Mills, Jon

    2015-05-01

    Global Land Cover (GLC) information is fundamental for environmental change studies, land resource management, sustainable development, and many other societal benefits. Although GLC data exists at spatial resolutions of 300 m and 1000 m, a 30 m resolution mapping approach is now a feasible option for the next generation of GLC products. Since most significant human impacts on the land system can be captured at this scale, a number of researchers are focusing on such products. This paper reports the operational approach used in such a project, which aims to deliver reliable data products. Over 10,000 Landsat-like satellite images are required to cover the entire Earth at 30 m resolution. Deriving a GLC map from such a large volume of data necessitates the development of effective, efficient, economic and operational approaches. Automated approaches usually provide higher efficiency and thus more economic solutions, yet existing automated classification has been deemed ineffective because of the low classification accuracy achievable (typically below 65%) at global scale at 30 m resolution. As a result, an approach based on the integration of pixel- and object-based methods with knowledge (POK-based) has been developed. To handle the classification process of 10 land cover types, a split-and-merge strategy was employed, i.e. each class is first identified in a prioritized sequence and the results are then merged together. For the identification of each class, a robust integration of pixel- and object-based classification was developed. To improve the quality of the classification results, a knowledge-based interactive verification procedure was developed with the support of web service technology. The performance of the POK-based approach was tested using eight selected areas with differing landscapes from five different continents. An overall classification accuracy of over 80% was achieved. This indicates that the developed POK-based approach is effective and feasible.
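
    The split-and-merge strategy can be sketched as a prioritized loop in which each per-class detector labels only the pixels that are still unassigned; the class list, priority order and per-class decision rules below are placeholders, not those of the actual POK workflow.

```python
# Sketch of a split-and-merge classification: classes are handled in priority
# order, each detector labels only unassigned pixels, and the results are merged.
import numpy as np

PRIORITY = ["water", "wetland", "artificial", "cultivated", "forest", "grassland"]
CLASS_CODE = {name: i + 1 for i, name in enumerate(PRIORITY)}

def classify_class(name, bands):
    """Placeholder per-class detector returning a boolean mask."""
    band = bands[..., hash(name) % bands.shape[-1]]
    return band > 0.8

def pok_split_and_merge(bands):
    label_map = np.zeros(bands.shape[:2], dtype=np.uint8)      # 0 = unclassified
    for name in PRIORITY:
        mask = classify_class(name, bands) & (label_map == 0)  # only unassigned pixels
        label_map[mask] = CLASS_CODE[name]
    return label_map

bands = np.random.rand(200, 200, 6)
print(np.unique(pok_split_and_merge(bands), return_counts=True))
```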

  20. Biomedical word sense disambiguation with ontologies and metadata: automation meets accuracy

    Hakenberg Jörg

    2009-01-01

    Full Text Available Abstract Background Ontology term labels can be ambiguous and have multiple senses. While this is no problem for human annotators, it is a challenge to automated methods, which identify ontology terms in text. Classical approaches to word sense disambiguation use co-occurring words or terms. However, most treat ontologies as simple terminologies, without making use of the ontology structure or the semantic similarity between terms. Another useful source of information for disambiguation is metadata. Here, we systematically compare three approaches to word sense disambiguation, which use ontologies and metadata, respectively. Results The 'Closest Sense' method assumes that the ontology defines multiple senses of the term. It computes the shortest path of co-occurring terms in the document to one of these senses. The 'Term Cooc' method defines a log-odds ratio for co-occurring terms, including co-occurrences inferred from the ontology structure. The 'MetaData' approach trains a classifier on metadata. It does not require any ontology, but requires training data, which the other methods do not. To evaluate these approaches we defined a manually curated training corpus of 2600 documents for seven ambiguous terms from the Gene Ontology and MeSH. All approaches over all conditions achieve 80% success rate on average. The 'MetaData' approach performed best with 96%, when trained on high-quality data. Its performance deteriorates as the quality of the training data decreases. The 'Term Cooc' approach performs better on the Gene Ontology (92% success) than on MeSH (73% success), as MeSH is not a strict is-a/part-of, but rather a loose is-related-to hierarchy. The 'Closest Sense' approach achieves an 80% success rate on average. Conclusion Metadata is valuable for disambiguation, but requires high-quality training data. Closest Sense requires no training, but a large, consistently modelled ontology, which are two opposing conditions. Term Cooc achieves greater 90
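
    The 'Closest Sense' idea can be sketched with networkx; the toy ontology graph, the ambiguous term and the no-path penalty are invented for illustration and do not reproduce the paper's exact scoring.

```python
# Sketch of the 'Closest Sense' idea: choose the candidate sense of an ambiguous
# label that is closest (by shortest path) to the ontology terms co-occurring in
# the document.
import networkx as nx

onto = nx.Graph()
onto.add_edges_from([
    ("nucleus:cell_component", "organelle"), ("organelle", "cytoplasm"),
    ("nucleus:brain_region", "forebrain"), ("forebrain", "neuron"),
])

def closest_sense(candidate_senses, cooccurring_terms, graph, penalty=10):
    def distance(sense, term):
        try:
            return nx.shortest_path_length(graph, sense, term)
        except nx.NetworkXNoPath:
            return penalty                 # disconnected sense: apply a fixed penalty
    return min(candidate_senses,
               key=lambda s: sum(distance(s, t) for t in cooccurring_terms))

doc_terms = ["cytoplasm", "organelle"]
print(closest_sense(["nucleus:cell_component", "nucleus:brain_region"], doc_terms, onto))
```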