WorldWideScience

Sample records for based interoperability solution

  1. Interoperable and standard e-Health solution over Bluetooth.

    Science.gov (United States)

    Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J

    2010-01-01

    The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards seems to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution. This includes the ISO/IEEE 11073 standard for interoperability of medical devices in the patient environment and the EN13606 standard for interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility and readiness for transfer to the healthcare system.
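
    As a rough illustration of the end-to-end data flow described above, the sketch below pairs a simplified, hypothetical ISO/IEEE 11073-style numeric observation with an equally simplified EN13606-style record element; all class names, fields and codes are illustrative placeholders, not the normative models of either standard.

      from dataclasses import dataclass
      from datetime import datetime, timezone

      @dataclass
      class X73NumericObservation:          # device-side measurement (agent to manager)
          metric_code: int                  # placeholder for a nomenclature code
          value: float
          unit_code: int                    # placeholder for a unit code
          timestamp: datetime

      @dataclass
      class EHRElement:                     # EN13606-style element in an EHR extract
          archetype_node_id: str
          name: str
          value: float
          units: str
          time: str

      def to_ehr_element(obs, name, units, node_id):
          # Map a device observation into an EHR extract element (simplified).
          return EHRElement(archetype_node_id=node_id, name=name, value=obs.value,
                            units=units, time=obs.timestamp.isoformat())

      obs = X73NumericObservation(metric_code=0, value=122.0, unit_code=0,
                                  timestamp=datetime.now(timezone.utc))
      print(to_ehr_element(obs, "Systolic blood pressure", "mm[Hg]", "node-001"))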

  2. Linked Data for Transaction Based Enterprise Interoperability

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert; ir. Krukkert, D.; Sinderen, Marten; Chapurlat, Vincent

    2015-01-01

    Interoperability is of major importance in B2B environments. Starting with EDI in the ‘80s, interoperability currently relies heavily on XML-based standards. Although these standards have had great impact, issues remain to be solved for improving B2B interoperability. These issues include lack of dynamics, cost

  3. On MDA - SOA based Intercloud Interoperability framework

    Directory of Open Access Journals (Sweden)

    Tahereh Nodehi

    2013-01-01

    Full Text Available Cloud computing has been one of the latest technologies which assures reliable delivery of on-demand computing services over the Internet. Cloud service providers have established geographically distributed data centers and computing resources, which are available online as services. The clouds operated by different service providers working together in collaboration can open up many more spaces for innovative scenarios with huge amounts of resources provisioned on demand. However, current cloud systems do not support intercloud interoperability. This paper is thus motivated to address Intercloud Interoperability by analyzing different methodologies that have been applied to resolve various scenarios of interoperability. Model Driven Architecture (MDA) and Service Oriented Architecture (SOA) methods have been used to address interoperability in various scenarios, which also opens up space to address intercloud interoperability by making use of these well-accepted methodologies. The focus of this document is to show that Intercloud Interoperability can be supported through a Model Driven approach and Service Oriented systems. Moreover, the current state of the art in Intercloud, and the concepts and benefits of MDA and SOA, are discussed in the paper. At the same time this paper also proposes a generic architecture for an MDA-SOA based framework, which can be useful for developing applications which will require intercloud interoperability. The paper justifies the usability of the framework with a use-case scenario for dynamic workload migration among heterogeneous clouds.

  4. K-Means Based Fingerprint Segmentation with Sensor Interoperability

    Directory of Open Access Journals (Sweden)

    Yang Xiukun

    2010-01-01

    Full Text Available A critical step in an automatic fingerprint recognition system is the segmentation of fingerprint images. Existing methods are usually designed to segment fingerprint images originating from a certain sensor. Thus their performance is significantly affected when dealing with fingerprints collected by different sensors. This work studies the sensor interoperability of fingerprint segmentation algorithms, which refers to an algorithm's ability to adapt to raw fingerprints obtained from different sensors. We empirically analyze the sensor interoperability problem, and effectively address the issue by proposing a k-means based segmentation method called SKI. SKI clusters foreground and background blocks of a fingerprint image based on the k-means algorithm, where a fingerprint block is represented by a 3-dimensional feature vector consisting of block-wise coherence, mean, and variance (abbreviated as CMV). SKI also employs morphological postprocessing to achieve favorable segmentation results. We perform SKI on each fingerprint to ensure sensor interoperability. The interoperability and robustness of our method are validated by experiments performed on a number of fingerprint databases which are obtained from various sensors.
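
    To make the CMV feature construction concrete, the following minimal sketch (Python, assuming a 16-pixel block size and scikit-learn's KMeans; SKI's morphological post-processing is omitted) clusters fingerprint blocks into foreground and background:

      import numpy as np
      from scipy import ndimage
      from sklearn.cluster import KMeans

      def cmv_features(img, block=16):
          # Block-wise coherence, mean and variance (the CMV vector).
          gx = ndimage.sobel(img.astype(float), axis=1)
          gy = ndimage.sobel(img.astype(float), axis=0)
          feats = []
          h, w = img.shape
          for r in range(0, h - block + 1, block):
              for c in range(0, w - block + 1, block):
                  bx, by = gx[r:r+block, c:c+block], gy[r:r+block, c:c+block]
                  gxx, gyy, gxy = (bx*bx).sum(), (by*by).sum(), (bx*by).sum()
                  coherence = np.sqrt((gxx - gyy)**2 + 4*gxy**2) / (gxx + gyy + 1e-9)
                  blk = img[r:r+block, c:c+block]
                  feats.append([coherence, blk.mean(), blk.var()])
          return np.array(feats)

      def segment_blocks(img, block=16):
          feats = cmv_features(img, block)
          feats_std = (feats - feats.mean(0)) / (feats.std(0) + 1e-9)
          labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(feats_std)
          # Assume the cluster with higher mean coherence is the ridge foreground.
          fg = int(feats[labels == 1, 0].mean() > feats[labels == 0, 0].mean())
          return (labels == fg).astype(np.uint8)   # one foreground flag per block

      # usage: block_mask = segment_blocks(fingerprint_image)  # 2-D grayscale ndarray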

  5. INTEROPERABLE FRAMEWORK SOLUTION TO ICU HEALTH CARE MONITORING

    Directory of Open Access Journals (Sweden)

    Shola Usha Rani

    2015-03-01

    Full Text Available An interoperable telehealth system provides an independent healthcare solution for better management of health and wellness. It allows people to manage their heart disease, diabetes, etc. by sending health parameters like blood pressure, heart rate, glucose levels, temperature, weight, and respiration from a remote place to a health professional, and to get real-time feedback on their condition. Here different medical devices are connected to the patient for monitoring. Each kind of device is manufactured by a different vendor, and each device's information and communication requires a different installation and network design. This causes design complexities and network overheads when moving patients for diagnostic examinations. This problem can be solved by interoperability among devices. ISO/IEEE 11073 is an international standard which provides an interoperable hospital information system solution for medical devices. One such integrated environment that requires the integration of medical devices is the ICU (Intensive Care Unit). This paper presents the issues for an ICU monitoring system and a framework solution for it.

  6. Interoperability

    DEFF Research Database (Denmark)

    Savin, Andrej

    would considerably alter the current privacy setting. First, the current Directive would be replaced with a Regulation, achieving EU-wide harmonization. Second, the scope of the instrument would be widened and the provisions made more precise. Third, the use of consent for data processing would ... Finally, the Commission would obtain significant new powers to adopt delegated acts. This appendix explores the impact that the proposed Regulation might have on interoperability of user-generated services. Since the proposed Regulation is an instrument of high complexity, only those provisions...

  7. Data interoperability software solution for emergency reaction in the Europe Union

    Science.gov (United States)

    Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.

    2015-07-01

    Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision making slower and more difficult. However, the spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite improvements made in recent years, EMSs have still not solved the problems related to cultural, semantic and linguistic differences, which are the real cause of slower decision making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved in any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction), providing a flexible, extensible solution to solve the mediation issues. Web services have been adopted as the specific technology to implement this paradigm, as they have the most significant academic and industrial visibility and attraction. Contributions of this work have been validated through the design and development of a cross-border realistic prototype scenario, actively involving both emergency managers and emergency

  8. IHE based interoperability - benefits and challenges.

    Science.gov (United States)

    Wozak, Florian; Ammenwerth, Elske; Hörbst, Alexander; Sögner, Peter; Mair, Richard; Schabetsberger, Thomas

    2008-01-01

    Optimized workflows and communication between institutions involved in a patient's treatment process can lead to improved quality and efficiency in the healthcare sector. Electronic Health Records (EHRs) provide patient-centered access to clinical data across institutional boundaries, supporting the above mentioned aspects. Interoperability is regarded as a vital success factor. However, a clear definition of interoperability does not exist. The aim of this work is to define and to assess interoperability criteria as required for EHRs. The definition and assessment of interoperability criteria is supported by the analysis of existing literature and personal experience as well as by discussions with several domain experts. Criteria for interoperability address the following aspects: interfaces, semantics, legal and organizational aspects, and security. The profiles of the Integrating the Healthcare Enterprise (IHE) initiative make a major contribution to these aspects, but they also raise new problems. Flexibility for adaptation to different organizational/regional or other specific conditions is missing. Regional or national initiatives should get the possibility to realize their specific needs within the boundaries of IHE profiles. Security so far is an optional element, which is one of IHE's greatest omissions. An integrated security approach seems to be preferable. Irrespective of the practical significance of the IHE profiles so far, it appears to be of great importance that the profiles are constantly checked against practical experiences and are continuously adapted.

  9. A cloud-based approach for interoperable electronic health records (EHRs).

    Science.gov (United States)

    Bahga, Arshdeep; Madisetti, Vijay K

    2013-09-01

    We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). Lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system, cloud health information systems technology architecture (CHISTAR), that achieves semantic interoperability through the use of a generic design methodology which uses a reference model that defines a general-purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach that comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.
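
    To illustrate the two-level idea (generic reference model plus archetype-defined clinical attributes) in a few lines, here is a hedged sketch; the class names and constraint style are invented for illustration and are not CHISTAR's actual models.

      from dataclasses import dataclass, field
      from typing import Any, Dict

      @dataclass
      class DataNode:                       # generic reference-model structure
          node_type: str
          attributes: Dict[str, Any] = field(default_factory=dict)

      @dataclass
      class Archetype:                      # clinical-content constraints
          concept: str
          allowed_attributes: Dict[str, type]

          def validate(self, node: DataNode) -> bool:
              # Every attribute must be declared by the archetype with the right type.
              return all(k in self.allowed_attributes and
                         isinstance(v, self.allowed_attributes[k])
                         for k, v in node.attributes.items())

      bp_archetype = Archetype("blood_pressure",
                               {"systolic_mmHg": float, "diastolic_mmHg": float})
      reading = DataNode("OBSERVATION",
                         {"systolic_mmHg": 120.0, "diastolic_mmHg": 80.0})
      assert bp_archetype.validate(reading)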

  10. The GEOSS solution for enabling data interoperability and integrative research.

    Science.gov (United States)

    Nativi, Stefano; Mazzetti, Paolo; Craglia, Max; Pirrone, Nicola

    2014-03-01

    Global sustainability research requires an integrative research effort underpinned by digital infrastructures (systems) able to harness data and heterogeneous information across disciplines. Digital data and information sharing across systems and applications is achieved by implementing interoperability: a property of a product or system to work with other products or systems, present or future. There are at least three main interoperability challenges a digital infrastructure must address: technological, semantic, and organizational. In recent years, important international programs and initiatives have been focusing on such an ambitious objective. This manuscript presents and combines the studies and experiences carried out by three relevant projects, focusing on the heavy metal domain: the Global Mercury Observation System, the Global Earth Observation System of Systems (GEOSS), and INSPIRE. This research work identified a valuable interoperability service bus (i.e., a set of standard models, interfaces, and good practices) proposed to characterize the integrative research cyber-infrastructure of the heavy metal research community. In the paper, the GEOSS common infrastructure is discussed as an implementation of a multidisciplinary and participatory research infrastructure, introducing a possible roadmap for the heavy metal pollution research community to join GEOSS as a new Group on Earth Observations community of practice and develop a research infrastructure for carrying out integrative research in its specific domain.

  11. Interoperability for electronic ID

    OpenAIRE

    Zygadlo, Zuzanna

    2009-01-01

    Electronic Business, including eBanking, eCommerce and eGovernmental services, is today based on a large variety of security solutions, comprising electronic IDs provided by a broad community of Public Key Infrastructure (PKI) vendors. Significant differences in implementations of those solutions introduce a problem of lack of interoperability in electronic business, which has not yet been resolved by standardization and interoperability initiatives based on existing PKI trust models. It i...

  12. DIMP: an interoperable solution for software integration and product data exchange

    Science.gov (United States)

    Wang, Xi Vincent; Xu, Xun William

    2012-08-01

    Today, globalisation has become one of the main trends of manufacturing business, leading to a world-wide decentralisation of resources amongst not only individual departments within one company but also business partners. However, despite the development and improvement in the last few decades, difficulties in information exchange and sharing still exist in heterogeneous application environments. This article is divided into two parts. In the first part, related research work and integrating solutions are reviewed and discussed. The second part introduces a collaborative environment called the distributed interoperable manufacturing platform, which is based on a module-based, service-oriented architecture (SOA). In the platform, the STEP-NC data model is used to facilitate data exchange among heterogeneous CAD/CAM/CNC systems.

  13. Author identities an interoperability problem solved by a collaborative solution

    Science.gov (United States)

    Fleischer, D.; Czerniak, A.; Schirnick, C.

    2012-12-01

    The identity of authors and data providers is crucial for personalized interoperability. The marketplace of available identifiers is packed, and the right choice is getting more and more complicated. Even though there are more than 15 different systems available, some are still under development and proposed to come up by the end of 2012 ('PubMed Central Author ID' and ORCID). Data management on a scale beyond the size of a single research institute, on the scale of a scientific site including a university with a student education program, needs to tackle this problem, and so did the Kiel Data Management Infrastructure. The main problem with the identities of researchers is the quite high frequency of position changes during a scientist's life. The required system needed to be one that already contained the potential of preregistered people with their scientific publications from other countries, institutions and organizations. Scanning the author ID marketplace revealed that there is a high risk of additional workload for the researchers themselves or the administration, due to the fact that individuals need to register an ID for themselves or the chosen register is not yet big enough to simply find the right entry. On the other hand, libraries have dealt with authors and their publications for centuries, and they have high-quality catalogs with person identities already available. Millions of internationally mapped records are available through collaboration with libraries and can be used in exactly the same scope. The international collaboration between libraries (VIAF) provides a mapping between libraries from the US, CA, UK, FR, GER and many more. The international library author identification system made it possible to reach, at the first matching, a success rate of 60% of all scientists. The additional advantage is that librarians can finalize the identity system in a kind of background process. The Kiel Data Management Infrastructure initiated a web service
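
    A hedged sketch of the underlying matching idea: compare locally known researcher names against library authority records (for example, records harvested from VIAF); the record structure, field names and similarity threshold below are assumptions for illustration only.

      from difflib import SequenceMatcher

      def normalize(name: str) -> str:
          # Lowercase, drop commas, sort tokens so "Doe, Jane" and "Jane Doe" compare equal.
          return " ".join(sorted(name.lower().replace(",", " ").split()))

      def match_author(local_name: str, authority_records: list[dict],
                       threshold: float = 0.9):
          # Return the best-matching authority record, or None if nothing is close enough.
          best, best_score = None, 0.0
          for rec in authority_records:
              score = SequenceMatcher(None, normalize(local_name),
                                      normalize(rec["heading"])).ratio()
              if score > best_score:
                  best, best_score = rec, score
          return best if best_score >= threshold else None

      records = [{"authority_id": "12345", "heading": "Doe, Jane"}]   # placeholder data
      print(match_author("Jane Doe", records))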

  14. Collaborative Solution Architecture for Developing a National Interoperability Framework in Romania

    Directory of Open Access Journals (Sweden)

    Bogdan GHILIC-MICU

    2010-01-01

    Full Text Available An interoperability framework is a set of standards and guidelines that describe how organizations have agreed, or will agree, to interact. The framework is not static, but one that adapts to changes in standards, administrative requirements and technology. It can be adapted to socio-economic, political, cultural, linguistic, historical and geographical purposes and to a specific context or situation. The article aims to clarify the essential concepts necessary for outlining the Romanian national interoperability framework and to propose a collaborative solution architecture for its development, updating and maintenance.

  15. Cloud-based Communications Planning Collaboration and Interoperability

    Science.gov (United States)

    2012-06-01

    ... Software as a Service (SaaS) application to improve processes and products in the field of Marine Corps communications planning through automation, collaboration, and interoperability. It introduces the idea of using the Software as a Service (SaaS) model to develop a cloud-based communications

  16. Standards-based data interoperability in the climate sciences

    Science.gov (United States)

    Woolf, Andrew; Cramer, Ray; Gutierrez, Marta; Kleese van Dam, Kerstin; Kondapalli, Siva; Latham, Susan; Lawrence, Bryan; Lowry, Roy; O'Neill, Kevin

    2005-03-01

    Emerging developments in geographic information systems and distributed computing offer a roadmap towards an unprecedented spatial data infrastructure in the climate sciences. Key to this are the standards developments for digital geographic information being led by the International Organisation for Standardisation (ISO) technical committee on geographic information/geomatics (TC211) and the Open Geospatial Consortium (OGC). These, coupled with the evolution of standardised web services for applications on the internet by the World Wide Web Consortium (W3C), mean that opportunities for both new applications and increased interoperability exist. These are exemplified by the ability to construct ISO-compliant data models that expose legacy data sources through OGC web services. This paper concentrates on the applicability of these standards to climate data by introducing some examples and outlining the challenges ahead. An abstract data model is developed, based on ISO standards, and applied to a range of climate data both observational and modelled. An OGC Web Map Server interface is constructed for numerical weather prediction (NWP) data stored in legacy data files. A W3C web service for remotely accessing gridded climate data is illustrated. Challenges identified include the following: first, both the ISO and OGC specifications require extensions to support climate data. Secondly, OGC services need to fully comply with W3C web services, and support complex access control. Finally, to achieve real interoperability, broadly accepted community-based semantic data models are required across the range of climate data types. These challenges are being actively pursued, and broad data interoperability for the climate sciences appears within reach.
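
    For illustration, an OGC Web Map Service GetMap request of the kind such an NWP interface would answer can be assembled as follows (the service URL and layer name are placeholders; the key-value parameters follow the standard WMS 1.3.0 request syntax):

      from urllib.parse import urlencode

      params = {
          "SERVICE": "WMS",
          "VERSION": "1.3.0",
          "REQUEST": "GetMap",
          "LAYERS": "nwp_surface_temperature",   # placeholder layer name
          "STYLES": "",
          "CRS": "EPSG:4326",
          "BBOX": "-90,-180,90,180",             # lat/lon axis order for EPSG:4326 in WMS 1.3.0
          "WIDTH": 800,
          "HEIGHT": 400,
          "FORMAT": "image/png",
      }
      url = "https://example.org/wms?" + urlencode(params)   # hypothetical endpoint
      print(url)   # fetching this URL would return a rendered map of the NWP field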

  17. A methodology for the development of software agent based interoperable telemedicine systems: a tele-electrocardiography perspective.

    Science.gov (United States)

    Ganguly, P; Ray, P

    2000-01-01

    Telemedicine involves the integration of information, human-machine, and healthcare technologies. Because different modalities of patient care require applications running on heterogeneous computing environments, software interoperability is a major issue in telemedicine. Software agent technology provides a range of promising techniques to solve this problem. This article discusses the development of a methodology for the design of interoperable telemedicine systems (illustrated with a tele-electrocardiography application). Software interoperability between different applications can be modeled at different levels of abstraction such as physical interoperability, data-type interoperability, specification-level interoperability, and semantic interoperability. Software agents address the issue of software interoperability at the semantic level. A popular object-oriented software development methodology - unified modeling language (UML) - has been used for this development. This research has demonstrated the feasibility of the development of agent-based interoperable telemedicine systems. More research is needed before widespread deployment of such systems can take place.

  18. IoT interoperability: a hub-based approach

    OpenAIRE

    Blackstock, Michael; Lea, Rodger

    2014-01-01

    Interoperability in the Internet of Things is critical for emerging services and applications. In this paper we advocate the use of IoT ‘hubs’ to aggregate things using web protocols, and suggest a staged approach to interoperability. In the context of a UK government funded project involving 8 IoT projects to address cross-domain IoT interoperability, we introduce the HyperCat IoT catalogue specification. We then describe the tools and techniques we developed to adapt an existing data portal...

  19. eHealth integration and interoperability issues: towards a solution through enterprise architecture.

    Science.gov (United States)

    Adenuga, Olugbenga A; Kekwaletswe, Ray M; Coleman, Alfred

    2015-01-01

    Investments in healthcare information and communication technology (ICT) and health information systems (HIS) continue to increase. This is creating immense pressure on healthcare ICT and HIS to deliver and show significance in such investments in technology. This study finds that integration and interoperability contribute largely to this failure of ICT and HIS investment in healthcare, thus resulting in the need for a healthcare architecture for eHealth. This study proposes an eHealth architectural model that accommodates requirements based on healthcare needs, system, implementer, and hardware requirements. The model is adaptable and examines the developer's and user's views, as such systems hold high hopes for their potential to change traditional organizational design, intelligence, and decision-making.

  20. An adaptive semantic based mediation system for data interoperability among Health Information Systems.

    Science.gov (United States)

    Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung

    2014-08-01

    Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity is dependent on the compliance of HISs with different healthcare standards. Its solution demands a mediation system for the accurate interpretation of data in different heterogeneous formats for achieving data interoperability. We propose an adaptive AdapteR Interoperability ENgine mediation system called ARIEN, which arbitrates between HISs compliant to different healthcare standards for accurate and seamless information exchange to achieve data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and Personalized-Detailed Clinical Model (P-DCM) approach to guarantee accuracy of mappings. The effectiveness of the mappings stored in the MBO is realized by evaluating the accuracy of the transformation process among different standard formats. We evaluated our proposed system with the transformation process of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved over 90% accuracy in the conversion between the CDA and vMR standards using a pattern-oriented approach based on the MBO. The proposed mediation system improves the overall communication process between HISs. It provides an accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
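
    The following sketch illustrates the general mapping-driven transformation pattern that such ontology-stored mappings support; the flattened element paths and the mapping table are invented for illustration and are far simpler than real CDA or vMR structures.

      # Hypothetical mappings (source path -> target path), standing in for MBO content.
      CDA_TO_VMR = {
          "ClinicalDocument/recordTarget/patientRole/id": "patient/id",
          "ClinicalDocument/component/observation/code": "observationResult/code",
          "ClinicalDocument/component/observation/value": "observationResult/value",
      }

      def transform(cda_record: dict) -> dict:
          # Copy values from flattened CDA paths into flattened vMR paths.
          vmr = {}
          for src, dst in CDA_TO_VMR.items():
              if src in cda_record:
                  vmr[dst] = cda_record[src]
          return vmr

      cda = {"ClinicalDocument/recordTarget/patientRole/id": "12345",
             "ClinicalDocument/component/observation/value": "7.2 mmol/L"}
      print(transform(cda))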

  1. Archetype-based electronic health records: a literature review and evaluation of their applicability to health data interoperability and access.

    Science.gov (United States)

    Wollersheim, Dennis; Sari, Anny; Rahayu, Wenny

    Health Information Managers (HIMs) are responsible for overseeing health information. The change management necessary during the transition to electronic health records (EHR) is substantial, and ongoing. Archetype-based EHRs are a core health information system component which solves many of the problems that arise during this period of change. Archetypes are models of clinical content, and they have many beneficial properties. They are interoperable, both between settings and through time. They are more amenable to change than conventional paradigms, and their design is congruent with clinical practice. This paper is an overview of the current archetype literature relevant to Health Information Managers. The literature was sourced from the English-language sections of ScienceDirect, IEEE Xplore, PubMed, Google Scholar, the ACM Digital Library and other databases on the usage of archetypes for electronic health record storage, looking at the current areas of archetype research, appropriate usage, and future research. We also used reference lists from the cited papers, papers referenced by the openEHR website, and the recommendations from experts in the area. Criteria for inclusion were (a) if studies covered archetype research and (b) were either studies of archetype use, archetype system design, or archetype effectiveness. The 47 papers included show a wide and increasing worldwide archetype usage, in a variety of medical domains. Most of the papers noted that archetypes are an appropriate solution for future-proof and interoperable medical data storage. We conclude that archetypes are a suitable solution for the complex problem of electronic health record storage and interoperability.

  2. Policy-Based Negotiation Engine for Cross-Domain Interoperability

    Science.gov (United States)

    Vatan, Farrokh; Chow, Edward T.

    2012-01-01

    A successful policy negotiation scheme for Policy-Based Management (PBM) has been implemented. Policy negotiation is the process of determining the "best" communication policy that all of the parties involved can agree on. Specifically, the problem is how to reconcile the various (and possibly conflicting) communication protocols used by different divisions. The solution must use protocols available to all parties involved, and should attempt to do so in the best way possible. Which protocols are commonly available, and what the definition of "best" is will be dependent on the parties involved and their individual communications priorities.
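
    A minimal sketch of the negotiation idea: restrict the choice to protocols every party supports, then pick the "best" one according to an agreed priority ranking (the protocol names and ranking below are illustrative only).

      def negotiate(parties: dict[str, set[str]], priority: list[str]):
          # Protocols every party supports.
          common = set.intersection(*parties.values())
          # "Best" = first common protocol in the shared priority ranking.
          for proto in priority:
              if proto in common:
                  return proto
          return None   # no agreement possible

      parties = {"division_a": {"sftp", "https", "smtp"},
                 "division_b": {"https", "smtp"},
                 "division_c": {"https", "ftp"}}
      print(negotiate(parties, priority=["sftp", "https", "smtp", "ftp"]))  # -> "https"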

  3. A Proposed Engineering Process and Prototype Toolset for Developing C2-to-Simulation Interoperability Solutions

    NARCIS (Netherlands)

    Gautreau, B.; Khimeche, L.; Reus, N.M. de; Heffner, K.; Mevassvik, O.M.

    2014-01-01

    The Coalition Battle Management Language (C-BML) is an open standard being developed for the exchange of digitized military information among command and control (C2), simulation and autonomous systems by the Simulation Interoperability Standards Organization (SISO). As the first phase of the C-BML

  4. The role of architecture and ontology for interoperability.

    Science.gov (United States)

    Blobel, Bernd; González, Carolina; Oemig, Frank; Lopéz, Diego; Nykänen, Pirkko; Ruotsalainen, Pekka

    2010-01-01

    Turning from organization-centric to process-controlled or even to personalized approaches, advanced healthcare settings have to meet special interoperability challenges. eHealth and pHealth solutions must assure interoperability between actors cooperating to achieve common business objectives. Here, the interoperability chain includes not only individually tailored technical systems, but also sensors and actuators. For enabling corresponding pervasive computing and even autonomic computing, individualized systems have to be based on an architecture framework covering many domains, scientifically managed by specialized disciplines using their specific ontologies in a formalized way. Therefore, interoperability has to advance from a communication protocol to an architecture-centric approach, mastering ontology coordination challenges.

  5. Standards-based Content Resources: A Prerequisite for Content Integration and Content Interoperability

    Directory of Open Access Journals (Sweden)

    Christian Galinski

    2010-05-01

    Full Text Available Objective: to show how standards-based approaches for content standardization, content management, content-related services and tools, as well as the respective certification systems, not only guarantee reliable content integration and content interoperability, but also are of particular benefit to people with special needs in eAccessibility/eInclusion. Method: document MoU/MG/05 N0221 ''Semantic Interoperability and the need for a coherent policy for a framework of distributed, possibly federated repositories for all kinds of content items on a world-wide scale'', which was adopted in 2005, was a first step towards the formulation of global interoperability requirements for structured content. These requirements - based on advanced terminological principles - were taken up in EU projects such as IN-SAFETY (INfrastructure and SAFETY) and OASIS (Open architecture for Accessible Services Integration and Standardization). Results: Content integration and content interoperability are key concepts in connection with the emergence of state-of-the-art distributed and federated databases/repositories of structured content. Given the fact that linguistic content items are increasingly combined with or embedded in non-linguistic content items (and vice versa), a systemic and generic approach to data modelling and content management has become the order of the day. Fulfilling the requirements of capability for multilinguality and multimodality, based on open standards, makes software and database design fit for eAccessibility/eInclusion from the outset. It also makes structured content capable of global content integration and content interoperability, because it enhances its potential for being re-used and re-purposed in totally different eApplications. Such content as well as the methods, tools and services applied can be subject to new kinds of certification schemes which also should be based on standards. Conclusions: Content must be totally reliable in some

  6. Interoperability of Geographic Information: A Communication Process –Based Prototype

    Directory of Open Access Journals (Sweden)

    Jean Brodeur

    2005-04-01

    Full Text Available Since 1990, municipal, state/provincial, and federal governments have developed numerous geographic databases to fulfill organizations' specific needs. As such, the same real-world topographic phenomena have been abstracted differently, for instance vegetation (surface), trees (surface), wooded area (line), wooded area (point and line), milieu boisé (surface), zone boisée (unknown geometry). Today, information about these geographic phenomena is accessible on the Internet from Web infrastructures specially developed to simplify access to it. Early in the nineties, the development of interoperability of geographic information was undertaken to solve syntactic, structural, and semantic heterogeneities as well as spatial and temporal heterogeneities, to facilitate sharing and integration of such data. Recently, we have proposed a new conceptual framework for interoperability of geographic information based on the human communication process, cognitive science, and ontology, and introduced geosemantic proximity, a reasoning methodology to qualify dynamically the semantic similarity between geographic abstractions. This framework could be of interest to other disciplines. This paper presents the details of our framework for interoperability of geographic information as well as a prototype.

  7. CCR+: Metadata Based Extended Personal Health Record Data Model Interoperable with the ASTM CCR Standard.

    Science.gov (United States)

    Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong; Kim, Ju Han

    2014-01-01

    Extension of a standard model while retaining compliance with it is a challenging issue, because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. In total, 188 metadata were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as a part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains; the methods presented here represent an important reference for achieving interoperability between standard and extended models.
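
    As an illustration of one layer of such multilayered validation, the sketch below checks an extended record against a toy metadata registry; the registry entries and element names are placeholders, not the 188 ASTM CCR metadata.

      # Placeholder registry: registered element name -> expected datatype.
      METADATA_REGISTRY = {
          "Problem.Description": str,
          "Problem.OnsetDate": str,
          "Extension.GenomicMarker": str,     # hypothetical extended element
      }

      def validate_record(record: dict) -> list[str]:
          # Report elements that are unregistered or have the wrong datatype.
          errors = []
          for element, value in record.items():
              expected = METADATA_REGISTRY.get(element)
              if expected is None:
                  errors.append(f"{element}: not registered in metadata registry")
              elif not isinstance(value, expected):
                  errors.append(f"{element}: expected {expected.__name__}")
          return errors

      print(validate_record({"Problem.Description": "Hypertension",
                             "Extension.GenomicMarker": "BRCA1"}))   # -> []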

  8. A Space Acquisition Leading Indicator Based on System Interoperation Maturity

    Science.gov (United States)

    2010-12-01

    ... uses a series of gates and reviews that demand various levels of program maturity and rigor. These gates and reviews enforce good acquisition ... [remainder of record: fragments of an acronym list and of a table of Space Based Infrared System (SBIRS) program findings]

  9. Design and Implement an Interoperable Internet of Things Application Based on an Extended OGC SensorThings API Standard

    Science.gov (United States)

    Huang, C. Y.; Wu, C. H.

    2016-06-01

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily life. However, IoT devices created by different manufacturers follow different proprietary protocols and cannot communicate with each other. This heterogeneity issue causes different products to be locked into multiple closed ecosystems that we call IoT silos. In order to address this issue, a common industrial solution is the hub approach, which implements connectors to communicate with IoT devices following different protocols. However, with the growing number of proprietary protocols proposed by device manufacturers, IoT hubs need to support and maintain a lot of customized connectors. Hence, we believe the ultimate solution to address the heterogeneity issue is to follow open and interoperable standards. Among the existing IoT standards, the Open Geospatial Consortium (OGC) SensorThings API standard supports a comprehensive conceptual model and query functionalities. The first version of the SensorThings API mainly focuses on connecting to IoT devices and sharing sensor observations online, which is the sensing capability. Besides the sensing capability, IoT devices can also be controlled via the Internet, which is the tasking capability. As the tasking capability was not included in the first version of the SensorThings API standard, this research aims at defining the tasking capability profile and integrating it with the SensorThings API standard, which we call the extended SensorThings API in this paper. In general, this research proposes a lightweight JSON-based web service description, the "Tasking Capability Description", allowing device owners and manufacturers to describe different IoT device protocols. Through the extended SensorThings API, users and applications can follow a coherent protocol to control IoT
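
    To give a feel for what such a description might look like, here is a hypothetical sketch of a JSON Tasking Capability Description and a task request posted against it; the field names, endpoint and parameter encoding are assumptions for illustration, not the paper's normative schema.

      import json
      import urllib.request

      # Hypothetical description of a controllable device and its tasking parameters.
      tasking_capability = {
          "name": "SmartPlug-01",
          "description": "Switchable power outlet",
          "taskingParameters": [
              {"name": "switch", "type": "Category", "allowedTokens": ["ON", "OFF"]}
          ],
      }

      # A task that asks the device to switch on, sent to a hypothetical endpoint.
      task = {"taskingParameters": {"switch": "ON"}}
      req = urllib.request.Request(
          "https://example.org/v1.0/Things(1)/Tasks",
          data=json.dumps(task).encode(),
          headers={"Content-Type": "application/json"},
          method="POST",
      )
      # urllib.request.urlopen(req)   # would submit the task to the IoT device
      print(json.dumps(tasking_capability, indent=2))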

  10. Web services-based text-mining demonstrates broad impacts for interoperability and process simplification

    Science.gov (United States)

    Wiegers, Thomas C.; Davis, Allan Peter; Mattingly, Carolyn J.

    2014-01-01

    The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer/BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. Top balanced F-scores for gene, chemical and
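
    The integration pattern can be sketched as follows: the text-mining pipeline POSTs a BioC-encoded document to a remote NER Web service over HTTP and reads back an annotated BioC collection (the endpoint URL and the minimal BioC payload below are illustrative placeholders, not CTD's actual service).

      import requests   # third-party HTTP client

      bioc_document = """<?xml version="1.0" encoding="UTF-8"?>
      <collection><source>CTD</source><document><id>PMID-0000000</id>
      <passage><offset>0</offset><text>Aspirin reduces cytokine levels.</text></passage>
      </document></collection>"""

      resp = requests.post(
          "https://example.org/ner/chemicals",          # hypothetical NER Web service
          data=bioc_document.encode("utf-8"),
          headers={"Content-Type": "application/xml"},
          timeout=60,
      )
      print(resp.status_code)
      print(resp.text)   # annotated BioC collection returned by the service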

  12. The impact of SOA for achieving healthcare interoperability. An empirical investigation based on a hypothetical adoption.

    Science.gov (United States)

    Daskalakis, S; Mantas, J

    2009-01-01

    The evaluation of a service-oriented prototype implementation for healthcare interoperability. A prototype framework was developed, aiming to exploit the use of service-oriented architecture (SOA) concepts for achieving healthcare interoperability and to move towards a virtual patient record (VPR) paradigm. The prototype implementation was evaluated for its hypothetical adoption. The evaluation strategy was based on the initial proposition of the DeLone and McLean model of information systems (IS) success [1], as modeled by Iivari [2]. A set of SOA and VPR characteristics were empirically encapsulated within the dimensions of the IS success model, combined with measures from previous research works. The data gathered were analyzed using partial least squares (PLS). The results highlighted that system quality is a partial predictor of system use but not of user satisfaction. On the contrary, information quality proved to be a significant predictor of user satisfaction and partially a strong significant predictor of system use. Moreover, system use did not prove to be a significant predictor of individual impact, whereas the bi-directional relation between use and user satisfaction was not confirmed. Additionally, user satisfaction was found to be a strong significant predictor of individual impact. Finally, individual impact proved to be a strong significant predictor of organizational impact. The empirical study attempted to obtain hypothetical, but still useful, beliefs and perceptions regarding the SOA prototype implementation. The deduced observations can form the basis for further investigation regarding the adaptability of SOA implementations with VPR characteristics in the healthcare domain.

  13. An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web

    Science.gov (United States)

    Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.

    2013-09-01

    Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real-time and obtaining valuable information are challenging issues in these systems from a technical and scientific point of view. The ever-increasing population growth in urban areas has caused certain problems in developing countries, which have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality, by considering real-time and up-to-date air quality information gathered by spatially distributed sensors in mega cities, is using sensor web technology for developing monitoring and early warning systems. Urban air quality monitoring systems use functionalities of geospatial information systems as a platform for analysing, processing, and visualization of data, in combination with the Sensor Web, for supporting decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share, and process air quality sensor data and disseminate air quality status in real-time. It is possible to overcome interoperability challenges by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where the data are analysed and processed. The extracted air quality status is processed for discovering emergency situations, and if necessary air quality reports are sent to the authorities. This research proposed an
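
    For illustration, in-situ air-quality observations can be requested from an OGC Sensor Observation Service with a key-value GetObservation call such as the one sketched below (the service URL, offering and observed-property URIs are placeholders; the parameter names follow the SOS 2.0 KVP binding):

      from urllib.parse import urlencode

      params = {
          "service": "SOS",
          "version": "2.0.0",
          "request": "GetObservation",
          "offering": "urn:example:offering:air_quality_station_01",   # placeholder
          "observedProperty": "urn:example:property:PM10",             # placeholder
          "responseFormat": "http://www.opengis.net/om/2.0",
      }
      url = "https://example.org/sos?" + urlencode(params)   # hypothetical endpoint
      print(url)   # fetching this URL would return O&M-encoded observations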

  14. AN INTEROPERABLE ARCHITECTURE FOR AIR POLLUTION EARLY WARNING SYSTEM BASED ON SENSOR WEB

    Directory of Open Access Journals (Sweden)

    F. Samadzadegan

    2013-09-01

    Full Text Available Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real-time and obtaining valuable information are challenging issues in these systems from a technical and scientific point of view. The ever-increasing population growth in urban areas has caused certain problems in developing countries, which have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality, by considering real-time and up-to-date air quality information gathered by spatially distributed sensors in mega cities, is using sensor web technology for developing monitoring and early warning systems. Urban air quality monitoring systems use functionalities of geospatial information systems as a platform for analysing, processing, and visualization of data, in combination with the Sensor Web, for supporting decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share, and process air quality sensor data and disseminate air quality status in real-time. It is possible to overcome interoperability challenges by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where the data are analysed and processed. The extracted air quality status is processed for discovering emergency situations, and if necessary air quality reports are sent to the authorities. This research

  15. The Open Anatomy Browser: A Collaborative Web-Based Viewer for Interoperable Anatomy Atlases.

    Science.gov (United States)

    Halle, Michael; Demeusy, Valentin; Kikinis, Ron

    2017-01-01

    The Open Anatomy Browser (OABrowser) is an open source, web-based, zero-installation anatomy atlas viewer based on current web browser technologies and evolving anatomy atlas interoperability standards. OABrowser displays three-dimensional anatomical models, image cross-sections of labeled structures and source radiological imaging, and a text-based hierarchy of structures. The viewer includes novel collaborative tools: users can save bookmarks of atlas views for later access and exchange those bookmarks with other users, and dynamic shared views allow groups of users to participate in a collaborative interactive atlas viewing session. We have published several anatomy atlases (an MRI-derived brain atlas and atlases of other parts of the anatomy) to demonstrate OABrowser's functionality. The atlas source data, processing tools, and the source for OABrowser are freely available through GitHub and are distributed under a liberal open source license.

  16. Interoperable Solution for Test Execution in Various I&T Environments

    Science.gov (United States)

    Lee, Young H.; Bareh, Magdy S.

    2006-01-01

    When there is spacecraft collaboration between several industry partners, there is an inherent difference in integration and test (I&T) methodologies, which creates a challenge for verifying flight systems during the development phase. To converge the differing I&T methodologies, considerations were required for multiple project areas such as the Flight System Testbed (FST), Assembly, Test, and Launch Operations (ATLO), and Spacecraft Simulator environments. This paper details the challenges and approaches of JPL's effort in engineering a solution for testing the flight system with the Mission Operations Ground System while maintaining comparability with the testing methods of the industry partners.

  17. Lemnos Interoperable Security Program

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, John [Tennessee Valley Authority, Knoxville, TN (United States); Halbgewachs, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chavez, Adrian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Rhett [Schweitzer Engineering Laboratories, Chattanooga, TN (United States); Teumim, David [Teumim Technical, Allentown, PA (United States)

    2012-01-31

    The manner in which control systems are being designed and operated in the energy sector is undergoing some of the most significant changes in history due to the evolution of technology and the increasing number of interconnections to other systems. With these changes, however, come two significant challenges that the energy sector must face: 1) cyber security is more important than ever before, and 2) cyber security is more complicated than ever before. A key requirement in helping utilities and vendors alike in meeting these challenges is interoperability. While interoperability has been present in much of the discussion relating to technology utilized within the energy sector, and especially the Smart Grid, it has been absent in the context of cyber security. The Lemnos project addresses these challenges by focusing on the interoperability of devices utilized within utility control systems which support critical cyber security functions. In theory, interoperability is possible with many of the cyber security solutions available to utilities today. The reality is that the effort required to achieve cyber security interoperability is often a barrier for utilities. For example, consider IPSec, a widely-used Internet Protocol suite used to define Virtual Private Networks, or 'tunnels', to communicate securely through untrusted public and private networks. The IPSec protocol suite has a significant number of configuration options and encryption parameters to choose from, which must be agreed upon and adopted by both parties establishing the tunnel. The exercise of getting software or devices from different vendors to interoperate is labor-intensive and requires a significant amount of security expertise by the end user. Scale this effort to a significant number of devices operating over a large geographical area and the challenge becomes so overwhelming that it often leads utilities to pursue solutions from a single vendor. These single-vendor solutions may inadvertently lock
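
    The interoperability burden described here can be illustrated with a small sketch: before an IPSec tunnel can be established, both endpoints must agree on a common proposal (encryption, integrity, Diffie-Hellman group, and so on). The parameter sets and preference order below are illustrative, not any vendor's actual defaults.

      from itertools import product

      def common_proposals(a: dict, b: dict):
          # Yield proposals (in endpoint A's preference order) that both sides support.
          keys = list(a.keys())
          for combo in product(*(a[k] for k in keys)):
              proposal = dict(zip(keys, combo))
              if all(proposal[k] in b[k] for k in keys):
                  yield proposal

      vendor_a = {"encryption": ["aes256", "aes128"], "integrity": ["sha256", "sha1"],
                  "dh_group": ["modp2048", "modp1024"]}
      vendor_b = {"encryption": ["aes128"], "integrity": ["sha256"],
                  "dh_group": ["modp2048"]}
      print(next(common_proposals(vendor_a, vendor_b)))
      # -> {'encryption': 'aes128', 'integrity': 'sha256', 'dh_group': 'modp2048'}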

  18. COTS-based OO-component approach for software inter-operability and reuse (software systems engineering methodology)

    Science.gov (United States)

    Yin, J.; Oyaki, A.; Hwang, C.; Hung, C.

    2000-01-01

    The purpose of this research and study paper is to provide a summary description and results of rapid development accomplishments at NASA/JPL in the area of advanced distributed computing technology, using a Commercial-Off-The-Shelf (COTS)-based object-oriented component approach to open inter-operable software development and software reuse.

  19. SMART on FHIR: a standards-based, interoperable apps platform for electronic health records.

    Science.gov (United States)

    Mandel, Joshua C; Kreda, David A; Mandl, Kenneth D; Kohane, Isaac S; Ramoni, Rachel B

    2016-09-01

    In early 2010, Harvard Medical School and Boston Children's Hospital began an interoperability project with the distinctive goal of developing a platform to enable medical applications to be written once and run unmodified across different healthcare IT systems. The project was called Substitutable Medical Applications and Reusable Technologies (SMART). We adopted contemporary web standards for application programming interface transport, authorization, and user interface, and standard medical terminologies for coded data. In our initial design, we created our own openly licensed clinical data models to enforce consistency and simplicity. During the second half of 2013, we updated SMART to take advantage of the clinical data models and the application programming interface described in a new, openly licensed Health Level Seven draft standard called Fast Healthcare Interoperability Resources (FHIR). Signaling our adoption of the emerging FHIR standard, we called the new platform SMART on FHIR. We introduced the SMART on FHIR platform with a demonstration that included several commercial healthcare IT vendors and app developers showcasing prototypes at the Healthcare Information and Management Systems Society (HIMSS) conference in February 2014. This established the feasibility of SMART on FHIR, while highlighting the need for commonly accepted pragmatic constraints on the base FHIR specification. In this paper, we describe the creation of SMART on FHIR, relate the experience of the vendors and developers who built SMART on FHIR prototypes, and discuss some challenges in going from early industry prototyping to industry-wide production use. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.
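
    Once a SMART on FHIR app has completed the OAuth 2.0 authorization flow, its data access reduces to ordinary FHIR REST calls made with the issued token, as in the hedged sketch below (the server base URL, patient id and token are placeholders):

      import requests

      FHIR_BASE = "https://example.org/fhir"          # hypothetical FHIR endpoint
      ACCESS_TOKEN = "example-oauth2-access-token"    # obtained via the SMART launch flow

      resp = requests.get(
          f"{FHIR_BASE}/Patient/123",
          headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
                   "Accept": "application/fhir+json"},
          timeout=30,
      )
      patient = resp.json()
      print(patient.get("resourceType"), patient.get("birthDate"))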

  20. Open Source Interoperability: It's More than Technology

    Directory of Open Access Journals (Sweden)

    Dominic Sartorio

    2008-01-01

    Full Text Available The Open Solutions Alliance is a consortium of leading commercial open source vendors, integrators and end users dedicated to the growth of open source based solutions in the enterprise. We believe Linux and other infrastructure software, such as Apache, have become mainstream, and packaged solutions represent the next great growth opportunity. However, some unique challenges can temper that opportunity. These challenges include getting the word out about the maturity and enterprise-readiness of those solutions, ensuring interoperability both with each other and with other proprietary and legacy solutions, and ensuring healthy collaboration between vendors and their respective customer and developer communities.

  1. The Italian Cloud-based brokering Infrastructure to sustain Interoperability for Operative Hydrology

    Science.gov (United States)

    Boldrini, E.; Pecora, S.; Bussettini, M.; Bordini, F.; Nativi, S.

    2015-12-01

    This work presents the informatics platform carried out to implement the National Hydrological Operative Information System of Italy. In particular, the presentation will focus on the governing aspects of the cloud infrastructure and brokering software that make it possible to sustain the hydrology data flow between heterogeneous user clients and data providers. The Institute for Environmental Protection and Research, ISPRA (Istituto Superiore per la Protezione e la Ricerca Ambientale), in collaboration with the Regional Agency for Environmental Protection in the Emilia-Romagna region, ARPA-ER (Agenzia Regionale per la Prevenzione e l'Ambiente dell'Emilia-Romagna), and CNR-IIA (National Research Council of Italy) designed and developed an innovative platform for the discovery and access of hydrological data coming from 19 Italian administrative regions and 2 Italian autonomous provinces, in near real time. ISPRA has deployed and governs such a system. The presentation will introduce and discuss the technological barriers to interoperability as well as social and policy ones. The adopted solutions will be described, outlining the sustainability challenges and benefits.

  2. Buildings Interoperability Landscape

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Dave [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stephan, Eric G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wang, Weimin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Corbin, Charles D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Widergren, Steven E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-12-31

    Through its Building Technologies Office (BTO), the United States Department of Energy’s Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO’s mission to enhance energy efficiency and save energy for economic and environmental purposes. For ecosystems of connected-buildings products and services from various manufacturers to flourish, the information and communications technology (ICT) aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.

  3. Intercloud Architecture for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.J.; Ngo, C.

    2011-01-01

    This paper presents on-going research to develop the Intercloud Architecture (ICA) Framework that should address problems in multi-provider multi-domain heterogeneous Cloud based infrastructure services and applications integration and interoperability, including integration and interoperability

  4. Network Function Virtualization (NFV) based architecture to address connectivity, interoperability and manageability challenges in Internet of Things (IoT)

    Science.gov (United States)

    Haseeb, Shariq; Hashim, Aisha Hassan A.; Khalifa, Othman O.; Faris Ismail, Ahmad

    2017-11-01

    IoT aims to interconnect sensors and actuators built into devices (also known as Things) in order for them to share data and control each other to improve existing processes and make people’s lives better. IoT aims to connect all physical devices, like fridges, cars, utilities, buildings and cities, so that they can take advantage of small pieces of information collected by each one of these devices and derive more complex decisions. However, these devices are heterogeneous in nature because of varying vendor support, connectivity options and protocol suites. The heterogeneity of such devices makes it difficult for them to leverage each other’s capabilities in the traditional IoT architecture. This paper highlights the effects of heterogeneity challenges on connectivity, interoperability and manageability in greater detail. It also surveys some of the existing solutions adopted in the core network to solve the challenges of massive IoT deployments. Finally, the paper proposes a new architecture based on NFV to address these problems.

  5. Semantically Interoperable XML Data.

    Science.gov (United States)

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-09-01

    XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups.
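
    For the syntactic half of what this abstract describes -- XML data sources conforming to a common XML Schema -- a minimal validation step might look like the sketch below, assuming the Python `lxml` library and hypothetical file names; the semantic-annotation layer described in the paper would sit on top of such a check.

```python
from lxml import etree

# Hypothetical file names; the .xsd plays the role of the shared XML Schema
# that gives the data sources syntactic interoperability.
schema = etree.XMLSchema(etree.parse("image_annotation.xsd"))
doc = etree.parse("annotation_instance.xml")

if schema.validate(doc):
    print("Instance is syntactically valid against the common schema.")
else:
    # Semantic validation (common data elements linked to ontology concepts)
    # would be an additional step beyond this structural check.
    for error in schema.error_log:
        print(error.line, error.message)
```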

  6. Development of a gateway for interoperability in community-based care: An empirical study.

    Science.gov (United States)

    Ota, Sakiko; Kudo, Ken-Ichi; Taguchi, Kenta; Ihori, Mikio; Yoshie, Satoru; Yamamoto, Takuma; Sudoh, Osamu; Tsuji, Tetsuo; Iijima, Katsuya

    2018-01-01

    Information and communications technology has attracted attention as a useful way of sharing care records in community-based care. Such information sharing systems, however, imposed the burden of inputting the same records into different information systems due to a lack of interoperability between the systems. The purpose of this study was to develop a gateway that links information systems and to investigate the functionality and usability of the gateway through an empirical study. We developed a gateway with healthcare and welfare professionals in Kashiwa city, Japan. The gateway system consisted of two sub-systems: a data exchange sub-system and a common sub-system. For security, we used Transport Layer Security 1.2 and a public key infrastructure. For document formats, we utilized Health Level Seven International (HL7), Extensible Markup Language (XML), and Portable Document Format (PDF). In addition, we performed an empirical study with 11 scenarios involving four simulated patients and a questionnaire survey of the professionals. Professionals from eight occupations participated in the empirical study and verified that the gateway could link the information systems of six vendors. In the questionnaire survey, 32 of the 40 professionals reported that the gateway would eliminate the burden of inputting the same records into different information systems.

  7. Enhancing Data Interoperability with Web Services

    Science.gov (United States)

    Shrestha, S. R.; Zimble, D. A.; Wang, W.; Herring, D.; Halpert, M.

    2014-12-01

    In an effort to improve data access and interoperability of climate and weather data, the National Oceanic and Atmospheric Administration's (NOAA) Climate.gov and Climate Prediction Center (CPC) are exploring various platform solutions to enhance a user's ability to locate, preview, and acquire the data. The Climate.gov and CPC data team faces multiple challenges including the various kinds of data and formats, inconsistency of metadata records, variety of data service implementations, very large volumes of data and geographically distributed locations. We have created the Data Access and Interoperability project to design a web-based platform, where interoperability between systems can be leveraged to allow greater data discovery, access, visualization and delivery. In the interoperable data platform, systems can integrate with each other to support the synthesis of climate and weather data. Interoperability is the ability for users to discover the available climate and weather data, preview and interact with the data, and acquire the data in common digital formats through a simple web-based interface. The goal of the interoperable data platform is to leverage existing web services, implement the established standards and integrate with existing solutions across the earth sciences domain instead of creating new technologies. Towards this effort to improve the interoperability of the platform, we are collaborating with ESRI Inc. to provide climate and weather data via web services. In this presentation, we will discuss and demonstrate how to use ArcGIS to author RESTful based scientific web services using open standards. These web services are able to encapsulate the logic required to handle and describe scientific data through a variety of service types including, image, map, feature, geoprocessing, and their respective service methods. Combining these types of services and leveraging well-documented APIs, including the ArcGIS JavaScript API, we can afford to
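
    To make the idea of RESTful, standards-based scientific web services more concrete, the sketch below queries a hypothetical ArcGIS REST image service endpoint for its metadata using the Python `requests` library; the host and service names are placeholders, not actual Climate.gov or CPC endpoints.

```python
import requests

# Hypothetical service URL following the ArcGIS REST API layout
# (https://<host>/arcgis/rest/services/<folder>/<service>/ImageServer).
SERVICE_URL = "https://gis.example.org/arcgis/rest/services/climate/sst_anomaly/ImageServer"

# f=json asks the REST endpoint for a machine-readable service description.
meta = requests.get(SERVICE_URL, params={"f": "json"}, timeout=10).json()
print(meta.get("name"), meta.get("description"))
print("Extent:", meta.get("extent"))   # spatial coverage advertised by the service
```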

  8. RuleML-Based Learning Object Interoperability on the Semantic Web

    Science.gov (United States)

    Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.

    2008-01-01

    Purpose: The present paper aims to describe an approach for building the Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one of the rule uses: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…

  9. Achieving control and interoperability through unified model-based systems and software engineering

    Science.gov (United States)

    Rasmussen, Robert; Ingham, Michel; Dvorak, Daniel

    2005-01-01

    Control and interoperation of complex systems is one of the most difficult challenges facing NASA's Exploration Systems Mission Directorate. An integrated but diverse array of vehicles, habitats, and supporting facilities, evolving over the long course of the enterprise, must perform ever more complex tasks while moving steadily away from the sphere of ground support and intervention.

  10. Interoperability of CAD Standards and Robotics in CIME

    DEFF Research Database (Denmark)

    Sørensen, Torben

    The research presented in this dissertation concerns the identification of problems and the provision of solutions for increasing the degree of interoperability between CAD, CACSD (Computer Aided Control Systems Design) and CAR (Computer Aided Robotics) in Computer Integrated Manufacturing and Engineering (CIME). A central contribution is the development of a STEP-based interface for general control system data and functions, especially related to robot motion control, for the interoperability of CAD, CACSD, and CAR systems and for the extension of inter-system communication capabilities beyond the stage achieved up to now.

  11. Data Interoperability

    Directory of Open Access Journals (Sweden)

    Pasquale Pagano

    2013-07-01

    Full Text Available In the context of scientific investigations, data have acquired an ever growing leading role, while their large scale, cross-community and cross-domain sharing has concurred to identify new investigation paradigms (Hey, Tansley, & Tolle, 2009). Unfortunately, data interoperability – a mandatory prerequisite for achieving the above scenarios – is still a difficult open research challenge. Both the “data” and “interoperability” concepts are difficult to fully pin down and actually lead to different perceptions in diverse communities. This problem is further amplified when considered in the context of (global) research data infrastructures that are expected to serve a plethora of communities of practice (Lave & Wenger, 1991) potentially involved in very diverse application scenarios, each characterised by a specific sharing problem.

  12. Achieving Interoperability Through Base Registries for Governmental Services and Document Management

    Science.gov (United States)

    Charalabidis, Yannis; Lampathaki, Fenareti; Askounis, Dimitris

    As digital infrastructures increase their presence worldwide, following the efforts of governments to provide citizens and businesses with high-quality one-stop services, there is a growing need for the systematic management of those newly defined and constantly transforming processes and electronic documents. E-government Interoperability Frameworks usually cater to the technical standards of e-government systems interconnection, but do not address service composition and use by citizens, businesses, or other administrations.

  13. Architectural Blueprint for Plate Boundary Observatories based on interoperable Data Management Platforms

    Science.gov (United States)

    Kerschke, D. I.; Häner, R.; Schurr, B.; Oncken, O.; Wächter, J.

    2014-12-01

    Interoperable data management platforms play an increasing role in the advancement of knowledge and technology in many scientific disciplines. Through high quality services they support the establishment of efficient and innovative research environments. Well-designed research environments can facilitate the sustainable utilization, exchange, and re-use of scientific data and functionality by using standardized community models. Together with innovative 3D/4D visualization, these concepts provide added value in improving scientific knowledge-gain, even across the boundaries of disciplines. A project benefiting from the added value is the Integrated Plate boundary Observatory in Chile (IPOC). IPOC is a European-South American network to study earthquakes and deformation at the Chilean continental margin and to monitor the plate boundary system for capturing an anticipated great earthquake in a seismic gap. In contrast to conventional observatories that monitor individual signals only, IPOC captures a large range of different processes through various observation methods (e.g., seismographs, GPS, magneto-telluric sensors, creep-meter, accelerometer, InSAR). For IPOC a conceptual design has been devised that comprises an architectural blueprint for a data management platform based on common and standardized data models, protocols, and encodings as well as on an exclusive use of Free and Open Source Software (FOSS) including visualization components. Following the principles of event-driven service-oriented architectures, the design enables novel processes by sharing and re-using functionality and information on the basis of innovative data mining and data fusion technologies. This platform can help to improve the understanding of the physical processes underlying plate deformations as well as the natural hazards induced by them. Through the use of standards, this blueprint can not only be facilitated for other plate observing systems (e.g., the European Plate

  14. Promoting A-Priori Interoperability of HLA-Based Simulations in the Space Domain: The SISO Space Reference FOM Initiative

    Science.gov (United States)

    Moller, Bjorn; Garro, Alfredo; Falcone, Alberto; Crues, Edwin Z.; Dexter, Daniel E.

    2016-01-01

    Distributed and Real-Time Simulation plays a key role in the Space domain, being exploited for mission and systems analysis and engineering as well as for crew training and operational support. One of the most popular standards is the 1516-2010 IEEE Standard for Modeling and Simulation (M&S) High Level Architecture (HLA). HLA supports the implementation of distributed simulations (called Federations) in which a set of simulation entities (called Federates) can interact using a Run-Time Infrastructure (RTI). In a given Federation, a Federate can publish and/or subscribe to objects and interactions on the RTI only in accordance with their structures as defined in a FOM (Federation Object Model). Currently, the Space domain is characterized by a set of incompatible FOMs that, although they meet the specific needs of different organizations and projects, increase the long-term cost of interoperability. In this context, the availability of a reference FOM for the Space domain will enable the development of interoperable HLA-based simulators for related joint projects and collaborations among worldwide organizations involved in the Space domain (e.g. NASA, ESA, Roscosmos, and JAXA). The paper presents a first set of results achieved by a SISO standardization effort that aims at providing a Space Reference FOM for international collaboration on Space systems simulations.

  15. Future Interoperability of Camp Protection Systems (FICAPS)

    Science.gov (United States)

    Caron, Sylvie; Gündisch, Rainer; Marchand, Alain; Stahl, Karl-Hermann

    2013-05-01

    The FICAPS Project has been established as a Project of the European Defence Agency based on an initiative of Germany and France. The goal of this Project was to derive Guidelines which, by a proper implementation in future developments, improve Camp Protection Systems (CPS) by enabling and improving interoperability between the Camp Protection Systems and equipment of the different Nations involved in multinational missions. These Guidelines shall allow for: • real-time information exchange between equipment and systems of different suppliers and nations (even via SatCom); • quick and easy replacement of equipment (even of different Nations) at run-time in the field by means of a plug-and-play capability, thus lowering the operational and logistic costs and making the system highly available; • enhancement of system capabilities (open and modular systems) by adding new equipment with new capabilities (just plug in, with automatic adjustment of the Human Machine Interface, HMI) without costly and time-consuming validation and test on system level (validation and test can be done on equipment level). Four scenarios have been identified to summarize the interoperability requirements from an operational viewpoint. To prove the definitions given in the Guideline Document, a French and a German Demonstration System, based on existing national assets, were realized. Demonstrations, showing the capabilities given by the defined interoperability requirements with respect to the operational scenarios, were performed. Demonstrations included remote control of a CPS by another CPS, remote sensor control (Electro-Optic/InfraRed, EO/IR) and remote effector control. This capability can be applied to extend the protection area or to protect distant infrastructural assets. The required interoperability functionality was shown successfully. Even if the focus of the FICAPS project was on camp protection, the solution found is also appropriate for other

  16. Turning Interoperability Operational with GST

    Science.gov (United States)

    Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha

    2013-04-01

    GST - Geosciences in space and time is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originates from TUBAF's contribution to the EU project "ProMine", and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides the basic components of a geodata infrastructure as required to establish interoperability with respect to geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects, cf. Interoperability Solutions for European Public Administrations (ISA), cf. http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European Geological Surveys acting in a border region, means in particular the provision of IT technology to exchange spatially and possibly also temporally indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels capturing the geometry, topology, and various geoscience contents. Geodata Infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE, to name just the most prominent ones. Then there are quite a few markup languages (ML) related to geographical or geological information, like GeoSciML, EarthResourceML, BoreholeML, and ResqML for reservoir characterization, earth and reservoir models, and many others featuring geoscience information. Several Web Services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more. It will be clarified how GST is related to these initiatives, especially

  17. Postmarketing Safety Study Tool: A Web Based, Dynamic, and Interoperable System for Postmarketing Drug Surveillance Studies

    Directory of Open Access Journals (Sweden)

    A. Anil Sinaci

    2015-01-01

    Full Text Available Postmarketing drug surveillance is a crucial aspect of the clinical research activities in pharmacovigilance and pharmacoepidemiology. Successful utilization of available Electronic Health Record (EHR) data can complement and strengthen postmarketing safety studies. In terms of the secondary use of EHRs, access and analysis of patient data across different domains are a critical factor; we address this data interoperability problem between EHR systems and clinical research systems in this paper. We demonstrate that this problem can be solved at an upper level with the use of common data elements in a standardized fashion, so that clinical researchers can work with different EHR systems independently of the underlying information model. The Postmarketing Safety Study Tool lets clinical researchers extract data from different EHR systems by designing data collection set schemas through common data elements. The tool interacts with a semantic metadata registry through the IHE data element exchange profile. The Postmarketing Safety Study Tool and its supporting components have been implemented and deployed on the central data warehouse of the Lombardy region, Italy, which contains anonymized records of about 16 million patients with over 10 years of longitudinal data on average. Clinical researchers at Roche validated the tool with real-life use cases.

  18. Data Distribution Service-Based Interoperability Framework for Smart Grid Testbed Infrastructure

    Directory of Open Access Journals (Sweden)

    Tarek A. Youssef

    2016-03-01

    Full Text Available This paper presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure were developed in order to facilitate interoperability and remote access to the testbed. This interface allows control, monitoring, and performing experiments remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
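
    A data-centric publish/subscribe bus of the kind described here can be sketched with a DDS binding. The fragment below assumes the Eclipse Cyclone DDS Python package (`cyclonedds`) and an invented `GridMeasurement` type; it only illustrates the data-centric pattern and is not the testbed's actual toolbox or API.

```python
from dataclasses import dataclass
from cyclonedds.domain import DomainParticipant
from cyclonedds.topic import Topic
from cyclonedds.pub import DataWriter
from cyclonedds.sub import DataReader
from cyclonedds.idl import IdlStruct

@dataclass
class GridMeasurement(IdlStruct, typename="GridMeasurement"):
    node_id: str     # measurement source, e.g. a smart meter or relay
    voltage: float

# Participants share data by topic name and type; peer discovery is automatic,
# so there is no central broker to act as a single point of failure.
dp = DomainParticipant()
topic = Topic(dp, "GridMeasurements", GridMeasurement)
writer = DataWriter(dp, topic)
reader = DataReader(dp, topic)

writer.write(GridMeasurement(node_id="feeder-7", voltage=239.6))
for sample in reader.take(10):   # drain available samples (exact signature varies by version)
    print(sample.node_id, sample.voltage)
```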

  19. Interoperation Modeling for Intelligent Domotic Environments

    Science.gov (United States)

    Bonino, Dario; Corno, Fulvio

    This paper introduces an ontology-based model for domotic device interoperation. Starting from a previously published ontology (DogOnt), a refactoring and extension is described that allows device capabilities, states and commands to be represented explicitly, and that supports abstract modeling of device interoperation.

  20. Innovation in OGC: The Interoperability Program

    Directory of Open Access Journals (Sweden)

    George Percivall

    2015-10-01

    Full Text Available The OGC Interoperability Program is a source of innovation in the development of open standards. The approach to innovation is based on hands-on, collaborative engineering leading to more mature standards and implementations. The process of the Interoperability Program engages a community of sponsors and participants based on an economic model that benefits all involved. Each initiative begins with an innovative approach to identify interoperability needs, followed by agile software development to advance the state of technology to the benefit of society. Over eighty initiatives have been conducted in the Interoperability Program since the breakthrough Web Mapping Testbed began the program in 1999. OGC standards that were initiated in the Interoperability Program are the basis of two-thirds of the certified compliant products.

  1. Standards-based sensor interoperability and networking SensorWeb: an overview

    Science.gov (United States)

    Bolling, Sam

    2012-06-01

    The warfighter lacks a unified Intelligence, Surveillance, and Reconnaissance (ISR) environment to conduct mission planning, command and control (C2), tasking, collection, exploitation, processing, and data discovery of disparate sensor data across the ISR Enterprise. Legacy sensors and applications are not standardized or integrated for assured, universal access. Existing tasking and collection capabilities are not unified across the enterprise, inhibiting robust C2 of ISR, including near-real-time, cross-cueing operations. To address these critical needs, the National Measurement and Signature Intelligence (MASINT) Office (NMO) and partnering Combatant Commands and Intelligence Agencies are developing SensorWeb, an architecture that harmonizes heterogeneous sensor data to a common standard for users to discover, access, observe, subscribe to and task sensors. The SensorWeb initiative's long-term goal is to establish an open, commercial standards-based, service-oriented framework to facilitate plug-and-play sensors. The current development effort will produce non-proprietary deliverables, intended as a Government off the Shelf (GOTS) solution to address the U.S. and Coalition nations' inability to quickly and reliably detect, identify, map, track, and fully understand security threats and operational activities.

  2. Interoperability Strategic Vision

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Knight, Mark R.; Melton, Ronald B.; Narang, David; Martin, Maurice; Nordman, Bruce; Khandekar, Aditya; Hardy, Keith S.

    2018-02-28

    The Interoperability Strategic Vision whitepaper aims to promote a common understanding of the meaning and characteristics of interoperability and to provide a strategy to advance the state of interoperability as applied to integration challenges facing grid modernization. This includes addressing the quality of integrating devices and systems and the discipline to improve the process of successfully integrating these components as business models and information technology improve over time. The strategic vision for interoperability described in this document applies throughout the electric energy generation, delivery, and end-use supply chain. Its scope includes interactive technologies and business processes from bulk energy levels to lower voltage level equipment and the millions of appliances that are becoming equipped with processing power and communication interfaces. A transformational aspect of a vision for interoperability in the future electric system is the coordinated operation of intelligent devices and systems at the edges of grid infrastructure. This challenge offers an example for addressing interoperability concerns throughout the electric system.

  3. Flexible Language Interoperability

    DEFF Research Database (Denmark)

    Ekman, Torbjörn; Mechlenborg, Peter; Schultz, Ulrik Pagh

    2007-01-01

    Virtual machines raise the abstraction level of the execution environment at the cost of restricting the set of supported languages. Moreover, the ability of a language implementation to integrate with other languages hosted on the same virtual machine typically constrains the features of the language. In this paper, we present a highly flexible yet efficient approach to hosting multiple programming languages on an object-oriented virtual machine. Our approach is based on extending the interface of each class with language-specific wrapper methods, offering each language a tailored view of a given class. This approach can be deployed both on a statically typed virtual machine, such as the JVM, and on a dynamic virtual machine, such as a Smalltalk virtual machine. We have implemented our approach to language interoperability on top of a prototype virtual machine for embedded systems.

  4. Component-Based Modelling for Scalable Smart City Systems Interoperability: A Case Study on Integrating Energy Demand Response Systems.

    Science.gov (United States)

    Palomar, Esther; Chen, Xiaohong; Liu, Zhiming; Maharjan, Sabita; Bowen, Jonathan

    2016-10-28

    Smart city systems embrace major challenges associated with climate change, energy efficiency, mobility and future services by embedding the virtual space into a complex cyber-physical system. Those systems are constantly evolving and scaling up, involving a wide range of integration among users, devices, utilities, public services and also policies. Modelling such complex dynamic systems' architectures has always been essential for the development and application of techniques/tools to support the design and deployment of integrations of new components, as well as for the analysis, verification, simulation and testing needed to ensure trustworthiness. This article reports on the definition and implementation of a scalable component-based architecture that supports a cooperative energy demand response (DR) system coordinating energy usage between neighbouring households. The proposed architecture, called refinement of Cyber-Physical Component Systems (rCPCS), which extends the refinement calculus for component and object systems (rCOS) modelling method, is implemented using the Eclipse Extensible Coordination Tools (ECT), i.e., the Reo coordination language. With the rCPCS implementation in Reo, we specify the communication, synchronisation and cooperation amongst the heterogeneous components of the system, assuring, by design, the scalability, interoperability and correctness of component cooperation.

  5. Supporting interoperability of collaborative networks through engineering of a service-based Mediation Information System (MISE 2.0)

    Science.gov (United States)

    Benaben, Frederick; Mu, Wenxin; Boissel-Dallier, Nicolas; Barthe-Delanoe, Anne-Marie; Zribi, Sarah; Pingaud, Herve

    2015-08-01

    The Mediation Information System Engineering project is currently finishing its second iteration (MISE 2.0). The main objective of this scientific project is to provide any emerging collaborative situation with methods and tools to deploy a Mediation Information System (MIS). MISE 2.0 aims at defining and designing a service-based platform, dedicated to initiating and supporting the interoperability of collaborative situations among potential partners. This MISE 2.0 platform implements a model-driven engineering approach to the design of a service-oriented MIS dedicated to supporting the collaborative situation. This approach is structured in three layers, each providing their own key innovative points: (i) the gathering of individual and collaborative knowledge to provide appropriate collaborative business behaviour (key point: knowledge management, including semantics, exploitation and capitalisation), (ii) deployment of a mediation information system able to computerise the previously deduced collaborative processes (key point: the automatic generation of collaborative workflows, including connection with existing devices or services) (iii) the management of the agility of the obtained collaborative network of organisations (key point: supervision of collaborative situations and relevant exploitation of the gathered data). MISE covers business issues (through BPM), technical issues (through an SOA) and agility issues of collaborative situations (through EDA).

  6. Towards Interoperable Preservation Repositories: TIPR

    Directory of Open Access Journals (Sweden)

    Priscilla Caplan

    2010-07-01

    Full Text Available Towards Interoperable Preservation Repositories (TIPR) is a project funded by the Institute of Museum and Library Services to create and test a Repository eXchange Package (RXP). The package will make it possible to transfer complex digital objects between dissimilar preservation repositories. For reasons of redundancy, succession planning and software migration, repositories must be able to exchange copies of archival information packages with each other. Every different repository application, however, describes and structures its archival packages differently. Therefore each system produces dissemination packages that are rarely understandable or usable as submission packages by other repositories. The RXP is an answer to that mismatch. Other solutions for transferring packages between repositories focus either on transfers between repositories of the same type, such as DSpace-to-DSpace transfers, or on processes that rely on central translation services. Rather than build translators between many dissimilar repository types, the TIPR project has defined a standards-based package of metadata files that can act as an intermediary information package, the RXP, a lingua franca all repositories can read and write.

  7. Intercloud Architecture Framework for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.J.

    2013-01-01

    This report presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses interoperability and integration issues in multi-provider multi-domain heterogeneous Cloud based infrastructure services and applications provisioning, including integration and

  8. Design challenges and gaps in standards in developing an interoperable zero footprint DI thin client for use in image-enabled electronic health record solutions

    Science.gov (United States)

    Agrawal, Arun; Koff, David; Bak, Peter; Bender, Duane; Castelli, Jane

    2015-03-01

    The deployment of regional and national Electronic Health Record solutions has been a focus of many countries throughout the past decade. A major challenge for these deployments has been support for ubiquitous image viewing. More specifically, these deployments require an imaging solution that can work over the Internet, leverage any point-of-service device: desktop, tablet, phone; and access imaging data from any source seamlessly. Whereas standards exist to enable ubiquitous image viewing, few if any solutions exist that leverage these standards and meet the challenge. Rather, most of the currently available web-based DI viewing solutions are either proprietary or require special plugins. We developed a true zero-footprint, browser-based DI viewing solution based on the Web Access to DICOM Objects (WADO) and Cross-enterprise Document Sharing for Imaging (XDS-I.b) standards to a) demonstrate that a truly ubiquitous image viewer can be deployed; and b) identify the gaps in the current standards and the design challenges in developing such a solution. The objective was to develop a viewer which works in all modern browsers on both desktop and mobile devices. The implementation allows basic viewing functionality: scroll, zoom, pan and window leveling (limited). The major gap identified in the current DICOM WADO standard is the lack of support for any kind of 3D reconstruction or MPR views. Other design challenges explored include considerations related to optimization of the solution for response time and a low memory footprint.
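
    As an illustration of the WADO mechanism underpinning such zero-footprint viewers, the sketch below issues a hypothetical WADO-URI request for a single DICOM object rendered as JPEG; the host and UIDs are placeholders and the Python `requests` library is assumed, so this shows the standard's request pattern rather than the authors' implementation.

```python
import requests

# Hypothetical WADO-URI endpoint and identifiers; a real viewer obtains the
# study/series/object UIDs from an XDS-I.b manifest or a DICOM query.
WADO_URL = "https://pacs.example.org/wado"
params = {
    "requestType": "WADO",
    "studyUID": "1.2.840.113619.2.55.1111",
    "seriesUID": "1.2.840.113619.2.55.1111.1",
    "objectUID": "1.2.840.113619.2.55.1111.1.1",
    "contentType": "image/jpeg",   # ask for a browser-renderable rendition
}

resp = requests.get(WADO_URL, params=params, timeout=30)
resp.raise_for_status()
with open("slice.jpg", "wb") as f:
    f.write(resp.content)
```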

  9. WellnessRules: A Web 3.0 Case Study in RuleML-Based Prolog-N3 Profile Interoperation

    Science.gov (United States)

    Boley, Harold; Osmun, Taylor Michael; Craig, Benjamin Larry

    An interoperation study, WellnessRules, is described, where rules about wellness opportunities are created by participants in rule languages such as Prolog and N3, and translated within a wellness community using RuleML/XML. The wellness rules are centered around participants, as profiles, encoding knowledge about their activities conditional on the season, the time-of-day, the weather, etc. This distributed knowledge base extends FOAF profiles with a vocabulary and rules about wellness group networking. The communication between participants is organized through Rule Responder, permitting wellness-profile translation and distributed querying across engines. WellnessRules interoperates between rules and queries in the relational (Datalog) paradigm of the pure-Prolog subset of POSL and in the frame (F-logic) paradigm of N3. An evaluation of Rule Responder instantiated for WellnessRules revealed acceptable Web response times.

  10. A Cloud Interoperability Broker (CIB) for data migration in SaaS

    Directory of Open Access Journals (Sweden)

    Hassan Ali

    2016-12-01

    Full Text Available Cloud computing is becoming increasingly popular. Information technology market leaders, e.g., Microsoft, Google, and Amazon, are extensively shifting toward cloud-based solutions. However, there is isolation in the cloud implementations provided by the cloud vendors. Limited interoperability can cause one user to adhere to a single cloud provider; thus, a required migration of an application or data from one cloud provider to another may necessitate a significant effort and/or full-cycle redevelopment to fit the new provider's standards and implementation. The ability to move from one cloud vendor to another would be a step toward advancing cloud computing interoperability and increasing customer trust. This study proposes a cloud broker solution to fill the interoperability gap between different software-as-a-service providers. The proposed cloud broker was implemented and tested on a real enterprise application dataset. The migration process was completed and it worked correctly, according to a specified mapping model.

  11. Towards semantic interoperability for electronic health records.

    Science.gov (United States)

    Garde, Sebastian; Knaup, Petra; Hovenga, Evelyn; Heard, Sam

    2007-01-01

    In the field of open electronic health records (EHRs), openEHR as an archetype-based approach is being increasingly recognised. It is the objective of this paper to shortly describe this approach, and to analyse how openEHR archetypes impact on health professionals and semantic interoperability. Analysis of current approaches to EHR systems, terminology and standards developments. In addition to literature reviews, we organised face-to-face and additional telephone interviews and tele-conferences with members of relevant organisations and committees. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability -- both important prerequisites for semantic interoperability. Archetypes enable the formal definition of clinical content by clinicians. To enable comprehensive semantic interoperability, the development and maintenance of archetypes needs to be coordinated internationally and across health professions. Domain knowledge governance comprises a set of processes that enable the creation, development, organisation, sharing, dissemination, use and continuous maintenance of archetypes. It needs to be supported by information technology. To enable EHRs, semantic interoperability is essential. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability. However, without coordinated archetype development and maintenance, 'rank growth' of archetypes would jeopardize semantic interoperability. We therefore believe that openEHR archetypes and domain knowledge governance together create the knowledge environment required to adopt EHRs.

  12. Interoperability of medical device information and the clinical applications: an HL7 RMIM based on the ISO/IEEE 11073 DIM.

    Science.gov (United States)

    Yuksel, Mustafa; Dogac, Asuman

    2011-07-01

    Medical devices are essential to the practice of modern healthcare services. Their benefits will increase if clinical software applications can seamlessly acquire the medical device data. The need to represent medical device observations in a format that can be consumed by clinical applications has already been recognized by the industry. Yet, the solutions proposed involve bilateral mappings from the ISO/IEEE 11073 Domain Information Model (DIM) to specific message or document standards. Considering that there are many different types of clinical applications, such as electronic health record and personal health record systems, clinical workflows, and clinical decision support systems, each conforming to different standard interfaces, detailing a mapping mechanism for every one of them introduces significant work and, thus, limits the potential health benefits of medical devices. In this paper, to facilitate the interoperability of clinical applications and medical device data, we use the ISO/IEEE 11073 DIM to derive an HL7 v3 Refined Message Information Model (RMIM) of the medical device domain from the HL7 v3 Reference Information Model (RIM). This makes it possible to trace the medical device data back to a standard common denominator, that is, the HL7 v3 RIM, from which all the other medical domains under HL7 v3 are derived. Hence, once the medical device data are obtained in the RMIM format, they can easily be transformed into HL7-based standard interfaces through XML transformations, because these interfaces all have their building blocks from the same RIM. To demonstrate this, we provide the mappings from the developed RMIM to some of the widely used HL7 v3-based standard interfaces.
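
    The final step mentioned above -- turning RMIM-formatted device data into other HL7 v3-based interfaces via XML transformations -- can be sketched as a stylesheet-driven pipeline. The file names and stylesheet are hypothetical; only the general use of XSLT through the Python `lxml` library is assumed here, not the authors' actual mappings.

```python
from lxml import etree

# Hypothetical inputs: an RMIM-conformant device observation and an XSLT
# stylesheet encoding the mapping to a target HL7 v3-based interface.
rmim_doc = etree.parse("device_observation_rmim.xml")
transform = etree.XSLT(etree.parse("rmim_to_target_interface.xsl"))

target_doc = transform(rmim_doc)   # apply the mapping
print(etree.tostring(target_doc, pretty_print=True).decode())
```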

  13. Sensor Interoperability and Fusion in Fingerprint Verification: A Case Study using Minutiae-and Ridge-Based Matchers

    NARCIS (Netherlands)

    Alonso-Fernandez, F.; Veldhuis, Raymond N.J.; Bazen, A.M.; Fierrez-Aguilar, J.; Ortega-Garcia, J.

    2006-01-01

    Information fusion in fingerprint recognition has been studied in several papers. However, only a few papers have been focused on sensor interoperability and sensor fusion. In this paper, these two topics are studied using a multisensor database acquired with three different fingerprint sensors.

  14. An Interoperable Electronic Medical Record-Based Platform for Personalized Predictive Analytics

    Science.gov (United States)

    Abedtash, Hamed

    2017-01-01

    Precision medicine refers to the delivering of customized treatment to patients based on their individual characteristics, and aims to reduce adverse events, improve diagnostic methods, and enhance the efficacy of therapies. Among efforts to achieve the goals of precision medicine, researchers have used observational data for developing predictive…

  15. Telefacturing Based Distributed Manufacturing Environment for Optimal Manufacturing Service by Enhancing the Interoperability in the Hubs

    Directory of Open Access Journals (Sweden)

    V. K. Manupati

    2017-01-01

    Full Text Available Recent developments surrounding the manufacturing sector are leading to intense progress towards effective distributed collaborative manufacturing environments. This evolving collaborative manufacturing not only focuses on the digitalisation of this environment but also necessitates a service-dependent manufacturing system that offers an uninterrupted approach to a number of diverse, complicated, dynamic manufacturing operations management systems at a common work place (hub). This research presents a novel telefacturing-based distributed manufacturing environment for recommending manufacturing services based on user preferences. The first step in this direction is to deploy the most advanced tools and techniques, that is, Ontology-based Protégé 5.0 software for transforming the huge stored knowledge/information into XML schemas of Web Ontology Language (OWL) documents, and Integration of Process Planning and Scheduling (IPPS) for multijobs in a collaborative manufacturing system. Thereafter, we also investigate the possibilities of allocating skilled workers to the best feasible operations sequence. In this context, a mathematical model is formulated for the considered objectives, that is, minimization of the makespan and of the total training cost of the workers. With an evolutionary algorithm and a developed heuristic algorithm, the performance of the proposed manufacturing system has been improved. Finally, to manifest the capability of the proposed approach, an illustrative example from the real-time manufacturing industry is validated for optimal service recommendation.

  16. Semantic Interoperable Electronic Patient Records: The Unfolding of Consensus based Archetypes.

    Science.gov (United States)

    Pedersen, Rune; Wynn, Rolf; Ellingsen, Gunnar

    2015-01-01

    This paper is a status report from a large-scale openEHR-based EPR project from the North Norway Regional Health Authority, encouraged by the unfolding of a national repository for openEHR archetypes. Clinicians need to engage in, and be responsible for, the production of archetypes. The consensus processes have so far been challenged by a low number of active clinicians, a lack of the critical specialties needed to reach consensus, and a cumbersome review process (3 or 4 review rounds) for each archetype. The goal is to have several clinicians from each specialty as a backup if one is unable to participate. Archetypes and their importance for structured data and the sharing of information have to become more visible to clinicians through a sharper information practice.

  17. Interoperable computerized smart card based system for health insurance and health services applied in cardiology.

    Science.gov (United States)

    Cocei, Horia-Delatebea; Stefan, Livia; Dobre, Ioana; Croitoriu, Mihai; Sinescu, Crina; Ovricenco, Eduard

    2002-01-01

    In 1999 Romania started its health care reform by promulgating the Health Insurance Law. A functional and efficient health care system needs procedures for monitoring and evaluating medical services, communication between the different service providers and entities involved in the system, and integration and availability of information. The final goal is a good response to the needs and demands of patients and of real life. For this project we took into account, on one hand, the immediate need for computerized systems for health care providers and, on the other hand, the large number of trials and experiments with health smart cards across Europe. Our project will implement a management system based on electronic patient records to be used in all cardiology clinics and will trial health smart cards, promoting and demonstrating the capabilities of smart card technology. We focused our attention on a specific and critical category of patients, those with heart disease, and on a critical sector of the health care system -- emergency care. The patient card was tested on 150 patients at a cardiology clinic in Bucharest. This was the first trial of a health smart card in Romania.

  18. Achieving mask order processing automation, interoperability and standardization based on P10

    Science.gov (United States)

    Rodriguez, B.; Filies, O.; Sadran, D.; Tissier, Michel; Albin, D.; Stavroulakis, S.; Voyiatzis, E.

    2007-02-01

    Last year the MUSCLE (Masks through User's Supply Chain: Leadership by Excellence) project was presented. Here we report on the project's progress. A key process in mask supply chain management is the exchange of technical information for ordering masks. This process is large, complex, company specific and error prone, and leads to longer cycle times and higher costs due to missing or wrong inputs. Its automation and standardization could produce significant benefits. We need to agree on the standard for mandatory and optional parameters, and also on a common way to describe parameters when ordering. A system was created to improve performance in terms of Key Performance Indicators (KPIs) such as cycle time and cost of production. This tool allows us to evaluate and measure the effect of factors, as well as the effect of implementing the improvements of the complete project. Next, a benchmark study and a gap analysis were performed. These studies show the feasibility of standardization, as there is a large overlap in requirements. We see that the SEMI P10 standard needs enhancements. A format supporting the standard is required, and XML offers the ability to describe P10 in a flexible way. Beyond using XML for P10, the semantics of the mask order should also be addressed. A system design and requirements for a reference implementation of a P10-based management system are presented, covering a mechanism for evolution and version management and a design for P10 editing and data validation.

  19. Building an Interoperability Test System for Electric Vehicle Chargers Based on ISO/IEC 15118 and IEC 61850 Standards

    Directory of Open Access Journals (Sweden)

    Minho Shin

    2016-05-01

    Full Text Available The electric vehicle market is rapidly growing due to its environmental friendliness and governmental support. As electric vehicles are powered by electricity, the interoperability between the vehicles and the chargers made by multiple vendors is crucial for the success of the technology. Relevant standards are being published, but the methods for conformance testing need to be developed. In this paper, we present our conformance test system for the electric vehicle charger in accordance with the standards ISO/IEC 15118, IEC 61851 and IEC 61850-90-8. Our test system leverages the TTCN-3 framework for its flexibility and productivity. We evaluate the test system by lab tests with two reference chargers that we built. We also present the test results in two international testival events for the ISO/IEC 15118 interoperability. We confirmed that our test system is robust, efficient and practical.

  20. Building an Interoperability Test System for Electric Vehicle Chargers Based on ISO/IEC 15118 and IEC 61850 Standards

    OpenAIRE

    Minho Shin; Hwimin Kim; Hyoseop Kim; Hyuksoo Jang

    2016-01-01

    The electric vehicle market is rapidly growing due to its environmental friendliness and governmental support. As electric vehicles are powered by electricity, the interoperability between the vehicles and the chargers made by multiple vendors is crucial for the success of the technology. Relevant standards are being published, but the methods for conformance testing need to be developed. In this paper, we present our conformance test system for the electric vehicle charger in accordance with...

  1. Device interoperability and authentication for telemedical appliance based on the ISO/IEEE 11073 Personal Health Device (PHD) Standards.

    Science.gov (United States)

    Caranguian, Luther Paul R; Pancho-Festin, Susan; Sison, Luis G

    2012-01-01

    In this study, we focused on the interoperability and authentication of medical devices in the context of telemedical systems. A recent standard called the ISO/IEEE 11073 Personal Health Device (X73-PHD) Standards addresses the device interoperability problem by defining common protocols for the agent (medical device) and manager (appliance) interface. The X73-PHD standard, however, has not addressed the security and authentication of medical devices, which is important in establishing the integrity of a telemedical system. We have designed and implemented a security policy within the X73-PHD standards. The policy enables device authentication using asymmetric-key cryptography with the RSA algorithm as the digital signature scheme. We used two approaches for performing the digital signatures: direct software implementation and use of embedded security modules (ESM). The two approaches were evaluated and compared in terms of execution time and memory requirements. For standard 2048-bit RSA, the ESM calculates digital signatures in only 12% of the time required by the direct implementation. Moreover, the analysis shows that the ESM offers additional security advantages, such as secure storage of keys, compared to the direct implementation. Interoperability with other systems was verified by testing the system with LNI Healthlink, a manager software that implements the X73-PHD standard. Lastly, a security analysis was done; the system's response to common attacks on authentication systems was analyzed, and several measures were implemented to protect the system against them.
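
    The digital-signature scheme described here (RSA-based device authentication) can be sketched in a few lines with the Python `cryptography` package; this is a generic illustration of 2048-bit RSA signing and verification, not the authors' X73-PHD security policy or their ESM implementation.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Generate a 2048-bit key pair; on a device the private key would live in
# secure storage (e.g., an embedded security module) rather than in memory.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# The agent (medical device) signs a message, e.g. an association request ...
message = b"X73-PHD association request from device 1234"
signature = private_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

# ... and the manager verifies it with the device's public key;
# verify() raises InvalidSignature if the check fails.
public_key.verify(signature, message, padding.PKCS1v15(), hashes.SHA256())
print("signature verified")
```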

  2. Parallel mesh management using interoperable tools.

    Energy Technology Data Exchange (ETDEWEB)

    Tautges, Timothy James (Argonne National Laboratory); Devine, Karen Dragon

    2010-10-01

    This presentation included a discussion of challenges arising in parallel mesh management, as well as demonstrated solutions. It also described the broad range of software for mesh management and modification developed by the Interoperable Technologies for Advanced Petascale Simulations (ITAPS) team, and highlighted applications successfully using the ITAPS tool suite.

  3. Standard CGIF interoperability in Amine

    OpenAIRE

    Kabbaj, A.; Launders, I.; Polovina, S.

    2009-01-01

    The adoption of standard CGIF by CG tools will enable interoperability between them to be achieved, and in turn lead to the interoperability between CG tools and other tools. The integration of ISO Common Logic’s standard CGIF notation in the Amine platform is presented. It also describes the first steps towards full interoperability between the Amine CG tool (through its Synergy component) and CharGer, a representative CG tool that supports similar interoperability and for process (or ‘activ...

  4. A Smart Home Center Platform Solution Based on Smart Mirror

    OpenAIRE

    Deng Xibo; Peng Zhiran; Wu Wenquan

    2017-01-01

    With the popularization of the concept of the smart home, people have raised their expectations of the smart living experience. A smart home center platform solution is put forward in order to address the intelligent interoperability and information integration of the smart home, enabling people to have a more intelligent and convenient life experience. This platform center is achieved through the Smart Mirror. The Smart Mirror refers to a piece of smart furniture, based on the traditional concept of mi...

  5. An observational study of the relationship between meaningful use-based electronic health information exchange, interoperability, and medication reconciliation capabilities.

    Science.gov (United States)

    Elysee, Gerald; Herrin, Jeph; Horwitz, Leora I

    2017-10-01

    Stagnation in hospitals' adoption of data integration functionalities, coupled with a reduction in the number of operational health information exchanges, could become a significant impediment to hospitals' adoption of 3 critical capabilities: electronic health information exchange, interoperability, and medication reconciliation, in which electronic systems are used to assist with resolving medication discrepancies and improving patient safety. Against this backdrop, we assessed the relationships between the 3 capabilities. We conducted an observational study applying the partial least squares-structural equation modeling technique to 27 variables obtained from the 2013 American Hospital Association annual survey Information Technology (IT) supplement, which describes health IT capabilities. We included 1330 hospitals. In confirmatory factor analysis, out of the 27 variables, 15 achieved loading values greater than 0.548 at significant P values, and these served as the building blocks of the 3 capabilities. Subsequent path analysis showed a significant, positive, and cyclic relationship between the capabilities, in that decreases in the hospitals' adoption of one would lead to decreases in the adoption of the others. These results show that capability for high quality medication reconciliation may be impeded by lagging adoption of interoperability and health information exchange capabilities. Policies focused on improving one or more of these capabilities may have ancillary benefits.

  6. Interoperable End-to-End Remote Patient Monitoring Platform based on IEEE 11073 PHD and ZigBee Health Care Profile.

    Science.gov (United States)

    Clarke, Malcolm; de Folter, Joost; Verma, Vivek; Gokalp, Hulya

    2017-08-07

    This paper describes the implementation of an end-to-end remote monitoring platform based on the IEEE 11073 standards for Personal Health Devices (PHD). It provides an overview of the concepts and approaches and describes how the standard has been optimized for small devices with limited processor, memory and power resources that use short-range wireless technology. It explains aspects of IEEE 11073, including the Domain Information Model, state model and nomenclature, and how these support its plug-and-play architecture. It shows how these aspects underpin a much larger ecosystem of interoperable devices and systems that include IHE PCD-01, HL7 and Bluetooth LE medical devices, and the relationship to the Continua Guidelines, advocating the adoption of data standards and nomenclature to support semantic interoperability between health and ambient assisted living (AAL) in future platforms. The paper further describes the adaptations that have been made in order to implement the standard on the ZigBee Health Care Profile and the experiences of implementing an end-to-end platform that has been deployed to frail elderly patients with chronic disease(s) and patients with diabetes.
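
    A minimal sketch of the plug-and-play idea touched on above: a manager dispatching agent measurements by nomenclature code. The structure, the handler registry and the nomenclature string used here are illustrative placeholders rather than the standard's actual data model or reference identifiers.

```python
# Minimal sketch (assumptions, not the paper's implementation): a PHD-style
# manager dispatches agent observations by nomenclature code, approximating the
# plug-and-play behaviour described above. The code string and the Observation
# structure are placeholders, not the IEEE 11073 object model.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Observation:
    nomenclature_code: str   # in a real agent, an MDC term code
    value: float
    unit: str

HANDLERS: Dict[str, Callable[[Observation], None]] = {}

def handler(code: str):
    """Register a manager-side handler for one nomenclature code."""
    def register(fn: Callable[[Observation], None]):
        HANDLERS[code] = fn
        return fn
    return register

@handler("MDC_PULS_OXIM_SAT_O2")   # placeholder code name for an SpO2 metric
def on_spo2(obs: Observation) -> None:
    print(f"SpO2 reading: {obs.value} {obs.unit}")

def manager_receive(obs: Observation) -> None:
    # Unknown codes are ignored rather than breaking the association, one way
    # the tolerance needed for plug-and-play can be approximated.
    HANDLERS.get(obs.nomenclature_code, lambda o: None)(obs)

manager_receive(Observation("MDC_PULS_OXIM_SAT_O2", 97.0, "%"))
```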

  7. Recent ARC developments: Through modularity to interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Smirnova, O; Cameron, D; Ellert, M; Groenager, M; Johansson, D; Kleist, J [NDGF, Kastruplundsgade 22, DK-2770 Kastrup (Denmark); Dobe, P; Joenemo, J; Konya, B [Lund University, Experimental High Energy Physics, Institute of Physics, Box 118, SE-22100 Lund (Sweden); Fraagaat, T; Konstantinov, A; Nilsen, J K; Saada, F Ould; Qiang, W; Read, A [University of Oslo, Department of Physics, P. O. Box 1048, Blindern, N-0316 Oslo (Norway); Kocan, M [Pavol Jozef Safarik University, Faculty of Science, Jesenna 5, SK-04000 Kosice (Slovakia); Marton, I; Nagy, Zs [NIIF/HUNGARNET, Victor Hugo 18-22, H-1132 Budapest (Hungary); Moeller, S [University of Luebeck, Inst. Of Neuro- and Bioinformatics, Ratzeburger Allee 160, D-23538 Luebeck (Germany); Mohn, B, E-mail: oxana.smirnova@hep.lu.s [Uppsala University, Department of Physics and Astronomy, Div. of Nuclear and Particle Physics, Box 535, SE-75121 Uppsala (Sweden)

    2010-04-01

    The Advanced Resource Connector (ARC) middleware introduced by NorduGrid is one of the basic Grid solutions used by scientists worldwide. While being well-proven in daily use by a wide variety of scientific applications at large-scale infrastructures like the Nordic DataGrid Facility (NDGF) and smaller scale projects, production ARC of today is still largely based on conventional Grid technologies and custom interfaces introduced a decade ago. In order to guarantee sustainability, true cross-system portability and standards-compliance based interoperability, the ARC community undertakes a massive effort of implementing modular Web Service (WS) approach into the middleware. With support from the EU KnowARC project, new components were introduced and the existing key ARC services got extended with WS technology based standard-compliant interfaces following a service-oriented architecture. Such components include the hosting environment framework, the resource-coupled execution service, the re-engineered client library, the self-healing storage solution and the peer-to-peer information system, to name a few. Gradual introduction of these new services and client tools into the production middleware releases is carried out together with NDGF and thus ensures a smooth transition to the next generation Grid middleware. Standard interfaces and modularity of the new component design are essential for ARC contributions to the planned Universal Middleware Distribution of the European Grid Initiative.

  8. Recent ARC developments: Through modularity to interoperability

    International Nuclear Information System (INIS)

    Smirnova, O; Cameron, D; Ellert, M; Groenager, M; Johansson, D; Kleist, J; Dobe, P; Joenemo, J; Konya, B; Fraagaat, T; Konstantinov, A; Nilsen, J K; Saada, F Ould; Qiang, W; Read, A; Kocan, M; Marton, I; Nagy, Zs; Moeller, S; Mohn, B

    2010-01-01

    The Advanced Resource Connector (ARC) middleware introduced by NorduGrid is one of the basic Grid solutions used by scientists worldwide. While being well-proven in daily use by a wide variety of scientific applications at large-scale infrastructures like the Nordic DataGrid Facility (NDGF) and smaller scale projects, production ARC of today is still largely based on conventional Grid technologies and custom interfaces introduced a decade ago. In order to guarantee sustainability, true cross-system portability and standards-compliance based interoperability, the ARC community undertakes a massive effort of implementing modular Web Service (WS) approach into the middleware. With support from the EU KnowARC project, new components were introduced and the existing key ARC services got extended with WS technology based standard-compliant interfaces following a service-oriented architecture. Such components include the hosting environment framework, the resource-coupled execution service, the re-engineered client library, the self-healing storage solution and the peer-to-peer information system, to name a few. Gradual introduction of these new services and client tools into the production middleware releases is carried out together with NDGF and thus ensures a smooth transition to the next generation Grid middleware. Standard interfaces and modularity of the new component design are essential for ARC contributions to the planned Universal Middleware Distribution of the European Grid Initiative.

  9. Defining Inter-Cloud Architecture for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.J.; Laat, C. de

    2012-01-01

    This paper presents on-going research to develop the Inter-Cloud Architecture that should address problems in multi-provider multi-domain heterogeneous Cloud based applications integration and interoperability, including integration and interoperability with legacy infrastructure services. Cloud

  10. Defining inter-cloud architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.; de Laat, C.; Zimmermann, W.; Lee, Y.W.; Demchenko, Y.

    2012-01-01

    This paper presents an on-going research to develop the Inter-Cloud Architecture, which addresses the architectural problems in multi-provider multi-domain heterogeneous cloud based applications integration and interoperability, including integration and interoperability with legacy infrastructure

  11. Interoperability does matter

    Directory of Open Access Journals (Sweden)

    Manfred Goepel

    2006-04-01

    Full Text Available In companies, the historically developed IT systems are mostly application islands. They always produce good results if the system's requirements and surroundings are not changed and as long as a system interface is not needed. With the ever increasing dynamics and globalization of the market, however, these IT islands are certain to collapse. Interoperability (IO) is the need of the hour, assuming the integration of users, data, applications and processes. In the following, important IO enablers such as ETL, EAI, and SOA will be examined on the basis of practicability. It will be shown that especially SOA produces a surge of interoperability that could rightly be referred to as IT evolution.

  12. Architectures for the Development of the National Interoperability Framework in Romania

    Directory of Open Access Journals (Sweden)

    Codrin-Florentin NISIOIU

    2015-10-01

    Full Text Available The authors of the Digital Agenda consider that Europe does not take full advantage of interoperability. They believe that we need effective interoperability between IT products and services to build a truly Digital Society. The Digital Agenda can only be effective if all the elements and applications are interoperable and based on open standards and platforms. In this context, I propose in this article a specific architecture for developing the Romanian National Interoperability Framework.

  13. Maturity model for enterprise interoperability

    Science.gov (United States)

    Guédria, Wided; Naudet, Yannick; Chen, David

    2015-01-01

    Historically, progress occurs when entities communicate, share information and together create something that no one individually could do alone. Moving beyond people to machines and systems, interoperability is becoming a key factor of success in all domains. In particular, interoperability has become a challenge for enterprises, to exploit market opportunities, to meet their own objectives of cooperation or simply to survive in a growing competitive world where the networked enterprise is becoming a standard. Within this context, many research works have been conducted over the past few years and enterprise interoperability has become an important area of research, ensuring the competitiveness and growth of European enterprises. Among others, enterprises have to control their interoperability strategy and enhance their ability to interoperate. This is the purpose of the interoperability assessment. Assessing interoperability maturity allows a company to know its strengths and weaknesses in terms of interoperability with its current and potential partners, and to prioritise actions for improvement. The objective of this paper is to define a maturity model for enterprise interoperability that takes into account existing maturity models while extending the coverage of the interoperability domain. The assessment methodology is also presented. Both are demonstrated with a real case study.

  14. Intercloud Architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.J.; Laat, C. de

    2012-01-01

    This paper presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses problems in multi-provider multi-domain heterogeneous cloud based infrastructure services and applications integration and interoperability. The paper refers to existing standards in Cloud

  15. Intercloud architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.; de Laat, C.

    2012-01-01

    This paper presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses problems in multi-provider multi-domain heterogeneous cloud based infrastructure services and applications integration and interoperability. The paper refers to existing standards in Cloud

  16. BIM Interoperability Limitations: Australian and Malaysian Rail Projects

    Directory of Open Access Journals (Sweden)

    Kenley Russell

    2016-01-01

    Full Text Available Building information modelling (BIM) is defined as a process involving the generation and management of digital representations of the physical and functional characteristics of a facility. The purpose of interoperability in integrated or “open” BIM is to facilitate the information exchange between different digital systems, models and tools. There have been efforts towards data interoperability with the development of open-source standards and object-oriented models, such as industry foundation classes (IFC) for vertical infrastructure. However, the lack of open data standards for the information exchange for horizontal infrastructure limits the adoption and effectiveness of integrated BIM. The paper outlines two interoperability issues for the construction of rail infrastructure. The issues are presented in two case study reports, one from Australia and one from Malaysia. Each case study includes: a description of the project, the application of BIM in the project, a discussion of the promised BIM interoperability solution, plus the identification of the unresolved lack of interoperability for horizontal infrastructure project management. The Moreton Bay Rail project in Australia introduces general software interoperability issues. The Light Rail Extension project in Kuala Lumpur outlines an example of the integration problems related to two different location data structures. The paper highlights how the continuing lack of data interoperability limits the utilisation of integrated BIM for horizontal infrastructure rail projects.

  17. Proposing hierarchy-similarity based access control framework: A multilevel Electronic Health Record data sharing approach for interoperable environment

    Directory of Open Access Journals (Sweden)

    Shalini Bhartiya

    2017-10-01

    Full Text Available Interoperability in the healthcare environment deals with the sharing of a patient's Electronic Health Records (EHR) with fellow professionals across as well as within departments or organizations. The healthcare environment experiences frequent shifting of doctors and paramedical staff across as well as within departments or hospitals. The system exhibits dynamic attributes of users and resources managed through access control policies defined for that environment. Rules obtained on merging such policies often generate policy conflicts, thereby resulting in undue data leakages to unintended users. This paper proposes an access control framework that applies a Hierarchy Similarity Analyzer (HSA) to the policies that need to be merged. It calculates a Security_Level (SL) and assigns it to the users sharing data. The SL determines the authorized amount of data that can be shared on successful collaboration of two policies. The proposed framework allows integration of independent policies and identifies the possible policy conflicts arising due to attribute disparities in the defined rules. The framework is implemented on XACML policies and compared with other access models designed using centralized and decentralized approaches. Conditional constraints and properties that generate policy conflicts, as prevalent in such policies, are defined.
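
    A hypothetical sketch of the idea behind deriving a Security_Level from the attribute overlap of two policies being merged. The policy structure and the scoring scheme below are invented for illustration; they are not the paper's HSA algorithm or its XACML implementation.

```python
# Hypothetical sketch of the Hierarchy Similarity Analyzer (HSA) idea: derive a
# Security_Level (SL) from the attribute overlap of two access-control policies
# being merged. The policy representation and scoring are invented examples.
def security_level(policy_a: dict, policy_b: dict) -> float:
    """Return an SL in [0, 1]; 1.0 means identical attribute sets."""
    attrs_a = {(k, v) for k, v in policy_a.items()}
    attrs_b = {(k, v) for k, v in policy_b.items()}
    union = attrs_a | attrs_b
    return len(attrs_a & attrs_b) / len(union) if union else 1.0

dept_policy = {"role": "physician", "department": "cardiology", "action": "read"}
hospital_policy = {"role": "physician", "department": "any", "action": "read"}

sl = security_level(dept_policy, hospital_policy)
# A merge rule might then release only the EHR sections whose sensitivity
# rating does not exceed the computed SL.
print(f"Security_Level = {sl:.2f}")
```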

  18. BENEFITS OF LINKED DATA FOR INTEROPERABILITY DURING CRISIS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    R. Roller

    2015-08-01

    Full Text Available Floods represent a permanent risk to the Netherlands in general and to its power supply in particular. Data sharing is essential within this crisis scenario, as a power cut affects a great variety of interdependent sectors. Currently used data sharing systems have been shown to hamper interoperability between stakeholders since they lack flexibility and there is no consensus on term definitions and interpretations. The study presented in this paper addresses these challenges by proposing a new data sharing solution based on Linked Data, a method of interlinking data points in a structured way on the web. A conceptual model for two data sharing parties in a flood-caused power cut crisis management scenario was developed, to which relevant data were linked. The analysis revealed that the presented data sharing solution burdens its users with extra costs in the short run, but saves resources in the long run by overcoming the interoperability problems of the legacy systems. The more stakeholders adopt Linked Data, the stronger its benefits for data sharing will become.

  19. Inter-operability

    International Nuclear Information System (INIS)

    Plaziat, J.F.; Moulin, P.; Van Beurden, R.; Ballet, E.

    2005-01-01

    Building an internal gas market implies establishing harmonized rules for cross border trading between operators. To that effect, the European association EASEE-gas is carrying out standards and procedures, commonly called 'inter-operability'. Set up in 2002, the Association brings together all segments of the gas industry: producers, transporters, distributors, traders and shippers, suppliers, consumers and service providers. This workshop presents the latest status on issues such as barriers to gas trade in Europe, rules and procedures under preparation by EASEE-gas, and the implementation schedule of these rules by operators. This article gathers 5 presentations about this topic given at the gas conference

  20. Unmanned Ground Vehicle (UGV) Interoperability Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The UGV Interoperability Lab provides the capability to verify vendor conformance against government-defined interoperability profiles (IOPs). This capability allows...

  1. Cloud portability and interoperability issues and current trends

    CERN Document Server

    Di Martino, Beniamino; Esposito, Antonio

    2015-01-01

    This book offers readers a quick, comprehensive and up-to-date overview of the most important methodologies, technologies, APIs and standards related to the portability and interoperability of cloud applications and services, illustrated by a number of use cases representing a variety of interoperability and portability scenarios. The lack of portability and interoperability between cloud platforms at different service levels is the main issue affecting cloud-based services today. The brokering, negotiation, management, monitoring and reconfiguration of cloud resources are challenging tasks

  2. Augmenting interoperability across repositories architectural ideas

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    The aDORe digital repository architecture designed and implemented by the Los Alamos Research Library is fully standards-based and highly modular, with the various components of the architecture interacting in a protocol-driven manner. Although aDORe was designed for use in the context of the Los Alamos Library, its modular and standards-based design has led to interesting insights regarding possible new levels of interoperability in a federation of heterogeneous repositories. The presentation will discuss these insights, and will illustrate that attractive federations of repositories can be built by introducing rather basic interoperability requirements. The presentation will also show that, once these requirements are met, a powerful service framework that overlays the federation can emerge.

  3. Evaluation of Enterprise Architecture Interoperability

    National Research Council Canada - National Science Library

    Jamison, Theresa A; Niska, Brice T; Layman, Phillip A; Whitney, Steven P

    2005-01-01

    ...), which describes these architectures. The purpose of this project, suggested by Air Force Space Command, was to examine the value of existing analytical tools in making an interoperability assessment of individual enterprises, as well...

  4. IPSec VPN Capabilities and Interoperability

    Science.gov (United States)

    2006-07-01

    IPSec VPN services include Juniper (formerly Netscreen) and Cisco. Of interest is the interoperability of setting up an IPSec VPN tunnel with a Juniper...vendor implementations of IPSec VPN tunneling in an environment where both vendors play a role. The second objective was to determine some

  5. Bringing nature-based solutions to scale

    Science.gov (United States)

    Jongman, Brenden; Lange, Glenn-Marie; Balog, Simone; van Wesenbeeck, Bregje

    2017-04-01

    Coastal communities in developing countries are highly exposed and vulnerable to coastal flood risk, and are likely to suffer from climate change induced changes in risk. Over the last decade, strong evidence has surfaced that nature-based solutions or ecosystem-based approaches are efficient and effective alternatives for flood risk reduction and climate change adaptation. In developing countries, numerous projects have therefore been implemented, often driven by international donors and NGOs. Some of these projects have been successful in reducing risk while improving environmental and socioeconomic conditions. However, the feasibility assessment, design and implementation of nature-based solutions is a multifaceted process, which needs to be well-understood before such solutions can be effectively implemented as an addition or alternative to grey infrastructure. This process has not always been followed. As a result, many projects have failed to deliver positive outcomes. The international community therefore has a challenge in bringing nature-based solutions to scale in an effective way. In this presentation, we will present best practice guidelines on nature-based solution implementation that are currently being discussed by the international community. Furthermore, we will present the alpha version of a new web platform being developed by the World Bank that will serve as a much-needed central repository for project information on nature-based solutions, and that will host actionable implementation guidelines. The presentation will also serve as an invitation to the scientific community to share their experience and lessons learned, and contribute to the outlining of best practice guidance.

  6. The Benefit of Ontologies for Interoperability of CCIS. (Easy, Quick and Cheap Solutions are Impossible, if Semantics of CCIS are Affected.)

    National Research Council Canada - National Science Library

    Wunder, Michael A

    2003-01-01

    ... conditions of the target database. These algorithms are based on ontologies, which are the formal descriptions of concepts and relationships of objects that are relevant for a domain. They describe how we see the world we are looking at. No changes to existing CCIS are necessary -- they may remain as they are.

  7. River Basin Standards Interoperability Pilot

    Science.gov (United States)

    Pesquer, Lluís; Masó, Joan; Stasch, Christoph

    2016-04-01

    There are many water information resources and tools in Europe to be applied in river basin management, but fragmentation and a lack of coordination between countries still exist. The European Commission and the member states have financed several research and innovation projects in support of the Water Framework Directive. Only a few of them are using the recently emerging hydrological standards, such as OGC WaterML 2.0. WaterInnEU is a Horizon 2020 project focused on creating a marketplace to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to water and to establish suitable conditions for new market opportunities based on these offerings. One of WaterInnEU's main goals is to assess the level of standardization and interoperability of these outcomes as a mechanism to integrate ICT-based tools, incorporate open data platforms and generate a palette of interchangeable components that are able to use the water data emerging from the recently proposed open data sharing processes and data models stimulated by initiatives such as the INSPIRE directive. As part of the standardization and interoperability activities in the project, the authors are designing an experiment (RIBASE, the present work) to demonstrate how current ICT-based tools and water data can work in combination with geospatial web services in the Scheldt river basin. The main structure of this experiment, which is the core of the present work, is composed of the following steps:
    - Extraction of information from river gauge data in OGC WaterML 2.0 format using SOS services (preferably compliant with the OGC SOS 2.0 Hydrology Profile Best Practice).
    - Modelling of floods using WPS 2.0, with WaterML 2.0 data and weather forecast models as input.
    - Evaluation of the applicability of Sensor Notification Services in water emergencies.
    - Open distribution of the input and output data as OGC web services (WaterML / WCS / WFS) and with visualization utilities (WMS). The architecture
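
    A minimal sketch of the first step in the list above: retrieving river-gauge observations as WaterML 2.0 from an OGC SOS 2.0 endpoint via a KVP GetObservation request. The endpoint URL and the offering and observedProperty identifiers are hypothetical; a real service would advertise them in its GetCapabilities response.

```python
# Minimal sketch: requesting river-gauge observations as WaterML 2.0 from an
# OGC SOS 2.0 endpoint using a KVP GetObservation request. The endpoint,
# offering and observedProperty values below are hypothetical. Requires the
# third-party "requests" package.
import requests

SOS_ENDPOINT = "https://example.org/sos"  # hypothetical service

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "scheldt_gauge_001",                          # hypothetical
    "observedProperty": "http://example.org/def/waterlevel",  # hypothetical
    "temporalFilter": "om:phenomenonTime,2012-11-01T00:00:00Z/2012-11-02T00:00:00Z",
    "responseFormat": "http://www.opengis.net/waterml/2.0",
}

response = requests.get(SOS_ENDPOINT, params=params, timeout=30)
response.raise_for_status()
# The body is a WaterML 2.0 (XML) document that downstream WPS flood models or
# visualization clients (WMS/WFS) could consume.
print(response.text[:500])
```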

  8. Solving the interoperability challenge of a distributed complex patient guidance system: a data integrator based on HL7's Virtual Medical Record standard.

    Science.gov (United States)

    Marcos, Carlos; González-Ferrer, Arturo; Peleg, Mor; Cavero, Carlos

    2015-05-01

    We show how the HL7 Virtual Medical Record (vMR) standard can be used to design and implement a data integrator (DI) component that collects patient information from heterogeneous sources and stores it into a personal health record, from which it can then retrieve data. Our working hypothesis is that the HL7 vMR standard in its release 1 version can properly capture the semantics needed to drive evidence-based clinical decision support systems. To achieve seamless communication between the personal health record and heterogeneous data consumers, we used a three-pronged approach. First, the choice of the HL7 vMR as a message model for all components, accompanied by the use of medical vocabularies, eases their semantic interoperability. Second, the DI follows a service-oriented approach to provide access to system components. Third, an XML database provides the data layer. Results: The DI supports the requirements of a guideline-based clinical decision support system implemented in two clinical domains and settings, ensuring reliable and secure access, high performance, and simplicity of integration, while complying with standards for the storage and processing of patient information needed for decision support and analytics. This was tested within the framework of a multinational project (www.mobiguide-project.eu) aimed at developing a ubiquitous patient guidance system (PGS). The vMR model with its extension mechanism is demonstrated to be effective for data integration and communication within a distributed PGS implemented for two clinical domains across different healthcare settings in two nations.
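
    The sketch below illustrates the data-integrator idea in simplified form: a reading from one heterogeneous source is normalized into a vMR-like XML observation before being stored in an XML data layer. The element names and the terminology placeholder are simplified inventions, not the actual HL7 vMR release 1 schema.

```python
# Simplified sketch of the data-integrator (DI) idea described above. Element
# and attribute names are placeholders, not the actual HL7 vMR schema.
import xml.etree.ElementTree as ET

def to_vmr_like_observation(source_record: dict) -> ET.Element:
    obs = ET.Element("observationResult")
    focus = ET.SubElement(obs, "observationFocus")
    focus.set("code", source_record["code"])       # in a real DI, a SNOMED CT / LOINC code
    focus.set("codeSystem", source_record["system"])
    value = ET.SubElement(obs, "observationValue")
    value.set("value", str(source_record["value"]))
    value.set("unit", source_record["unit"])
    time = ET.SubElement(obs, "observationEventTime")
    time.set("low", source_record["timestamp"])
    return obs

# A blood-glucose reading as it might arrive from one proprietary source;
# the code value here is an illustrative local placeholder.
reading = {"code": "GLUCOSE_MASS_CONC", "system": "LOCAL-DEMO",
           "value": 6.1, "unit": "mmol/L", "timestamp": "20150501T081500"}

xml_doc = ET.tostring(to_vmr_like_observation(reading), encoding="unicode")
print(xml_doc)  # this string would be inserted into the XML database
```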

  9. OR.NET: a service-oriented architecture for safe and dynamic medical device interoperability.

    Science.gov (United States)

    Kasparick, Martin; Schmitz, Malte; Andersen, Björn; Rockstroh, Max; Franke, Stefan; Schlichting, Stefan; Golatowski, Frank; Timmermann, Dirk

    2018-02-23

    Modern surgical departments are characterized by a high degree of automation supporting complex procedures. It recently became apparent that integrated operating rooms can improve the quality of care, simplify clinical workflows, and mitigate equipment-related incidents and human errors. Particularly using computer assistance based on data from integrated surgical devices is a promising opportunity. However, the lack of manufacturer-independent interoperability often prevents the deployment of collaborative assistive systems. The German flagship project OR.NET has therefore developed, implemented, validated, and standardized concepts for open medical device interoperability. This paper describes the universal OR.NET interoperability concept enabling a safe and dynamic manufacturer-independent interconnection of point-of-care (PoC) medical devices in the operating room and the whole clinic. It is based on a protocol specifically addressing the requirements of device-to-device communication, yet also provides solutions for connecting the clinical information technology (IT) infrastructure. We present the concept of a service-oriented medical device architecture (SOMDA) as well as an introduction to the technical specification implementing the SOMDA paradigm, currently being standardized within the IEEE 11073 service-oriented device connectivity (SDC) series. In addition, the Session concept is introduced as a key enabler for safe device interconnection in highly dynamic ensembles of networked medical devices; and finally, some security aspects of a SOMDA are discussed.
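
    A conceptual sketch of the service-oriented medical device architecture (SOMDA) pattern described above: devices expose their capabilities as services that a consumer can discover and subscribe to dynamically. All class and method names are invented for illustration; they are not the IEEE 11073 SDC service interfaces.

```python
# Conceptual sketch of the SOMDA idea: networked devices expose get/subscribe
# services, and a consumer discovers and subscribes to them at runtime.
# All names are invented; this is not the IEEE 11073 SDC API.
from typing import Callable, Dict, List

class DeviceService:
    def __init__(self, name: str, metrics: Dict[str, float]):
        self.name = name
        self._metrics = metrics
        self._subscribers: List[Callable[[str, float], None]] = []

    def get_metric(self, metric: str) -> float:
        return self._metrics[metric]

    def subscribe(self, callback: Callable[[str, float], None]) -> None:
        self._subscribers.append(callback)

    def publish(self, metric: str, value: float) -> None:
        self._metrics[metric] = value
        for cb in self._subscribers:
            cb(metric, value)

# A tiny in-process "discovery" registry standing in for network-based discovery.
registry: Dict[str, DeviceService] = {}
registry["ventilator"] = DeviceService("ventilator", {"respiration_rate": 14.0})

# A surgical assistance system subscribes to any ventilator it discovers.
ventilator = registry["ventilator"]
ventilator.subscribe(lambda m, v: print(f"assistance system sees {m} = {v}"))
ventilator.publish("respiration_rate", 16.0)
```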

  10. Interoperability and HealthGRID.

    Science.gov (United States)

    Bescos, C; Schmitt, D; Kass, J; García-Barbero, M; Kantchev, P

    2005-01-01

    GRID technology, with initiatives like the GGF, will have the potential to allow both competition and interoperability not only among applications and toolkits, but also among implementations of key services. The pyramid of eHealth interoperability should be achieved from standards in communication and data security, storage and processing, to the policy initiatives, including organizational protocols, financing procedures, and the legal framework. The open challenges for GRID use in clinical fields illustrate the potential of combining grid technologies with medical routine into a wider interoperable framework. The Telemedicine Alliance is a consortium (ESA, WHO and ITU), initiated in 2002 to build a vision for the provision of eHealth to European citizens by 2010. After a survey with more than 50 expert interviews, interoperability was identified as the main showstopper to eHealth implementation. There are already several groups and organizations contributing to standardization. The TM-Alliance is supporting the "e-Health Standardization Coordination Group" (eHSCG). Now, in the design and development phase of GRID technology in health, is the right moment to act with the aim of achieving an interoperable and open framework. The health area should benefit from the initiatives started at the GGF in terms of global architecture and service definitions, as well as from the security and other web services applications developed under the Internet umbrella. There is a risk that existing important results of the standardization efforts in this area are not taken up simply because they are not always known.

  11. Modelling and simulation-based support for interoperability exercises in preparation of 2010 FIFA World Cup South Africa

    CSIR Research Space (South Africa)

    Le Roux, WH

    2008-11-01

    Full Text Available entities, both modelled and created by the injection service through the use of mock-up terminals. The gateway can route multiple connections, whilst at the same time, translate between different protocols. Furthermore, filtering can be based on spatial...

  12. A Smart Home Center Platform Solution Based on Smart Mirror

    Directory of Open Access Journals (Sweden)

    Deng Xibo

    2017-01-01

    Full Text Available With the popularization of the concept of the smart home, people have raised their requirements on the experience of smart living. A smart home platform center solution is put forward in order to address the intelligent interoperability and information integration of the smart home, which enables people to have a more intelligent and convenient life experience. This platform center is achieved through the Smart Mirror. The Smart Mirror is a piece of smart furniture that builds on the traditional concept of a mirror, combining a Raspberry Pi, the one-way mirror imaging principle, a touch-enabled design, and voice and video interaction. The Smart Mirror can provide a range of intelligent experiences for residents, such as controlling all the intelligent furniture through the Smart Mirror; accessing and displaying the weather, time, news and other daily information; monitoring the home environment; and remote interconnection operation.
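
    A toy sketch of the platform-center role described above: the mirror acts as a hub that routes commands to whichever smart-home devices have registered with it. The device names and command vocabulary are invented for illustration and are not taken from the paper.

```python
# Toy sketch of the platform-center idea: the Smart Mirror acts as a hub that
# routes commands to registered smart-home devices. Names are invented examples.
from typing import Callable, Dict

class SmartMirrorHub:
    def __init__(self) -> None:
        self._devices: Dict[str, Callable[[str], str]] = {}

    def register(self, device_name: str, handler: Callable[[str], str]) -> None:
        """A device (lamp, thermostat, sensor...) registers a command handler."""
        self._devices[device_name] = handler

    def command(self, device_name: str, action: str) -> str:
        if device_name not in self._devices:
            return f"unknown device: {device_name}"
        return self._devices[device_name](action)

hub = SmartMirrorHub()
hub.register("living_room_lamp", lambda action: f"lamp -> {action}")
hub.register("thermostat", lambda action: f"thermostat set to {action}")

# Voice or touch input on the mirror would ultimately resolve to calls like:
print(hub.command("living_room_lamp", "on"))
print(hub.command("thermostat", "21C"))
```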

  13. U.K. MoD Land Open Systems Architecture and coalition interoperability with the U.S.

    Science.gov (United States)

    Pearson, Gavin; Kolodny, Mike

    2013-05-01

    The UK Land Open System Architecture (LOSA) is an open, service-based architecture for systems integration and interoperability in the land environment. It is being developed in order to deliver coherent and agile force elements at readiness to operations. LOSA affects planning, delivery and force generation, and supports Future Force 2020. This paper will review the objectives of LOSA and the progress made to date, before focusing on an approach to achieve plug-and-play interoperability of ISR assets. This approach has been proposed to the US DoD Coalition Warfare Program Office as a programme to develop a technology solution to achieve the goal of ISR interoperability. The approach leverages the efforts of the UK Land Open System Architecture (LOSA) and the US Terra Harvest (TH) programs. An open architecture approach is used to enable rapid integration and for disparate assets to operate autonomously, collaboratively and coherently; assets share situational awareness and cue other assets when a prescribed set of operational conditions is met. The objective of the interoperability programme is to develop a common lexicon and coherent approach to collaborative operation and information release.

  14. Dynamic Business Networks: A Headache for Sustainable Systems Interoperability

    Science.gov (United States)

    Agostinho, Carlos; Jardim-Goncalves, Ricardo

    Collaborative networked environments emerged with the spread of the internet, contributing to overcoming past communication barriers and establishing interoperability as an essential property. When it is achieved seamlessly, efficiency is increased across the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with their different partners or, in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are only defined once, and the morphisms that represent them are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to answer new customer requirements. This paper draws concepts from complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.
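
    A small sketch of the contrast drawn above: instead of hardcoding partner-specific morphisms, the field mapping is kept as data so it can be replaced at runtime when a partner's model changes. The field names are invented examples, not part of any standard model.

```python
# Sketch: a declarative source->target field mapping applied at runtime, so a
# partner model change only updates mapping data, not integration code.
from typing import Dict

def translate(record: Dict[str, object], mapping: Dict[str, str]) -> Dict[str, object]:
    """Apply a declarative source->target field mapping to one record."""
    return {target: record[source] for source, target in mapping.items() if source in record}

# Mapping as it stands today for one partner...
partner_mapping = {"prodId": "product_id", "qty": "quantity"}
order = {"prodId": "A-17", "qty": 4, "colour": "red"}
print(translate(order, partner_mapping))   # {'product_id': 'A-17', 'quantity': 4}

# ...and after the partner adds a new attribute, only the mapping data changes.
partner_mapping["colour"] = "finish"
print(translate(order, partner_mapping))   # now also includes 'finish': 'red'
```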

  15. Information modeling for interoperable dimensional metrology

    CERN Document Server

    Zhao, Y; Brown, Robert; Xu, Xun

    2014-01-01

    This book analyzes interoperability issues in dimensional metrology systems and describes information modeling techniques. Coverage includes theory, techniques and key technologies, and explores new approaches for solving real-world interoperability problems.

  16. Assessing Schizophrenia with an Interoperable Architecture

    NARCIS (Netherlands)

    Emerencia, Ando; van der Krieke, Lian; Petkov, Nicolai; Aiello, Marco; Bouamrane, Matt-Mouley; Tao, Cui

    2011-01-01

    With the introduction of electronic personal health records and e-health applications spreading, interoperability concerns are of increasing importance to hospitals and care facilities. Interoperability between distributed and complex systems requires, among other things, compatible data formats.

  17. ARGOS policy brief on semantic interoperability.

    Science.gov (United States)

    Kalra, Dipak; Musen, Mark; Smith, Barry; Ceusters, Werner; De Moor, Georges

    2011-01-01

    Semantic interoperability is one of the priority themes of the ARGOS Trans-Atlantic Observatory. This topic represents a globally recognised challenge that must be addressed if electronic health records are to be shared among heterogeneous systems, and the information in them exploited to the maximum benefit of patients, professionals, health services, research, and industry. Progress in this multi-faceted challenge has been piecemeal, and valuable lessons have been learned, and approaches discovered, in Europe and in the US that can be shared and combined. Experts from both continents have met at three ARGOS workshops during 2010 and 2011 to share understanding of these issues and how they might be tackled collectively from both sides of the Atlantic. This policy brief summarises the problems and the reasons why they are important to tackle, and also why they are so difficult. It outlines the major areas of semantic innovation that exist and that are available to help address this challenge. It proposes a series of next steps that need to be championed on both sides of the Atlantic if further progress is to be made in sharing and analysing electronic health records meaningfully. Semantic interoperability requires the use of standards, not only for EHR data to be transferred and structurally mapped into a receiving repository, but also for the clinical content of the EHR to be interpreted in conformity with the original meanings intended by its authors. Wide-scale engagement with professional bodies, globally, is needed to develop these clinical information standards. Accurate and complete clinical documentation, faithful to the patient's situation, and interoperability between systems, require widespread and dependable access to published and maintained collections of coherent and quality-assured semantic resources, including models such as archetypes and templates that would (1) provide clinical context, (2) be mapped to interoperability standards for EHR data

  18. The interoperability force in the ERP field

    Science.gov (United States)

    Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon

    2015-04-01

    Enterprise resource planning (ERP) systems participate in interoperability projects and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To go about this, ERP systems have been first identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements have been identified from two perspectives: technological and business. The ERP field is evolving from classical ERP as information system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way of running business, and ERP systems are changing to adapt to the current stream of interoperability.

  19. Food product tracing technology capabilities and interoperability.

    Science.gov (United States)

    Bhatt, Tejas; Zhang, Jianrong Janet

    2013-12-01

    identify the supply-chain partner who provided the information prior to sharing this information with product tracing technology providers. The 9 traceability solution providers who agreed to participate in this project have their systems deployed in a wide range of sectors within the food industry including, but not limited to, livestock, dairy, produce, fruits, seafood, meat, and pork; as well as in pharmaceutical, automotive, retail, and other industries. Some have also been implemented across the globe including Canada, China, USA, Norway, and the EU, among others. This broad commercial use ensures that the findings of this work are applicable to a broad spectrum of the food system. Six of the 9 participants successfully completed the data entry phase of this test. To verify successful data entry for these 6, a demo or screenshots of the data set from each system's user interface was requested. Only 4 of the 6 were able to provide us with this evidence for verification. Of the 6 that completed data entry and moved on to the scenarios phase of the test, 5 were able to provide us with the responses to the scenarios. Time metrics were useful for evaluating the scalability and usability of each technology. Scalability was derived from the time it took to enter the nonstandardized data set into the system (ranges from 7 to 11 d). Usability was derived from the time it took to query the scenarios and provide the results (from a few hours to a week). The time was measured in days it took for the participants to respond after we supplied them all the information they would need to successfully execute each test/scenario. Two of the technology solution providers successfully implemented and participated in a proof-of-concept interoperable framework during Year 2 of this study. While not required, they also demonstrated this interoperability capability on the FSMA-mandated food product tracing pilots for the U.S. FDA. This has significant real-world impact since the

  20. Semantic Interoperability in Heterogeneous IoT Infrastructure for Healthcare

    Directory of Open Access Journals (Sweden)

    Sohail Jabbar

    2017-01-01

    Full Text Available Interoperability remains a significant burden for the developers of Internet of Things systems. This is due to the fact that IoT devices are highly heterogeneous in terms of underlying communication protocols, data formats, and technologies. Secondly, due to the lack of worldwide accepted standards, interoperability tools remain limited. In this paper, we propose an IoT-based Semantic Interoperability Model (IoT-SIM) to provide semantic interoperability among heterogeneous IoT devices in the healthcare domain. Physicians communicate with their patients through heterogeneous IoT devices to monitor their current health status. Information between physician and patient is semantically annotated and communicated in a meaningful way. A lightweight model for the semantic annotation of data from heterogeneous devices in the IoT is proposed to provide annotations for data. The Resource Description Framework (RDF) is a semantic web framework that is used to relate things using triples, making the data semantically meaningful. RDF-annotated patient data is thereby made semantically interoperable. SPARQL queries are used to extract records from the RDF graph. For simulation of the system, we used the Tableau, Gruff 6.2.0, and MySQL tools.
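
    A minimal sketch of the RDF-plus-SPARQL workflow described in this record, using the third-party rdflib package: a device reading is annotated as triples and then extracted with a SPARQL query. The namespace and property names are invented placeholders, not the paper's IoT-SIM vocabulary.

```python
# Minimal sketch (assumed vocabulary, not the paper's IoT-SIM ontology): annotate
# a device reading as RDF triples, then extract it with SPARQL. Requires rdflib.
from rdflib import Graph, Literal, Namespace, RDF, URIRef
from rdflib.namespace import XSD

EX = Namespace("http://example.org/iot-health#")   # hypothetical namespace

g = Graph()
reading = URIRef("http://example.org/iot-health#reading42")
g.add((reading, RDF.type, EX.GlucoseReading))
g.add((reading, EX.observedBy, EX.glucoseSensor1))
g.add((reading, EX.forPatient, EX.patient007))
g.add((reading, EX.hasValue, Literal(6.1, datatype=XSD.decimal)))

# A physician-side query: all readings for one patient, with their values.
query = """
PREFIX ex: <http://example.org/iot-health#>
SELECT ?reading ?value WHERE {
    ?reading ex:forPatient ex:patient007 ;
             ex:hasValue  ?value .
}
"""
for row in g.query(query):
    print(row.reading, row.value)
```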

  1. Framework research of semantic sharing and interoperability of geospatial information

    Science.gov (United States)

    Zhao, Hu; Li, Lin; Shi, Yunfei

    2008-12-01

    Knowledge sharing and semantic interoperability are a significant research theme in Geographical Information Science (GIScience), because semantic heterogeneity has been identified by many researchers as the main obstacle to GIScience development. Interoperability issues can exist at three levels: syntactic, structural (also called systemic) and semantic. The former two, however, can be achieved by implementing international or domain standards proposed by several organizations, for example, the Open Geospatial Consortium (OGC), the World Wide Web Consortium (W3C) and the International Organization for Standardization/Technical Committee for Geographic information/Geomatics (ISO/TC 211). In this paper, we concentrate on semantic interoperability, the sort of topic that halts conversations and causes people's eyes to glaze over, from two aspects: data/information/knowledge and operation/processing. We present a service-centered architecture for the semantic interoperability of geospatial data and processes. OGC standards such as the Web Feature Service (WFS) and Web Map Service (WMS) have been employed as normative interfaces for analyzing requests, dividing requests and delivering small requests. Ontology has been introduced to describe distributed resources, including various data and geo-processing operations. The role of interoperability, especially from a semantic perspective, is established in the first section of this paper. As a foundation, the following section introduces the semantic web, web services and other related work in this direction. We present our service-based architecture in detail, together with a simple application, in part three. Conclusions and further directions are given in the last section.

  2. Practical experience in deploying and controlling the data sharing interoperability layer at the U.K. Land Open Systems Architecture (LOSA) field trials in October 2012

    Science.gov (United States)

    Bergamaschi, Flavio; Conway-Jones, Dave; Pearson, Gavin

    2013-05-01

    In October 2012 the UK MoD sponsored a multi-vendor field integration trial in support of its Land Open Systems Architecture (LOSA), an open, service-based architecture for systems integration and interoperability which builds on the progress made with the Generic Vehicle Architecture (GVA, DefStan 23-09), Generic Base Architecture (GBA, DefStan 23-13) and Generic Soldier Architecture (DefStan 23-12) programs. The aim of this trial was to experiment with common data and power interoperability across, and in support of, the Soldier, Vehicles and Bases domains. This paper presents an overview of the field trial and discusses how the ITA Information Fabric, a technology originating in the US and UK International Technology Alliance program, was extended to support the control of the data interoperability layer across various data bearers. This included: (a) interoperability and information sharing across multiple stove-piped and legacy solutions; (b) command and control and bandwidth optimization of streamed data (e.g. video) over a peer-to-peer ad-hoc network across multiple domains, together with the integration of disparate sensor systems; (c) integration with DDS-based C2 systems.

  3. Metadata behind the Interoperability of Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Miguel Angel Manso Callejo

    2009-05-01

    Full Text Available Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to introduce the notion of context as an explicit representation of changes of a WSN status inferred from metadata elements, which, in turn, leads towards a decision-making process about how to maintain dynamic interoperability. This paper describes the developed context model to represent and reason over different WSN statuses based on four types of contexts, which have been identified as sensing, node, network and organisational contexts. The reasoning has been addressed by developing contextualising and bridging rules. As a result, we were able to demonstrate how contextualising rules can be used to reason on changes of WSN status as a first step towards maintaining dynamic interoperability.
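
    An illustrative sketch of the rule idea described above: a context label is inferred from a node's metadata and mapped, via a bridging rule, to an action that preserves interoperability. The rule thresholds, context labels and actions are invented, not the paper's context model.

```python
# Illustrative sketch (invented rules, not the paper's model): infer a context
# label for a WSN node from its metadata and map it to a maintenance decision.
from typing import Dict

def contextualise(metadata: Dict[str, float]) -> str:
    """A contextualising rule: derive a context label from node metadata."""
    if metadata["battery_level"] < 0.15:
        return "node:energy-critical"
    if metadata["packet_loss"] > 0.30:
        return "network:degraded-link"
    return "network:nominal"

def bridge(context: str) -> str:
    """A bridging rule: map a context to an action preserving interoperability."""
    actions = {
        "node:energy-critical": "lower sampling rate and hand observations to a neighbour",
        "network:degraded-link": "switch the node to a store-and-forward profile",
        "network:nominal": "no change",
    }
    return actions[context]

node_metadata = {"battery_level": 0.12, "packet_loss": 0.05}
ctx = contextualise(node_metadata)
print(ctx, "->", bridge(ctx))
```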

  4. Evolutionary Based Solutions for Green Computing

    CERN Document Server

    Kołodziej, Joanna; Li, Juan; Zomaya, Albert

    2013-01-01

    Today’s highly parameterized large-scale distributed computing systems may be composed  of a large number of various components (computers, databases, etc) and must provide a wide range of services. The users of such systems, located at different (geographical or managerial) network cluster may have a limited access to the system’s services and resources, and different, often conflicting, expectations and requirements. Moreover, the information and data processed in such dynamic environments may be incomplete, imprecise, fragmentary, and overloading. All of the above mentioned issues require some intelligent scalable methodologies for the management of the whole complex structure, which unfortunately may increase the energy consumption of such systems.   This book in its eight chapters, addresses the fundamental issues related to the energy usage and the optimal low-cost system design in high performance ``green computing’’ systems. The recent evolutionary and general metaheuristic-based solutions ...

  5. SHARP/PRONGHORN Interoperability: Mesh Generation

    Energy Technology Data Exchange (ETDEWEB)

    Avery Bingham; Javier Ortensi

    2012-09-01

    Progress toward collaboration between the SHARP and MOOSE computational frameworks has been demonstrated through sharing of mesh generation and ensuring mesh compatibility of both tools with MeshKit. MeshKit was used to build a three-dimensional, full-core very high temperature reactor (VHTR) reactor geometry with 120-degree symmetry, which was used to solve a neutron diffusion critical eigenvalue problem in PRONGHORN. PRONGHORN is an application of MOOSE that is capable of solving coupled neutron diffusion, heat conduction, and homogenized flow problems. The results were compared to a solution found on a 120-degree, reflected, three-dimensional VHTR mesh geometry generated by PRONGHORN. The ability to exchange compatible mesh geometries between the two codes is instrumental for future collaboration and interoperability. The results were found to be in good agreement between the two meshes, thus demonstrating the compatibility of the SHARP and MOOSE frameworks. This outcome makes future collaboration possible.

  6. Interoperable Archetypes With a Three Folded Terminology Governance.

    Science.gov (United States)

    Pederson, Rune; Ellingsen, Gunnar

    2015-01-01

    The use of openEHR archetypes increases the interoperability of clinical terminology, and in doing so improves the availability of clinical terminology for both primary and secondary purposes. Where clinical terminology is employed in the EPR system, research reports conflicting results for the use of structuring and standardization as measurements of success. In order to elucidate this concept, this paper focuses on the effort to establish a national repository for openEHR-based archetypes in Norway, where clinical terminology could be included with a three-fold benefit for interoperability.

  7. Sustainability of Open-Source Software Organizations as Underpinning for Sustainable Interoperability on Large Scales

    Science.gov (United States)

    Fulker, D. W.; Gallagher, J. H. R.

    2015-12-01

    OPeNDAP's Hyrax data server is an open-source framework fostering interoperability via easily-deployed Web services. Compatible with solutions listed in the (PA001) session description—federation, rigid standards and brokering/mediation—the framework can support tight or loose coupling, even with dependence on community-contributed software. Hyrax is a Web-services framework with a middleware-like design and a handler-style architecture that together reduce the interoperability challenge (for N datatypes and M user contexts) to an O(N+M) problem, similar to brokering. Combined with an open-source ethos, this reduction makes Hyrax a community tool for gaining interoperability. E.g., in its response to the Big Earth Data Initiative (BEDI), NASA references OPeNDAP-based interoperability. Assuming its suitability, the question becomes: how sustainable is OPeNDAP, a small not-for-profit that produces open-source software, i.e., has no software-sales? In other words, if geoscience interoperability depends on OPeNDAP and similar organizations, are those entities in turn sustainable? Jim Collins (in Good to Great) highlights three questions that successful companies can answer (paraphrased here): What is your passion? Where is your world-class excellence? What drives your economic engine? We attempt to shed light on OPeNDAP sustainability by examining these. Passion: OPeNDAP has a focused passion for improving the effectiveness of scientific data sharing and use, as deeply-cooperative community endeavors. Excellence: OPeNDAP has few peers in remote, scientific data access. Skills include computer science with experience in data science, (operational, secure) Web services, and software design (for servers and clients, where the latter vary from Web pages to standalone apps and end-user programs). Economic Engine: OPeNDAP is an engineering services organization more than a product company, despite software being key to OPeNDAP's reputation. In essence, provision of
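
    A conceptual sketch of the handler-style O(N+M) reduction mentioned in this record: N datatype readers and M response renderers meet at one common in-memory representation, so adding a datatype or a user context costs one handler rather than N x M converters. The names are invented for illustration; this is not Hyrax's actual code.

```python
# Conceptual sketch of the O(N+M) handler architecture: readers and renderers
# are composed through one common representation. Names are invented examples.
from typing import Any, Callable, Dict, List

# N readers: datatype -> common representation (here, a plain list of records).
READERS: Dict[str, Callable[[str], List[dict]]] = {
    "csv":  lambda path: [{"source": path, "format": "csv"}],
    "hdf5": lambda path: [{"source": path, "format": "hdf5"}],
}

# M renderers: common representation -> what a given user context expects.
RENDERERS: Dict[str, Callable[[List[dict]], Any]] = {
    "json-table": lambda records: {"rows": records},
    "ascii":      lambda records: "\n".join(str(r) for r in records),
}

def serve(path: str, datatype: str, response_type: str) -> Any:
    """One request path that composes any reader with any renderer."""
    common = READERS[datatype](path)
    return RENDERERS[response_type](common)

print(serve("/data/ocean_temps.h5", "hdf5", "json-table"))
print(serve("/data/buoys.csv", "csv", "ascii"))
```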

  8. Enabling interoperability in planetary sciences and heliophysics: The case for an information model

    Science.gov (United States)

    Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.

    2018-01-01

    The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) level standards for trusted digital archives, information model development, and metadata registries. Where controlled vocabularies provide a basic level of interoperability by supplying a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information or additional related context for the terms. The Information Model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic-level, and application-level interoperability. We define these types of interoperability and focus on semantic-level interoperability, the type of interoperability most directly enabled by an information model.

  9. Internet of Things Heterogeneous Interoperable Network Architecture Design

    OpenAIRE

    Bhalerao, Dipashree M.

    2014-01-01

    The Internet of Things (IoT) state of the art shows that no mature Internet of Things architecture is available. The thesis contributes the development of an abstract, generic IoT system reference architecture with specifications. The novelties of the thesis are proposed solutions and implementations for scalability, heterogeneous interoperability, security, and the extension of the IoT architecture to rural, poor and catastrophic (RPC) areas. VLC is proposed and proved as one of the suitable internetwork means to o...

  10. The challenge of networked enterprises for cloud computing interoperability

    OpenAIRE

    Mezgár, István; Rauschecker, Ursula

    2014-01-01

    Manufacturing enterprises have to organize themselves into effective system architectures forming different types of Networked Enterprises (NE) to match fast-changing market demands. Cloud Computing (CC) is an important, up-to-date computing concept for NE, as it offers significant financial and technical advantages besides high-level collaboration possibilities. As cloud computing is a new concept, the solutions for handling interoperability, portability, security, privacy and standardization c...

  11. .NET INTEROPERABILITY GUIDELINES

    Science.gov (United States)

    The CAPE-OPEN middleware standards were created to allow process modelling components (PMCs) developed by third parties to be used in any process modelling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compo...

  12. Interoperability of Web Archives and Digital Libraries

    DEFF Research Database (Denmark)

    Kalb, Hendrik; Lazaridou, Paraskevi; Pinsent, Edward

    2013-01-01

    The interoperability of web archives and digital libraries is crucial to avoid silos of preserved data and content. While various research efforts focus on specific facets of the challenge to interoperate, there is a lack of empirical work about the overall situation of actual challenges. We conduct ... a Delphi study to survey and reveal the insights of experts in the field. Results of our study are presented in this paper to enhance further research and development efforts for interoperability....

  13. Standards to open and interoperable digital libraries

    Directory of Open Access Journals (Sweden)

    Luís Fernando Sayão

    2007-12-01

    Full Text Available Interoperability is one of the main issues in creating a networked system of digital libraries. However, interoperability, as the way to accomplish data exchange and service collaboration, requires the adoption of a set of open standards covering all digital repository processes. The aim of this document is to review the most important standards, protocols and best practices that form the framework for an open and fully interoperable digital library.

  14. Semantic Interoperability Almost Without Using The Same Vocabulary: Is It Possible?

    Science.gov (United States)

    Krisnadhi, A. A.

    2016-12-01

    Semantic interoperability, which is a key requirement in realizing cross-repository data integration, is often understood as using the same ontology or vocabulary. Consequently, within a particular domain, one can easily assume that there has to be one unifying domain ontology covering as many vocabulary terms in the domain as possible in order to realize any form of data integration across multiple data sources. Furthermore, the desire to provide very precise definitions of those many terms led to the development of huge foundational and domain ontologies that are comprehensive, but too complicated, restrictive, monolithic, and difficult to use and reuse, which causes common data providers to avoid using them. This problem is especially true in a domain as diverse as geosciences, since it is virtually impossible to reach an agreement on the semantics of many terms (e.g., there are hundreds of definitions of forest used throughout the world). To overcome this challenge, modular ontology architecture has emerged in recent years, fueled, among others, by advances in ontology design pattern research. Each ontology pattern models only one key notion. It can act as a small module of a larger ontology. Such a module is developed in such a way that it is largely independent of how other notions in the same domain are modeled. This leads to increased reusability. Furthermore, an ontology formed out of such modules has improved understandability over large, monolithic ontologies. Semantic interoperability in the aforementioned architecture is not achieved by enforcing the use of the same vocabulary, but rather by promoting alignment to the same ontology patterns. In this work, we elaborate how this architecture realizes the above idea. In particular, we describe how multiple data sources with differing perspectives and vocabularies can interoperate through this architecture. Building the solution upon semantic technologies such as Linked Data and the Web Ontology
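
    To illustrate alignment-to-patterns rather than a shared vocabulary, here is a minimal sketch using the rdflib package; the namespaces, class names and instances are invented for illustration and are not part of any real pattern library.

      # Two providers keep their own "forest" terms but align them to a shared
      # ontology design pattern, so one query over the pattern covers both sources.
      from rdflib import Graph, Namespace
      from rdflib.namespace import RDF, RDFS

      ODP = Namespace("http://example.org/pattern/")   # shared pattern (hypothetical)
      A = Namespace("http://example.org/agencyA/")     # provider A's vocabulary
      B = Namespace("http://example.org/agencyB/")     # provider B's vocabulary

      g = Graph()
      # Each provider aligns its local class to the shared pattern class.
      g.add((A.LegalForest, RDFS.subClassOf, ODP.VegetatedLandCover))
      g.add((B.TreeCoveredArea, RDFS.subClassOf, ODP.VegetatedLandCover))
      # Each provider publishes data in its own terms.
      g.add((A.site42, RDF.type, A.LegalForest))
      g.add((B.plot7, RDF.type, B.TreeCoveredArea))

      # One query phrased against the pattern retrieves data from both providers.
      q = """
      PREFIX rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
      PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
      PREFIX odp:  <http://example.org/pattern/>
      SELECT ?site WHERE { ?site rdf:type/rdfs:subClassOf* odp:VegetatedLandCover . }
      """
      for row in g.query(q):
          print(row.site)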

  15. JCR VSIL Interoperability Testing

    Science.gov (United States)

    2009-08-01

    Simulation System (ESS) with two simulated SUGV assets, the Scalable Soldier Machine Interface (SSMI) software with layout based on a version used for the...between a proprietary controller with a serial interface, versus the SSMI OCU and JAUS messaging. Over a run of ten tests the proprietary controller...had an average of 324 ms latency between command and action with a standard deviation of 2.25 ms. The same test was run with the SSMI OCU and

  16. Meeting people's needs in a fully interoperable domotic environment.

    Science.gov (United States)

    Miori, Vittorio; Russo, Dario; Concordia, Cesare

    2012-01-01

    The key idea underlying many Ambient Intelligence (AmI) projects and applications is context awareness, which is based mainly on their capacity to identify users and their locations. The actual computing capacity should remain in the background, in the periphery of our awareness, and should only move to the center if and when necessary. Computing thus becomes 'invisible', as it is embedded in the environment and everyday objects. The research project described herein aims to realize an Ambient Intelligence-based environment able to improve users' quality of life by learning their habits and anticipating their needs. This environment is part of an adaptive, context-aware framework designed to make today's incompatible heterogeneous domotic systems fully interoperable, not only for connecting sensors and actuators, but for providing comprehensive connections of devices to users. The solution is a middleware architecture based on open and widely recognized standards capable of abstracting the peculiarities of underlying heterogeneous technologies and enabling them to co-exist and interwork, without however eliminating their differences. At the highest level of this infrastructure, the Ambient Intelligence framework, integrated with the domotic sensors, can enable the system to recognize any unusual or dangerous situations and anticipate health problems or special user needs in a technological living environment, such as a house or a public space.
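
    The kind of middleware abstraction described above can be pictured as a simple adapter layer; the sketch below is purely illustrative (the device classes, protocol payloads and rules are placeholders, not the project's actual components).

      # Illustrative adapter layer: heterogeneous domotic technologies are wrapped
      # behind one neutral interface, so the Ambient Intelligence layer reasons
      # over events without knowing which protocol produced them.
      from abc import ABC, abstractmethod

      class DomoticAdapter(ABC):
          @abstractmethod
          def poll(self) -> dict:
              """Return a protocol-neutral event: {'device', 'quantity', 'value'}."""

      class KnxThermostatAdapter(DomoticAdapter):
          def poll(self):
              raw = 0x07D0                     # pretend KNX payload: 2000 -> 20.00 C
              return {"device": "thermostat-1", "quantity": "temperature",
                      "value": raw / 100.0}

      class ZigbeeDoorSensorAdapter(DomoticAdapter):
          def poll(self):
              return {"device": "door-3", "quantity": "open", "value": True}

      def ambient_intelligence_step(adapters):
          """A tiny 'context awareness' rule running on neutral events only."""
          for event in (a.poll() for a in adapters):
              if event["quantity"] == "open" and event["value"]:
                  print(f"Alert: {event['device']} opened, checking user context...")
              elif event["quantity"] == "temperature" and event["value"] < 15:
                  print(f"Comfort rule: raising heating near {event['device']}")

      ambient_intelligence_step([KnxThermostatAdapter(), ZigbeeDoorSensorAdapter()])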

  17. A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World

    Science.gov (United States)

    Wright, D. J.; Sankaran, S.

    2015-12-01

    In the geosciences, interoperability is a fundamental requirement. Members of various standards organizations such as the OGC and ISO-TC 211 have done yeoman service to promote a standards-centric approach to managing the interoperability challenges that organizations face today. The specific challenges that organizations face when adopting interoperability patterns are many. One approach, that of mandating the use of specific standards, has been reasonably successful. But scientific communities, as with all others, ultimately want their solutions to be widely accepted and used. To this end there is a crying need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow and deliberative process that sometimes takes a long time to come to fruition and therefore can sometimes seem to fall short of user expectations. It seems therefore that organizations are left with a series of perceived orthogonal requirements when they want to pursue interoperability. They want a robust but agile solution, a mature approach that also needs to satisfy the latest technology trends, and so on. Sustainable interoperability patterns need to be forward looking and should choose the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms that are built on fundamental "open" principles and which align with popular mainstream patterns. We seek to explore data-, metadata- and web service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability considerations, and appealing developer frameworks that can grow the audience. The path to tread is not new, and the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we all take for granted

  18. Biodiversity information platforms: From standards to interoperability

    Directory of Open Access Journals (Sweden)

    Walter Berendsohn

    2011-11-01

    Full Text Available One of the most serious bottlenecks in the scientific workflows of biodiversity sciences is the need to integrate data from different sources, software applications, and services for analysis, visualisation and publication. For more than a quarter of a century the TDWG Biodiversity Information Standards organisation has played a central role in defining and promoting data standards and protocols supporting interoperability between disparate and locally distributed systems. Although often not sufficiently recognized, TDWG standards are the foundation of many popular Biodiversity Informatics applications and infrastructures ranging from small desktop software solutions to large scale international data networks. However, individual scientists and groups of collaborating scientists have difficulties in fully exploiting the potential of standards that are often notoriously complex, lack non-technical documentation, and use different representations and underlying technologies. In the last few years, a series of initiatives such as Scratchpads, the EDIT Platform for Cybertaxonomy, and biowikifarm have started to implement and set up virtual work platforms for biodiversity sciences which shield their users from the complexity of the underlying standards. Apart from being practical work-horses for numerous working processes related to biodiversity sciences, they can be seen as information brokers mediating information between multiple data standards and protocols. The ViBRANT project will further strengthen the flexibility and power of virtual biodiversity working platforms by building software interfaces between them, thus facilitating essential information flows needed for comprehensive data exchange, data indexing, web-publication, and versioning. This work will make an important contribution to the shaping of an international, interoperable, and user-oriented biodiversity information infrastructure.

  19. Biodiversity information platforms: From standards to interoperability.

    Science.gov (United States)

    Berendsohn, W G; Güntsch, A; Hoffmann, N; Kohlbecker, A; Luther, K; Müller, A

    2011-01-01

    One of the most serious bottlenecks in the scientific workflows of biodiversity sciences is the need to integrate data from different sources, software applications, and services for analysis, visualisation and publication. For more than a quarter of a century the TDWG Biodiversity Information Standards organisation has played a central role in defining and promoting data standards and protocols supporting interoperability between disparate and locally distributed systems. Although often not sufficiently recognized, TDWG standards are the foundation of many popular Biodiversity Informatics applications and infrastructures ranging from small desktop software solutions to large scale international data networks. However, individual scientists and groups of collaborating scientists have difficulties in fully exploiting the potential of standards that are often notoriously complex, lack non-technical documentation, and use different representations and underlying technologies. In the last few years, a series of initiatives such as Scratchpads, the EDIT Platform for Cybertaxonomy, and biowikifarm have started to implement and set up virtual work platforms for biodiversity sciences which shield their users from the complexity of the underlying standards. Apart from being practical work-horses for numerous working processes related to biodiversity sciences, they can be seen as information brokers mediating information between multiple data standards and protocols. The ViBRANT project will further strengthen the flexibility and power of virtual biodiversity working platforms by building software interfaces between them, thus facilitating essential information flows needed for comprehensive data exchange, data indexing, web-publication, and versioning. This work will make an important contribution to the shaping of an international, interoperable, and user-oriented biodiversity information infrastructure.

  20. Design and Realization of Integrated Management System for Data Interoperability between Point-of-Care Testing Equipment and Hospital Information System.

    Science.gov (United States)

    Park, Ki Sang; Heo, Hyuk; Choi, Young Keun

    2013-09-01

    The purpose of this study was to design an integrated data management system based on the POCT1-A2, LIS2-A, LIS2-A2, and HL7 standards to ensure data interoperability between mobile equipment, such as point-of-care testing equipment, and the existing hospital information system; its efficiency was also evaluated. The method was to design and realize a data management system which would provide a solution for the problems that occur when point-of-care testing equipment is introduced into an existing hospital information system, after classifying such problems into connectivity, integration, and interoperability. This study also checked whether the data management system plays a sufficient role as a bridge between the point-of-care testing equipment and the hospital information system through connection persistence and reliability testing, as well as data integration and interoperability testing. In comparison with the existing system, the data management system facilitated integration by improving the result receiving time, improving the collection rate, and enabling the integration of disparate types of data into a single system. It was also found that the problems related to connectivity, integration and interoperability can be solved by generating messages in standardized formats. It is expected that the proposed data management system, which is designed to improve the integration of point-of-care testing equipment with existing systems, will establish a solid foundation on which hospitals may provide better medical service by improving the quality of patient care.
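
    As a rough illustration of the "bridge" role played by such a data management system, the sketch below converts a device-side result record into a simplified HL7 v2-style ORU message. The field layout is abbreviated and not conformant, and all segment contents are invented; a real bridge must follow the standards cited in the abstract.

      # Simplified, non-conformant sketch of turning a point-of-care result into
      # an HL7 v2-style ORU^R01 message for the hospital information system.
      from datetime import datetime

      def poct_result_to_hl7(result: dict) -> str:
          ts = datetime.now().strftime("%Y%m%d%H%M%S")
          segments = [
              f"MSH|^~\\&|POCT_BRIDGE|WARD3|HIS|HOSPITAL|{ts}||ORU^R01|{result['msg_id']}|P|2.5",
              f"PID|||{result['patient_id']}||{result['patient_name']}",
              f"OBR|1||{result['order_id']}|{result['test_code']}^{result['test_name']}",
              f"OBX|1|NM|{result['test_code']}||{result['value']}|{result['unit']}|||||F",
          ]
          return "\r".join(segments)

      sample = {"msg_id": "000123", "patient_id": "MRN0042", "patient_name": "DOE^JANE",
                "order_id": "ORD77", "test_code": "GLU", "test_name": "Glucose",
                "value": 5.8, "unit": "mmol/L"}
      print(poct_result_to_hl7(sample).replace("\r", "\n"))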

  1. Interoperability of ESA Science Archives

    Science.gov (United States)

    Arviset, C.; Dowson, J.; Hernández, J.; Osuna, P.; Venet, A.

    The ISO Data Archive (IDA) and the XMM-Newton Science Archive (XSA) have been developed by the Science Operations and Data Systems Division of ESA in Villafranca, Spain. They are both built using the same flexible and modular 3-tier architecture: Data Products and Database, Business Logic, User Interface. This open architecture, together with Java and XML technology have helped in making the IDA and XSA inter-operable with other archives and applications. The various accesses from the IDA and the XSA to remote archives are described as well as the mechanism to directly access these ESA archives from remote archives

  2. Towards an enterprise interoperability framework

    CSIR Research Space (South Africa)

    Kotzé, P

    2010-06-01

    Full Text Available Framework defines interoperability more holistically as ‘the ability of information and communication technology (ICT) systems and of the business processes they support to exchange data and to enable the sharing of information and knowledge’ [14: 5... the services so exchanged to enable them to operate effectively together’ [18]. Interoperability is thus the ability of two or more different entities (be they pieces of software, processes, systems, business units, etc.) to ‘inter-operate’ [29].

  3. Excemplify: A Flexible Template Based Solution, Parsing and Managing Data in Spreadsheets for Experimentalists

    Directory of Open Access Journals (Sweden)

    Shi Lei

    2013-06-01

    Full Text Available In systems biology, quantitative experimental data is the basis for building mathematical models. In most cases, such data are stored in Excel files and hosted locally. A public database for collecting, retrieving and citing experimental raw data as well as experimental conditions is important for both experimentalists and modelers. However, the great effort needed in the data handling and data submission procedures is the crucial limitation that keeps experimentalists from contributing to a database, thereby preventing the database from delivering its benefits. Moreover, the manual copy-and-paste operations commonly used in those procedures increase the chance of making mistakes. Excemplify, a web-based application, proposes a flexible and adaptable template-based solution to these problems. In contrast to the normal template-based uploading approach supported by some public databases, which predefines a format that is potentially impractical, Excemplify allows users to create their own experiment-specific content templates for the different experiment stages and to build corresponding knowledge bases for parsing. Utilizing the knowledge embedded in the templates used, Excemplify is able to parse experimental data from the initial setup stage and to generate the spreadsheets for the following stages automatically. The proposed solution standardizes how data flows along the standard procedures of the experiment, cuts down the amount of manual effort and reduces the chance of mistakes caused by manual data handling. In addition, it maintains the context of metadata from the initial preparation manuscript and improves data consistency. It also interoperates with and complements RightField and SEEK.
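
    A minimal sketch of the template idea follows (plain Python with invented field names and cell coordinates; the real Excemplify knowledge bases and spreadsheet formats are richer than this): a user-defined template records where each piece of metadata lives in the sheet, and the parser uses that knowledge instead of a hard-coded layout.

      # Template-driven spreadsheet parsing: the 'template' records where each
      # field lives in the sheet, so the parser needs no hard-coded layout.
      sheet = [
          ["Experiment", "Glucose pulse",   "", ""],
          ["Organism",   "S. cerevisiae",   "", ""],
          ["time_min",   "OD600",           "", ""],
          [0,            0.12,              "", ""],
          [10,           0.19,              "", ""],
      ]

      template = {
          "metadata": {"experiment": (0, 1), "organism": (1, 1)},  # (row, column)
          "data_header_row": 2,                                    # column names here
          "data_start_row": 3,                                     # measurements below
      }

      def parse(sheet, template):
          record = {name: sheet[r][c] for name, (r, c) in template["metadata"].items()}
          header = sheet[template["data_header_row"]][:2]
          record["data"] = [dict(zip(header, row[:2]))
                            for row in sheet[template["data_start_row"]:]]
          return record

      print(parse(sheet, template))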

  4. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Madureira, A.; Den Hartog, F.; Silva, E.; Baken, N.

    2010-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  5. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Popplewell, Keith; Madureira, António; Harding, Jenny; den Hartog, Frank; Goncalves da Silva, Eduardo; Poler, Raul; Chalmeta, Ricardo; Baken, Nico

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  6. Interoperability of Web Archives and Digital Libraries

    DEFF Research Database (Denmark)

    Kalb, Hendrik; Lazaridou, Paraskevi; Pinsent, Edward

    2013-01-01

    The interoperability of web archives and digital libraries is crucial to avoid silos of preserved data and content. While various research efforts focus on specific facets of the challenge to interoperate, there is a lack of empirical work about the overall situation of actual challenges. We conduct...

  7. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Madureira, António; den Hartog, Frank; Goncalves da Silva, Eduardo; Baken, Nico; Zhao, L.; Macaulay, L.

    2009-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  8. Interoperability in the networked design infrastructure

    NARCIS (Netherlands)

    Coenders, J.L.

    2012-01-01

    Interoperability, the ability of different software applications to communicate with each other, is one of the biggest challenges for efficient and effective use of advanced software technology in structural design and engineering. In practice, the problem of interoperability exists very much for

  9. An Optimized, Data Distribution Service-Based Solution for Reliable Data Exchange Among Autonomous Underwater Vehicles.

    Science.gov (United States)

    Rodríguez-Molina, Jesús; Bilbao, Sonia; Martínez, Belén; Frasheri, Mirgita; Cürüklü, Baran

    2017-08-05

    Major challenges are presented when managing a large number of heterogeneous vehicles that have to communicate underwater in order to complete a global mission in a cooperative manner. In this kind of application domain, sending data through the environment presents issues that surpass the ones found in other overwater, distributed, cyber-physical systems (i.e., low bandwidth, unreliable transport medium, data representation and hardware high heterogeneity). This manuscript presents a Publish/Subscribe-based semantic middleware solution for unreliable scenarios and vehicle interoperability across cooperative and heterogeneous autonomous vehicles. The middleware relies on different iterations of the Data Distribution Service (DDS) software standard and their combined work between autonomous maritime vehicles and a control entity. It also uses several components with different functionalities deemed as mandatory for a semantic middleware architecture oriented to maritime operations (device and service registration, context awareness, access to the application layer) where other technologies are also interweaved with middleware (wireless communications, acoustic networks). Implementation details and test results, both in a laboratory and a deployment scenario, have been provided as a way to assess the quality of the system and its satisfactory performance.
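
    The Publish/Subscribe pattern at the core of the middleware can be sketched as follows; note that this is a generic topic-based bus written for illustration, not the DDS API or the project's actual implementation, and the topic and payload names are invented.

      # Generic topic-based Publish/Subscribe sketch (illustrative only; the real
      # middleware builds on DDS implementations, QoS policies and acoustic links).
      from collections import defaultdict

      class MessageBus:
          def __init__(self):
              self._subscribers = defaultdict(list)   # topic -> list of callbacks

          def subscribe(self, topic, callback):
              self._subscribers[topic].append(callback)

          def publish(self, topic, payload):
              # Underwater, delivery would be best-effort over an unreliable
              # acoustic link; here it is a simple in-process dispatch.
              for callback in self._subscribers[topic]:
                  callback(payload)

      bus = MessageBus()
      # The control entity subscribes to vehicle status updates...
      bus.subscribe("auv/status", lambda msg: print("control centre got:", msg))
      # ...and each vehicle publishes its own state without knowing who listens.
      bus.publish("auv/status", {"vehicle": "auv-2", "depth_m": 43.5, "battery": 0.71})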

  10. An Optimized, Data Distribution Service-Based Solution for Reliable Data Exchange Among Autonomous Underwater Vehicles

    Directory of Open Access Journals (Sweden)

    Jesús Rodríguez-Molina

    2017-08-01

    Full Text Available Major challenges are presented when managing a large number of heterogeneous vehicles that have to communicate underwater in order to complete a global mission in a cooperative manner. In this kind of application domain, sending data through the environment presents issues that surpass the ones found in other overwater, distributed, cyber-physical systems (i.e., low bandwidth, unreliable transport medium, data representation and hardware high heterogeneity). This manuscript presents a Publish/Subscribe-based semantic middleware solution for unreliable scenarios and vehicle interoperability across cooperative and heterogeneous autonomous vehicles. The middleware relies on different iterations of the Data Distribution Service (DDS) software standard and their combined work between autonomous maritime vehicles and a control entity. It also uses several components with different functionalities deemed as mandatory for a semantic middleware architecture oriented to maritime operations (device and service registration, context awareness, access to the application layer) where other technologies are also interweaved with middleware (wireless communications, acoustic networks). Implementation details and test results, both in a laboratory and a deployment scenario, have been provided as a way to assess the quality of the system and its satisfactory performance.

  11. Interoperability

    DEFF Research Database (Denmark)

    Savin, Andrej

    would considerably alter the current privacy setting. First, the current Directive would be replaced with a Regulation, achieving EU-wide harmonization. Second, the scope of the instrument would be widened and the provisions made more precise. Third, the use of consent for data processing would...... be limited. Fourth, data protection “by design” would be distinguished from data protection “by default”. Fifth, new fundamental rights would be introduced and the old ones clarified. Sixth, new rules on controllers’ and processors’ duties, on supervisory authorities and on sanctions would be introduced...... of direct relevance for the project and Work Package 5 will be analysed here....

  12. The next generation of interoperability agents in healthcare.

    Science.gov (United States)

    Cardoso, Luciana; Marins, Fernando; Portela, Filipe; Santos, Manuel; Abelha, António; Machado, José

    2014-05-16

    Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA.

  13. The Role of Markup for Enabling Interoperability in Health Informatics

    Directory of Open Access Journals (Sweden)

    Steve McKeever

    2015-05-01

    Full Text Available Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realised on a variety of physiological and health care use cases. The last fifteen years have seen the rise of very cheap digital storage both on and off site. With the advent of the 'Internet of Things', people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to 'in silico' modeling of organs and early stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realised. We conclude with a discussion on future possibilities that big data and further standardization will enable.

  14. The Next Generation of Interoperability Agents in Healthcare

    Directory of Open Access Journals (Sweden)

    Luciana Cardoso

    2014-05-01

    Full Text Available Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA.

  15. An Integrated Framework to Achieve Interoperability in Person-Centric Health Management

    Directory of Open Access Journals (Sweden)

    Fabio Vergari

    2011-01-01

    Full Text Available The need for high-quality out-of-hospital healthcare is a known socioeconomic problem. Exploiting ICT's evolution, ad-hoc telemedicine solutions have been proposed in the past. Integrating such ad-hoc solutions in order to cost-effectively support the entire healthcare cycle is still a research challenge. In order to handle the heterogeneity of relevant information and to overcome the fragmentation of out-of-hospital instrumentation in person-centric healthcare systems, a shared and open source interoperability component can be adopted, which is ontology driven and based on the semantic web data model. The feasibility and the advantages of the proposed approach are demonstrated by presenting the use case of real-time monitoring of patients' health and their environmental context.

  16. Interoperability of Standards for Robotics in CIME

    DEFF Research Database (Denmark)

    Kroszynski, Uri; Sørensen, Torben; Ludwig, Arnold

    1997-01-01

    Esprit Project 6457 "Interoperability of Standards for Robotics in CIME (InterRob)" belongs to the Subprogramme "Integration in Manufacturing" of Esprit, the European Specific Programme for Research and Development in Information Technology supported by the European Commision.The first main goal...... of InterRob was to close the information chain between product design, simulation, programming, and robot control by developing standardized interfaces and their software implementation for standards STEP (International Standard for the Exchange of Product model data, ISO 10303) and IRL (Industrial Robot...... Language, DIN 66312). This is a continuation of the previous Esprit projects CAD*I and NIRO, which developed substantial basics of STEP.The InterRob approach is based on standardized models for product geometry, kinematics, robotics, dynamics and control, hence on a coherent neutral information model...

  17. Nutrient Composition Of Cereal Based Oral Rehydration Solutions ...

    African Journals Online (AJOL)

    This study evaluated the nutrient composition of two cereal-based oral rehydration solutions, one made from millet and one from sorghum. The test solutions were each made from 50 g of millet or sorghum. The nutrient composition of the solutions was determined using proximate analysis. The result showed that the mothers were aware of the salt ...

  18. A theory of game trees, based on solution trees

    NARCIS (Netherlands)

    W.H.L.M. Pijls (Wim); A. de Bruin (Arie); A. Plaat (Aske)

    1996-01-01

    In this paper a complete theory of game tree algorithms is presented, entirely based upon the notion of a solution tree. Two types of solution trees are distinguished: max and min solution trees respectively. We show that most game tree algorithms construct a superposition of a max and a

  19. Basic semantic architecture of interoperability for the intelligent distribution in the CFE electrical system; Arquitectura base de interoperabilidad semantica para el sistema electrico de distribucion inteligente en la CFE

    Energy Technology Data Exchange (ETDEWEB)

    Espinosa Reza, Alfredo; Garcia Mendoza, Raul; Borja Diaz, Jesus Fidel; Sierra Rodriguez, Benjamin [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2010-07-01

    The physical and logical architecture of the interoperability platform defined for the distribution management systems (DMS) of the Distribution Subdivision of the Comision Federal de Electricidad (CFE) in Mexico is presented. The adopted architecture includes the definition of a technological platform to manage the exchange of information between systems and applications, founded on the Common Information Model (CIM) established in the IEC 61968 and IEC 61970 standards. The architecture based on SSOA (Semantic Services Oriented Architecture), EIB (Enterprise Integration Bus) and GID (Generic Interface Definition) is presented, as well as the sequence followed to achieve interoperability of the systems related to the management of electrical energy distribution in Mexico. The process of establishing a Semantic Model of the Electrical Distribution System (SED) and of creating CIM/XML instances is also described, oriented to the interoperability of the information systems in the DMS scope by means of the exchange of messages conformant with, and validated against, the structure established by the CIM model. In this way, the messages and information exchanged among systems are guaranteed to be compatible and correctly interpreted independently of the developer, brand or manufacturer of the source and destination systems. The primary objective is to establish the base semantic interoperability infrastructure, founded on standards, that supports the strategic definition of an Intelligent Electrical Distribution System (SEDI) in Mexico.

  20. Modular analytics management architecture for interoperability and decision support

    Science.gov (United States)

    Marotta, Stephen; Metzger, Max; Gorman, Joe; Sliva, Amy

    2016-05-01

    The Dual Node Decision Wheels (DNDW) architecture is a new approach to information fusion and decision support systems. By combining cognitive systems engineering organizational analysis tools, such as decision trees, with the Dual Node Network (DNN) technical architecture for information fusion, the DNDW can align relevant data and information products with an organization's decision-making processes. In this paper, we present the Compositional Inference and Machine Learning Environment (CIMLE), a prototype framework based on the principles of the DNDW architecture. CIMLE provides a flexible environment so heterogeneous data sources, messaging frameworks, and analytic processes can interoperate to provide the specific information required for situation understanding and decision making. It was designed to support the creation of modular, distributed solutions over large monolithic systems. With CIMLE, users can repurpose individual analytics to address evolving decision-making requirements or to adapt to new mission contexts; CIMLE's modular design simplifies integration with new host operating environments. CIMLE's configurable system design enables model developers to build analytical systems that closely align with organizational structures and processes and support the organization's information needs.
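
    A toy version of the "modular analytics" idea is sketched below (the analytic names and data fields are invented; this is not the CIMLE codebase): analytics are registered independently and composed per decision-making need, so individual modules can be repurposed as requirements evolve.

      # Illustrative modular analytics registry: each analytic is a small,
      # independently registered function, and pipelines are composed per decision
      # need rather than baked into one monolithic system.
      ANALYTICS = {}

      def analytic(name):
          def register(fn):
              ANALYTICS[name] = fn
              return fn
          return register

      @analytic("geolocate")
      def geolocate(report):
          return {**report, "region": "sector-7" if report["lat"] > 40 else "sector-2"}

      @analytic("threat_score")
      def threat_score(report):
          return {**report, "score": 0.9 if report.get("keyword") == "intrusion" else 0.1}

      def run_pipeline(report, steps):
          """Compose registered analytics in the order a given decision node needs."""
          for step in steps:
              report = ANALYTICS[step](report)
          return report

      print(run_pipeline({"lat": 42.1, "keyword": "intrusion"},
                         ["geolocate", "threat_score"]))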

  1. A semantic interoperability approach to support integration of gene expression and clinical data in breast cancer.

    Science.gov (United States)

    Alonso-Calvo, Raul; Paraiso-Medina, Sergio; Perez-Rey, David; Alonso-Oset, Enrique; van Stiphout, Ruud; Yu, Sheng; Taylor, Marian; Buffa, Francesca; Fernandez-Lozano, Carlos; Pazos, Alejandro; Maojo, Victor

    2017-08-01

    The introduction of omics data and advances in technologies involved in clinical treatment have led to a broad range of approaches to represent clinical information. Within this context, patient stratification across health institutions based on omics profiling presents a complex scenario for carrying out multi-center clinical trials. This paper presents a standards-based approach to ensure the semantic integration required to facilitate the analysis of clinico-genomic clinical trials. To ensure interoperability across different institutions, we have developed a Semantic Interoperability Layer (SIL) to facilitate homogeneous access to clinical and genetic information, based on different well-established biomedical standards and following Integrating the Healthcare Enterprise (IHE) recommendations. The SIL has shown suitability for integrating biomedical knowledge and technologies to match the latest clinical advances in healthcare and the use of genomic information. This genomic data integration in the SIL has been tested with a diagnostic classifier tool that takes advantage of harmonized multi-center clinico-genomic data for training statistical predictive models. The SIL has been adopted in national and international research initiatives, such as the EURECA-EU research project and the CIMED collaborative Spanish project, where the proposed solution has been applied and evaluated by clinical experts focused on clinico-genomic studies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. MIDST: Interoperability for Semantic Annotations

    Science.gov (United States)

    Atzeni, Paolo; Del Nostro, Pierluigi; Paolozzi, Stefano

    In recent years, interoperability of ontologies and databases has received a lot of attention. However, most of the work has concentrated on specific problems (such as storing an ontology in a database or making database data available to ontologies) and referred to specific models for each of the two. Here, we propose an approach that aims at being more general and model independent. In fact, it works for different dialects for ontologies and for various data models for databases. Also, it supports translations in both directions (ontologies to databases and vice versa) and it allows for flexibility in the translations, so that customization is possible. The proposal extends recent work for schema and data translation (the MIDST project, which implements the ModelGen operator proposed in model management), which relies on a metamodel approach, where data models and variations thereof are described in a common framework and translations are built as compositions of elementary ones.

  3. Improving conditions for reuse of design solutions - by means of a context based solution library

    DEFF Research Database (Denmark)

    Mortensen, Niels Henrik; Grothe-Møller, Thorkild; Andreasen, Mogens Myrup

    1997-01-01

    Among the most important reasoning mechanisms in design is reasoning by analogy. One precondition for being able to reason about the properties and functionalities of a product or subsystem is that the context of the solution is known. This paper presents a computer based solution library where...

  4. A buffer overflow detection based on inequalities solution

    International Nuclear Information System (INIS)

    Xu Guoai; Zhang Miao; Yang Yixian

    2007-01-01

    A new buffer overflow detection model based on solving systems of inequalities was designed. It builds on an analysis of the disadvantages of older buffer overflow detection techniques and on converting buffer overflow detection into an inequality-solving problem. The new model overcomes the disadvantages of the older techniques and improves the efficiency of buffer overflow detection. (authors)
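
    The core idea, checking whether the inequality "bytes written > buffer size" can be satisfied along some path, can be illustrated with a deliberately small constraint check; this is an editorial sketch, not the authors' model, and the write ranges are invented.

      # Inequality-based overflow checking sketch: each write is recorded as a
      # constraint "offset + length <= buffer_size"; if the maximum possible
      # left-hand side can exceed the buffer size, a potential overflow is reported.
      def check_writes(buffer_size, writes):
          """writes: list of (offset_range, length_range) with inclusive (lo, hi)."""
          findings = []
          for i, ((off_lo, off_hi), (len_lo, len_hi)) in enumerate(writes):
              worst_case = off_hi + len_hi          # maximise the inequality's LHS
              if worst_case > buffer_size:
                  findings.append(f"write #{i}: may reach byte {worst_case} "
                                  f"in a {buffer_size}-byte buffer")
          return findings

      # A 64-byte buffer: one safely bounded write, one attacker-influenced length.
      report = check_writes(64, [((0, 0), (1, 32)),     # copy of at most 32 bytes: safe
                                 ((0, 16), (1, 128))])  # offset and length both vary
      print("\n".join(report) or "no overflow possible")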

  5. Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.

    Science.gov (United States)

    Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert

    2017-08-01

    Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries and a sector of the food industry (fresh produce) have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good. © 2017 Institute of Food Technologists®.

  6. Balancing of Heterogeneity and Interoperability in E-Business Networks: The Role of Standards and Protocols

    OpenAIRE

    Frank-Dieter Dorloff; Ejub Kajan

    2012-01-01

    To reach this interoperability, visibility and common understanding must be ensured at all levels of the interoperability pyramid. This includes common agreements about the visions, political and legal restrictions, clear descriptions of the collaboration scenarios, the business processes and rules involved, the types and roles of the documents, a commonly understandable vocabulary, etc. To do this in an effective and automatable manner, ICT-based concepts, frameworks and models have to be defined...

  7. Federative approach of interoperability at the design/manufacturing interface using ontologies

    OpenAIRE

    Fortineau, Virginie; Paviot, Thomas; Lamouri, Samir

    2011-01-01

    In production enterprises, interoperability between the information systems in use is the key to a successful Product Lifecycle Management approach. Despite many research works, preserving the information flow along the product life cycle is still problematic because of existing scientific and technological locks. These locks are identified in this paper, and a new federative approach to interoperability is proposed, based upon the use of ontologies and semantic web tools.

  8. Interoperability for Entreprise Systems and Applications '12

    CERN Document Server

    Doumeingts, Guy; Katzy, Bernhard; Chalmeta, Ricardo

    2012-01-01

    Within a scenario of globalised markets, where the capacity to efficiently cooperate with other firms starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, it can be seen how the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected organisations or extended enterprises, as well as in mergers and acquisitions. Composed of over 40 papers, Enterprise Interoperability V ranges from academic research through case studies to industrial and administrative experience of interoperability. The international nature of the authorship continues to broaden. Many of the papers have examples and illustrations calculated to deepen understanding and generate new ideas. The I-ESA'12 Co...

  9. Requirements for Interoperability in Healthcare Information Systems

    Directory of Open Access Journals (Sweden)

    Rita Noumeir

    2012-01-01

    Full Text Available Interoperability is a requirement for the successful deployment of Electronic Health Records (EHR. EHR improves the quality of healthcare by enabling access to all relevant information at the diagnostic decision moment, regardless of location. It is a system that results from the cooperation of several heterogeneous distributed subsystems that need to successfully exchange information relative to a specific healthcare process. This paper analyzes interoperability impediments in healthcare by first defining them and providing concrete healthcare examples, followed by discussion of how specifications can be defined and how verification can be conducted to eliminate those impediments and ensure interoperability in healthcare. This paper also analyzes how Integrating the Healthcare Enterprise (IHE has been successful in enabling interoperability, and identifies some neglected aspects that need attention.

  10. Interoperability for Enterprise Systems and Applications

    CERN Document Server

    Jardim-Gonçalves, Ricardo; Popplewell, Keith; Mendonça, João

    2016-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VII will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Furthermore, it shows how knowledge of the meaning within information and the use to which it will be put have to be held in common between enterprises for consistent and efficient inter-enterprise networks. Over 30 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability show how, in a scenario of globalised markets, where the capacity to cooperate with other organizations efficiently is essential in order to remain economically, socially and environmentally cost-effective, the most innovative digitized and networked enterprises ensure that their systems and applications are able to interoperate across heterogeneous collabo...

  11. Measuring Systems Interoperability: Challenges and Opportunities

    National Research Council Canada - National Science Library

    Kasunic, Mark; Anderson, William

    2004-01-01

    Interoperability is the ability of systems, units, or forces to provide services to and accept services from other systems, units, or forces and to use the services exchanged to enable them to operate...

  12. RFID in Libraries: Standards and Interoperability

    OpenAIRE

    Hopkinson, Alan

    2007-01-01

    RFID needs standards to ensure interoperability so that systems can survive a change in library system and use RFID in inter-library lending between libraries with different systems. Efforts are under way to develop ISO standards to achieve this.

  13. Modeling Reusable and Interoperable Faceted Browsing Systems with Category Theory

    OpenAIRE

    Harris, Daniel R.

    2015-01-01

    Faceted browsing has become ubiquitous with modern digital libraries and online search engines, yet the process is still difficult to abstractly model in a manner that supports the development of interoperable and reusable interfaces. We propose category theory as a theoretical foundation for faceted browsing and demonstrate how the interactive process can be mathematically abstracted. Existing efforts in facet modeling are based upon set theory, formal concept analysis, and lightweight ontol...

  14. Foundations of reusable and interoperable facet models using category theory

    OpenAIRE

    Harris, Daniel R.

    2016-01-01

    Faceted browsing has become ubiquitous with modern digital libraries and online search engines, yet the process is still difficult to abstractly model in a manner that supports the development of interoperable and reusable interfaces. We propose category theory as a theoretical foundation for faceted browsing and demonstrate how the interactive process can be mathematically abstracted. Existing efforts in facet modeling are based upon set theory, formal concept analysis, and light-weight onto...

  15. Emergent Semantics Interoperability in Large-Scale Decentralized Information Systems

    CERN Document Server

    Cudré-Mauroux, Philippe

    2008-01-01

    Peer-to-peer systems are evolving with new information-system architectures, leading to the idea that the principles of decentralization and self-organization will offer new approaches in informatics, especially for systems that scale with the number of users or for which central authorities do not prevail. This book describes a new way of building global agreements (semantic interoperability) based only on decentralized, self-organizing interactions.

  16. Interoperability and Standardization of Intercloud Cloud Computing

    OpenAIRE

    Wang, Jingxin K.; Ding, Jianrui; Niu, Tian

    2012-01-01

    Cloud computing is maturing, but the interoperability and standardization of clouds remain to be solved. This paper discusses interoperability among clouds with respect to message transmission, data transmission and virtual machine transfer. Starting from the IEEE Pioneering Cloud Computing Initiative, it discusses the standardization of cloud computing, especially intercloud cloud computing, and also considers standardization from a market-oriented view.

  17. Grid interoperability: joining grid information systems

    International Nuclear Information System (INIS)

    Flechl, M; Field, L

    2008-01-01

    A grid is defined as being 'coordinated resource sharing and problem solving in dynamic, multi-institutional virtual organizations'. Over recent years a number of grid projects, many of which have a strong regional presence, have emerged to help coordinate institutions and enable grids. Today, we face a situation where a number of grid projects exist, most of which are using slightly different middleware. Grid interoperation is trying to bridge these differences and enable Virtual Organizations to access resources at the institutions independent of their grid project affiliation. Grid interoperation is usually a bilateral activity between two grid infrastructures. Recently within the Open Grid Forum, the Grid Interoperability Now (GIN) Community Group is trying to build upon these bilateral activities. The GIN group is a focal point where all the infrastructures can come together to share ideas and experiences on grid interoperation. It is hoped that each bilateral activity will bring us one step closer to the overall goal of a uniform grid landscape. A fundamental aspect of a grid is the information system, which is used to find available grid services. As different grids use different information systems, interoperation between these systems is crucial for grid interoperability. This paper describes the work carried out to overcome these differences between a number of grid projects and the experiences gained. It focuses on the different techniques used and highlights the important areas for future standardization

  18. Achieving interoperability for metadata registries using comparative object modeling.

    Science.gov (United States)

    Park, Yu Rang; Kim, Ju Han

    2010-01-01

    Achieving data interoperability between organizations relies upon an agreed meaning and representation (metadata) of data. For managing and registering metadata, many organizations have built metadata registries (MDRs) in various domains based on the international standard for the MDR framework, ISO/IEC 11179. Following this trend, two public MDRs in the biomedical domain have been created, the United States Health Information Knowledgebase (USHIK) and the cancer Data Standards Registry and Repository (caDSR), from the U.S. Department of Health & Human Services and the National Cancer Institute (NCI), respectively. Most MDRs are implemented with indiscriminate extensions to satisfy organization-specific needs and to work around semantic and structural limitations of ISO/IEC 11179. As a result, it is difficult to achieve interoperability among multiple MDRs. In this paper, we propose an integrated metadata object model for achieving interoperability among multiple MDRs. To evaluate this model, we developed an XML Schema Definition (XSD)-based metadata exchange format. We created an XSD-based metadata exporter, supporting both the integrated metadata object model and organization-specific MDR formats.
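
    The XSD-based exchange idea can be pictured as serialising registry entries into a common XML structure that each MDR exporter targets; the element names and data element contents below are invented for illustration and do not reproduce the ISO/IEC 11179, USHIK or caDSR schemas.

      # Sketch of exporting metadata registry entries to a shared XML exchange
      # format; a real exporter would target the agreed XSD so that different
      # registries emit comparable files.
      import xml.etree.ElementTree as ET

      elements = [
          {"id": "DE-001", "name": "patientBirthDate", "definition": "Date of birth",
           "value_domain": "ISO 8601 date"},
          {"id": "DE-002", "name": "tumorStage", "definition": "AJCC stage group",
           "value_domain": "enumerated: I, II, III, IV"},
      ]

      root = ET.Element("metadataExchange", {"registry": "exampleMDR"})
      for e in elements:
          de = ET.SubElement(root, "dataElement", {"identifier": e["id"]})
          ET.SubElement(de, "name").text = e["name"]
          ET.SubElement(de, "definition").text = e["definition"]
          ET.SubElement(de, "valueDomain").text = e["value_domain"]

      print(ET.tostring(root, encoding="unicode"))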

  19. ICD-11 (JLMMS) and SCT Inter-Operation.

    Science.gov (United States)

    Mamou, Marzouk; Rector, Alan; Schulz, Stefan; Campbell, James; Solbrig, Harold; Rodrigues, Jean-Marie

    2016-01-01

    The goal of this work is to contribute to a smooth and semantically sound inter-operability between the ICD-11 (International Classification of Diseases-11th revision Joint Linearization for Mortality, Morbidity and Statistics) and SNOMED CT (SCT). To guarantee such inter-operation between a classification, characterized by a single hierarchy of mutually exclusive and exhaustive classes, as is the JLMMS successor of ICD-10 on the one hand, and the multi-hierarchical, ontology-based clinical terminology SCT on the other hand, we use ontology axioms that logically express generalizable truths. This is expressed by the compositional grammar of SCT, together with queries on axioms of SCT. We test the feasibility of the method on the circulatory chapter of ICD-11 JLMMS and present limitations and results.

  20. Interoperable eHealth Platform for Personalized Smart Services

    DEFF Research Database (Denmark)

    Mihaylov, Mihail Rumenov; Mihovska, Albena Dimitrova; Kyriazakos, Sofoklis

    2015-01-01

    personalized context-aware applications to serve the user's needs. This paper proposes the use of advised sensing, context-aware and cloud-based lifestyle reasoning to design an innovative eHealth platform that supports highly personalized smart services to primary users. The architecture of the platform has...... been designed in accordance with the interoperability requirements and standards as proposed by ITU-T and Continua Alliance. In particular, we define the interface dependencies and functional requirements needed, to allow eCare and eHealth vendors to manufacture interoperable sensors, ambient and home...... networks, telehealth platforms, health support applications and software services. Finally, data mining techniques in relation to the proposed architecture are also proposed to enhance the overall AAL experience of the users....

  1. Interoperable Data Sharing for Diverse Scientific Disciplines

    Science.gov (United States)

    Hughes, John S.; Crichton, Daniel; Martinez, Santa; Law, Emily; Hardman, Sean

    2016-04-01

    For diverse scientific disciplines to interoperate, they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework using ontologies and ISO level archive and metadata registry reference models. This framework provides multi-level governance, evolves independently of implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation framework is populated through knowledge acquisition from discipline experts. It is also extended to meet specific discipline requirements. The result is a formalized and rigorous knowledge base that addresses data representation, integrity, provenance, context, quantity, and their relationships within the community. The contents of the knowledge base are translated and written to files in appropriate formats to configure system software and services, provide user documentation, validate ingested data, and support data analytics. This presentation will provide an overview of the framework, present the Planetary Data System's PDS4 as a use case that has been adopted by the international planetary science community, describe how the framework is being applied to other disciplines, and share some important lessons learned.

  2. On the feasibility of interoperable schemes in hand biometrics.

    Science.gov (United States)

    Morales, Aythami; González, Ester; Ferrer, Miguel A

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.

  3. On the Feasibility of Interoperable Schemes in Hand Biometrics

    Directory of Open Access Journals (Sweden)

    Miguel A. Ferrer

    2012-02-01

    Full Text Available Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.

  4. Semantic modeling and interoperability in product and process engineering a technology for engineering informatics

    CERN Document Server

    2013-01-01

    In the past decade, feature-based design and manufacturing has gained some momentum in various engineering domains to represent and reuse semantic patterns with effective applicability. However, the actual scope of feature application is still very limited. Semantic Modeling and Interoperability in Product and Process Engineering provides a systematic solution for the challenging engineering informatics field aiming at the enhancement of sustainable knowledge representation, implementation and reuse in an open and yet practically manageable scale. This semantic modeling technology supports uniform, multi-facet and multi-level collaborative system engineering with heterogeneous computer-aided tools, such as CAD/CAM, CAE, and ERP. The presented unified feature model can be applied to product and process representation, development, implementation and management. Practical case studies and test samples are provided to illustrate applications which can be implemented by the readers in real-world scenarios. ...

  5. ISAIA: Interoperable Systems for Archival Information Access

    Science.gov (United States)

    Hanisch, Robert J.

    2002-01-01

    The ISAIA project was originally proposed in 1999 as a successor to the informal AstroBrowse project. AstroBrowse, which provided a data location service for astronomical archives and catalogs, was a first step toward data system integration and interoperability. The goals of ISAIA were ambitious: '...To develop an interdisciplinary data location and integration service for space science. Building upon existing data services and communications protocols, this service will allow users to transparently query hundreds or thousands of WWW-based resources (catalogs, data, computational resources, bibliographic references, etc.) from a single interface. The service will collect responses from various resources and integrate them in a seamless fashion for display and manipulation by the user.' Funding was approved only for a one-year pilot study, a decision that in retrospect was wise given the rapid changes in information technology in the past few years and the emergence of the Virtual Observatory initiatives in the US and worldwide. Indeed, the ISAIA pilot study was influential in shaping the science goals, system design, metadata standards, and technology choices for the virtual observatory. The ISAIA pilot project also helped to cement working relationships among the NASA data centers, US ground-based observatories, and international data centers. The ISAIA project was formed as a collaborative effort between thirteen institutions that provided data to astronomers, space physicists, and planetary scientists. Among the fruits we ultimately hoped would come from this project would be a central site on the Web that any space scientist could use to efficiently locate existing data relevant to a particular scientific question. Furthermore, we hoped that the needed technology would be general enough that smaller, more-focused communities within space science could use the same technologies and standards to provide more specialized services. A major challenge to searching

  6. 77 FR 19575 - Promoting Interoperability in the 700 MHz Commercial Spectrum; Interoperability of Mobile User...

    Science.gov (United States)

    2012-04-02

    ...-1 specifies the spectrum emission limits for available channel bandwidths. Receiver blocking... Interoperability in the 700 MHz Commercial Spectrum; Interoperability of Mobile User Equipment Across Paired Commercial Spectrum Blocks in the 700 MHz Band. AGENCY: Federal Communications Commission. ACTION: Notice of...

  7. Market based solutions for power pricing

    International Nuclear Information System (INIS)

    Wangensteen, Ivar

    2002-06-01

    The report examines how the prices for power reserves, spot market power and regulating power are formed, provided ideal market conditions rule. Primarily, the price-determining factors in a market for power reserves are examined, together with the connection between this market and the energy market (the spot market). In a free market there would be a balance between what the actors may obtain by operating in the open market for power reserves/regulating power on the one hand and the market for spot power on the other. Primarily we suppose that the desired amount of power reserves is known. Secondly, the problem is extended to comprise the size of the reserves, i.e. the optimisation of the requirement for power reserves. The optimal amount of power reserves is obtained when there is a balance between cost and benefit. This optimal balance is achieved when the expected macroeconomic loss due to outages balances against the cost of maintaining larger reserves. By using a simple model it is demonstrated that if the system operator caps the maximal price in the regulating power market at the rationing price, the actors will offer sufficient reserves even if the reserve price is zero (provided risk neutrality). If the maximal price for regulating power is set lower, the price of power reserves will rise. Based on the same simple model, calculations are made of how the short- and long-term market balance will develop for increasing demand.

  8. Scalability and interoperability within glideinWMS

    International Nuclear Information System (INIS)

    Bradley, D.; Sfiligoi, I.; Padhi, S.; Frey, J.; Tannenbaum, T.

    2010-01-01

    Physicists have access to thousands of CPUs in grid federations such as OSG and EGEE. With the start-up of the LHC, it is essential for individuals or groups of users to wrap together available resources from multiple sites across multiple grids under a higher user-controlled layer in order to provide a homogeneous pool of available resources. One such system is glideinWMS, which is based on the Condor batch system. A general discussion of glideinWMS can be found elsewhere. Here, we focus on recent advances in extending its reach: scalability and integration of heterogeneous compute elements. We demonstrate that the new developments exceed the design goal of over 10,000 simultaneous running jobs under a single Condor schedd, using strong security protocols across global networks, and sustaining a steady-state job completion rate of a few Hz. We also show interoperability across heterogeneous computing elements achieved using client-side methods. We discuss this technique and the challenges in direct access to NorduGrid and CREAM compute elements, in addition to Globus based systems.

  9. Enterprise interoperability VI : Interoperability for Agility, Resilience and Plasticity of Collaboration

    CERN Document Server

    Bénaben, Frédérick; Poler, Raúl; Bourrières, Jean-Paul

    2014-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VI will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Over 40 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability show how, in a scenario of globalised markets, where the capacity to cooperate with other firms efficiently starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected...

  10. A novel solution configuration on liquid-based endometrial cytology.

    Directory of Open Access Journals (Sweden)

    Shulan Lv

    Full Text Available Early detection and diagnosis of endometrial carcinoma and its precancerous changes is undoubtedly one of the most compelling goals for researchers. With the emergence of endometrial brush samplers, a new upsurge in endometrial cytology is in the making. However, endometrial specimens obtained with brush samplers require a special preservation solution. The objective of this study was to develop a new kind of endometrial-cell preservation solution and to test its performance against a patented liquid-based cell preservation solution. In this controlled study, 5 endometrial cases were collected with the Li Brush at the First Affiliated Hospital of Xi'an Jiaotong University (09/2016 to 12/2016). Samples from each case were collected twice and preserved in different preservation solutions: one a novel endometrial-cell preservation solution and the other a patented liquid-based cell (LBC) preservation solution. The endometrial cells were smeared on slides using the ZP-C automated slide preparation system and stained with Papanicolaou stain. A semi-quantitative scoring system was used to analyze the quality of the slides. Statistical analysis was performed using the Wilcoxon signed rank test in SPSS 18.0. In all LBC preparations, endometrial cells from the novel preservation solution showed greater cell quantity, fewer red blood cell fragments, and a cleaner background compared with the control group, although the differences in cellularity and in the absence of blood and debris were not statistically significant (p = 0.063 and 0.102, respectively). The preservation period of the two solutions was equivalent. The novel endometrial-cell preservation solution is superior to the liquid-based cell preservation solution for cervical cells, with a clear background, diagnostic cells and low cost.

  11. Rollout Strategy to Implement Interoperable Traceability in the Seafood Industry.

    Science.gov (United States)

    Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert; Cusack, Christopher

    2017-08-01

    Verifying the accuracy and rigor of data exchanged within and between businesses for the purposes of traceability rests on the existence of effective and efficient interoperable information systems that meet users' needs. Interoperability, particularly given the complexities intrinsic to the seafood industry, requires that the systems used by businesses operating along the supply chain share a common technology architecture that is robust, resilient, and evolves as industry needs change. Technology architectures are developed through engaging industry stakeholders in understanding why an architecture is required, the benefits provided to the industry and individual businesses and supply chains, and how the architecture will translate into practical results. This article begins by reiterating the benefits that the global seafood industry can capture by implementing interoperable chain-length traceability and the reason for basing the architecture on a peer-to-peer networked database concept versus more traditional centralized or linear approaches. A summary of capabilities that already exist within the seafood industry that the proposed architecture uses is discussed; and a strategy for implementing the architecture is presented. The 6-step strategy is presented in the form of a critical path. © 2017 Institute of Food Technologists®.

  12. Alcohol-based solutions for bovine testicular tissue fixation.

    Science.gov (United States)

    Cabrera, Nelson C; Espinoza, Jorge R; Vargas-Jentzsch, Paul; Sandoval, Patricio; Ramos, Luis A; Aponte, Pedro M

    2017-01-01

    Tissue fixation, a central element in histotechnology, is currently performed with chemical compounds potentially harmful for human health and the environment. Therefore, alternative fixatives are being developed, including alcohol-based solutions. We evaluated several ethanol-based mixtures with additives to study fixative penetration rate, tissue volume changes, and morphologic effects in the bovine testis. Fixatives used were Bouin solution, 4% formaldehyde (F4), 70% ethanol (E70), E70 with 1.5% glycerol (E70G), E70 with 5% acetic acid (E70A), E70 with 1.5% glycerol and 5% acetic acid (E70AG), and E70 with 1.5% glycerol, 5% acetic acid, and 1% dimethyl sulfoxide (DMSO; E70AGD). Five-millimeter bovine testicular tissue cubes could be completely penetrated by ethanol-based fixatives and Bouin solution in 2-3 h, whereas F4 required 21 h. Bouin solution produced general tissue shrinkage, whereas the other fixatives (alcohol-based and F4) caused tissue volume expansion. Although Bouin solution is an excellent fixative for testicular tissue, ethanol-based fixatives showed good penetration rates, low tissue shrinkage, and preserved sufficient morphology to allow identification of the stages of the seminiferous epithelium cycle, therefore representing a valid alternative for histotechnology laboratories. Common additives such as acetic acid, glycerol, and DMSO offered marginal benefits for the process of fixation; E70AG showed the best preservation of morphology with excellent nuclear detail, close to that of Bouin solution.

  13. Photochemical properties of Ysub(t) base in aqueous solution

    International Nuclear Information System (INIS)

    Paszyc, S.; Rafalska, M.

    1979-01-01

    Photoreactivity of the Yt base (I) has been studied in aqueous solution (pH 6) saturated with oxygen. Two photoproducts (II, III), resulting from irradiation at λ = 253.7 nm and λ ≥ 290 nm, were isolated and their structures determined. The quantum yield for Yt base disappearance (ρ_dis) is 0.002 (λ = 313 nm). It was shown that dye-sensitised photo-oxidation of the Yt base in aqueous solution occurs according to a Type I mechanism as well as with participation of singlet-state oxygen. Quantum yields, fluorescence decay times and phosphorescence of the Yt base have also been determined. (author)

  14. A step-by-step methodology for enterprise interoperability projects

    Science.gov (United States)

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

    Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to support enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.

  15. Performance of Fly ash Based Geopolymer Mortars in Sulphate Solution

    Directory of Open Access Journals (Sweden)

    P. Ghosh

    2010-01-01

    Full Text Available An experimental investigation was conducted to study the performance of fly ash based geopolymer mortar specimens in Magnesium Sulphate solution. Specimens were manufactured from low calcium fly ash by activation with a mixture of Sodium Hydroxide and Sodium Silicate solution and cured thermally. A 10% by weight Magnesium Sulphate solution was used to soak the specimens for up to 24 weeks. Performance of the specimens was evaluated in terms of visual appearance, variation of pH of the solution, change in weight, and change in compressive strength over the exposure period. White deposits occurred on the surface of the specimens, which were initially soft but later converted to hard crystals. The pH of the solution increased noticeably during the initial weeks, which indicates migration of alkalis from the mortar specimens. At the end of 24 weeks the samples had experienced very little weight gain and recorded a loss of compressive strength of up to 56%.

  16. Redox flow batteries based on supporting solutions containing chloride

    Energy Technology Data Exchange (ETDEWEB)

    Li, Liyu; Kim, Soowhan; Yang, Zhenguo; Wang, Wei; Nie, Zimin; Chen, Baowei; Zhang, Jianlu; Xia, Guanguang

    2017-11-14

    Redox flow battery systems having a supporting solution that contains Cl⁻ ions can exhibit improved performance and characteristics. Furthermore, a supporting solution having mixed SO₄²⁻ and Cl⁻ ions can provide increased energy density and improved stability and solubility of one or more of the ionic species in the catholyte and/or anolyte. According to one example, a vanadium-based redox flow battery system is characterized by an anolyte having V²⁺ and V³⁺ in a supporting solution and a catholyte having V⁴⁺ and V⁵⁺ in a supporting solution. The supporting solution can contain Cl⁻ ions or a mixture of SO₄²⁻ and Cl⁻ ions.

  17. Using software interoperability to achieve a virtual design environment

    Science.gov (United States)

    Gregory, G. Groot; Koshel, R. John

    2005-09-01

    A variety of simulation tools, including optical design and analysis, have benefited by many years of evolution in software functionality and computing power, thus making the notion of virtual design environments a reality. To simulate the optical characteristics of a system, one needs to include optical performance, mechanical design and manufacturing aspects simultaneously. To date, no single software program offers a universal solution. One approach to achieve an integrated environment is to select tools that offer a high degree of interoperability. This allows the selection of the best tools for each aspect of the design working in concert to solve the problem. This paper discusses the issues of how to assemble a design environment and provides an example of a combination of tools for illumination design. We begin by offering a broad definition of interoperability from an optical analysis perspective. This definition includes aspects of file interchange formats, software communications protocols and customized applications. One example solution is proposed by combining SolidWorks for computer-aided design (CAD), TracePro for optical analysis and MATLAB as the mathematical engine for tolerance analysis. The resulting virtual tool will be applied to a lightpipe design task to illustrate how such a system can be used.

  18. Modeling Interoperable Information Systems with 3LGM² and IHE.

    Science.gov (United States)

    Stäubert, S; Schaaf, M; Jahn, F; Brandner, R; Winter, A

    2015-01-01

    Strategic planning of information systems (IS) in healthcare requires descriptions of the current and the future IS state. Enterprise architecture planning (EAP) tools like the 3LGM² tool help to build up and to analyze IS models. A model of the planned architecture can be derived from an analysis of current-state IS models. Building an interoperable IS, i.e. an IS consisting of interoperable components, can be considered a relevant strategic information management goal for many IS in healthcare. Integrating the Healthcare Enterprise (IHE) is an initiative which targets interoperability by using established standards. The objectives of this work are to link IHE concepts to 3LGM² concepts within the 3LGM² tool; to describe how an information manager can be supported in handling the complex IHE world and planning interoperable IS using 3LGM² models; and to describe how developers or maintainers of IHE profiles can be supported by the representation of IHE concepts in 3LGM². Conceptualization and concept mapping methods are used to assign IHE concepts such as domains, integration profiles, actors and transactions to the concepts of the three-layer graph-based meta-model (3LGM²). IHE concepts were successfully linked to 3LGM² concepts. An IHE-master-model, i.e. an abstract model for IHE concepts, was modeled with the help of the 3LGM² tool. Two IHE domains were modeled in detail (ITI, QRPH). We describe two use cases for the representation of IHE concepts and IHE domains as 3LGM² models. Information managers can use the IHE-master-model as a reference model for modeling interoperable IS based on IHE profiles during EAP activities. IHE developers are supported in analyzing the consistency of IHE concepts with the help of the IHE-master-model and functions of the 3LGM² tool. The complex relations between IHE concepts can be modeled by using the EAP method 3LGM². The 3LGM² tool offers visualization and analysis features which are now available for the IHE-master-model. Thus information managers and IHE
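
    As a hedged sketch of the kind of concept mapping described above (IHE domains, integration profiles, actors and transactions assigned to concepts on the three 3LGM² layers), the snippet below uses plain Python data structures; the layer names and the single example entry are illustrative assumptions, not an export from the 3LGM² tool or the IHE technical frameworks.

```python
# Illustrative sketch of a concept mapping between IHE concepts and 3LGM2 layers.
# The structures and the example entry are assumptions for illustration only; they
# are not exported from the 3LGM2 tool or the IHE technical frameworks.
from dataclasses import dataclass

@dataclass(frozen=True)
class IHEConcept:
    domain: str            # e.g. an IHE domain such as "ITI"
    profile: str           # integration profile
    actor: str             # actor within the profile
    transaction: str       # transaction identifier

# Target layers of the three-layer graph-based meta-model (3LGM2).
LAYERS = ("domain", "logical_tool", "physical_tool")

mapping = {
    IHEConcept("ITI", "XDS.b", "Document Registry", "ITI-42"): {
        "domain": "document management",
        "logical_tool": "registry application component",
        "physical_tool": "registry server",
    },
}

def layer_assignment(concept: IHEConcept, layer: str) -> str:
    assert layer in LAYERS
    return mapping[concept][layer]

if __name__ == "__main__":
    c = next(iter(mapping))
    print(c.actor, "->", layer_assignment(c, "logical_tool"))
```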

  19. Rewilding as nature based solution in land management

    Science.gov (United States)

    Novara, Agata; Gristina, Luciano; Keesstra, Saskia; Pereira, Paulo; Cerda, Artemio

    2017-04-01

    Rewilding is an effective tool of ecological restoration and a nature-based solution for hydro-meteorological risk control. Rewilding contributes to reducing flood risk and resisting droughts, helps to restore soil organic matter content, increases soil and plant biodiversity, and improves overall ecosystem and human health. The key element of rewilding is not controlling nature but following natural processes to restore the key soil ecological factors and their connectivity. Rewilding is applicable at different ecosystem stages, from natural reserves to more anthropogenic systems such as agricultural land, through the restoration of wild soil function via permaculture or forest farming. The proposed nature-based solution not only avoids the investment in traditional engineering but also creates opportunities for new economic models based on wild nature (ecotourism, education, wild edible plants). This work is a review of applied rewilding actions; considerations on future applications of nature-based solutions are also discussed.

  20. Ionisation constants of inorganic acids and bases in aqueous solution

    CERN Document Server

    Perrin, D D

    2013-01-01

    Ionisation Constants of Inorganic Acids and Bases in Aqueous Solution, Second Edition provides a compilation of tables that summarize relevant data recorded in the literature up to the end of 1980 for the ionization constants of inorganic acids and bases in aqueous solution. This book includes references to acidity functions for strong acids and bases, as well as details about the formation of polynuclear species. This text then explains the details of each column of the tables, wherein column 1 gives the name of the substance and the negative logarithm of the ionization constant and column 2
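
    For orientation, the quantity tabulated as the negative logarithm of the ionization constant is the familiar pKa (correspondingly pKb for bases); the standard definitions, given here as general background rather than material from the compilation itself, are:

```latex
% Standard definitions underlying the tabulated values (general background).
\[
  K_a = \frac{[\mathrm{A^-}][\mathrm{H^+}]}{[\mathrm{HA}]}, \qquad
  \mathrm{p}K_a = -\log_{10} K_a, \qquad
  \mathrm{p}K_a + \mathrm{p}K_b = \mathrm{p}K_w \approx 14 \ \text{(conjugate pair, } 25\,^{\circ}\mathrm{C}\text{)}.
\]
```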

  1. Family Based Services: A Solution-Focused Approach.

    Science.gov (United States)

    Berg, Insoo Kim

    Drawing on the field of family therapy, this step-by-step guide applies principles of brief, solution-focused therapy to family-based services (FBS) in ways that empower clients, increase cooperation, and aid the survival of social workers. Based on the author's experience at the Brief Family Therapy Center in Milwaukee, Wisconsin, the book is…

  2. Benefit quantification of interoperability in coordinate metrology

    DEFF Research Database (Denmark)

    Savio, E.; Carmignato, S.; De Chiffre, Leonardo

    2014-01-01

    One of the factors contributing to limited reproducibility of coordinate measurements is the use of different inspection software. Time-consuming efforts for translation of part programmes are sometimes needed, and interoperability of inspection equipment has the potential to reduce these inefficiencies. The paper presents a methodology for an economic evaluation of interoperability benefits with respect to the verification of geometrical product specifications. It requires input data from testing and inspection activities, as well as information on training of personnel and licensing of software...

  3. An Open Source Tool to Test Interoperability

    Science.gov (United States)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering of the raw observed data to accessing portrayed, processed, quality-controlled data. Geoinformatics tools help scientists in the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of the interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and managing of errors. Testing of these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. By following standards, interoperability between components increases while reducing the time needed to develop new software. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open source facility, available at SourceForge, that can be run via the command line, deployed in a web servlet container or integrated in a developer's environment via Maven. The TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against Schemas and Schematron-based assertions of any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. Some of these assertions include conformance of HTTP responses, conformance of GML-encoded data, proper values for elements and attributes in the XML, and correct error responses. This presentation will provide an overview of TEAM Engine, an introduction of how to test via the OGC testing web site and
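
    TEAM Engine itself executes CTL and TestNG test suites; purely as a hedged illustration of the kind of assertion such a suite automates, the sketch below issues a WFS 1.0.0 GetCapabilities request and checks the root element of the response (the endpoint URL is a placeholder, and this is not TEAM Engine or CTL code).

```python
# Minimal illustration of one interoperability assertion of the kind TEAM Engine
# automates: request WFS 1.0.0 capabilities and verify the response root element.
# This is not TEAM Engine or CTL; the endpoint URL is a placeholder.
import requests
import xml.etree.ElementTree as ET

ENDPOINT = "https://example.org/wfs"   # placeholder service under test

def check_wfs_capabilities(endpoint: str) -> bool:
    params = {"service": "WFS", "version": "1.0.0", "request": "GetCapabilities"}
    resp = requests.get(endpoint, params=params, timeout=30)
    if resp.status_code != 200:
        return False
    root = ET.fromstring(resp.content)
    # WFS 1.0.0 capabilities documents use the WFS_Capabilities root element.
    return root.tag.endswith("WFS_Capabilities")

if __name__ == "__main__":
    print("conformant root element:", check_wfs_capabilities(ENDPOINT))
```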

  4. Connectivity, interoperability and manageability challenges in internet of things

    Science.gov (United States)

    Haseeb, Shariq; Hashim, Aisha Hassan A.; Khalifa, Othman O.; Ismail, Ahmad Faris

    2017-09-01

    The vision of the Internet of Things (IoT) is about interconnectivity between sensors, actuators, people and processes. IoT exploits connectivity between physical objects like fridges, cars, utilities, buildings and cities to enhance the lives of people through automation and data analytics. However, this sudden increase in connected heterogeneous IoT devices takes a huge toll on the existing Internet infrastructure and introduces new challenges for researchers to embark upon. This paper highlights the effects of these heterogeneity challenges on connectivity, interoperability and manageability in greater detail. It also surveys some of the existing solutions adopted in the core network to solve the challenges of massive IoT deployment. The paper finally concludes that the IoT architecture and network infrastructure need to be re-engineered from the ground up, so that IoT solutions can be safely and efficiently deployed.

  5. The MED-SUV Multidisciplinary Interoperability Infrastructure

    Science.gov (United States)

    Mazzetti, Paolo; D'Auria, Luca; Reitano, Danilo; Papeschi, Fabrizio; Roncella, Roberto; Puglisi, Giuseppe; Nativi, Stefano

    2016-04-01

    the layer above. In order to address data and service heterogeneity, the MED-SUV infrastructure is based on the brokered architecture approach, implemented using the GI-suite Brokering Framework for discovery and access. The GI-Suite Brokering Framework has been extended and configured to broker all the identified relevant data sources. It is also able to publish data according to several de-jure and de-facto standards including OGC CSW and OpenSearch, facilitating the interconnection with external systems. At the Global level, MED-SUV identified the interconnection with GEOSS as the main requirement. Since the MED-SUV Supersite level is implemented based on the same technology adopted in the current GEOSS Common Infrastructure (GCI) by the GEO Discovery and Access Broker (GEO DAB), no major interoperability problem is foreseen. The MED-SUV Multidisciplinary Interoperability Infrastructure is complemented by a user portal providing human-to-machine interaction, and enabling data discovery and access. The GI-Suite Brokering Framework APIs and JavaScript library support machine-to-machine interaction, enabling the creation of mobile and Web applications using information available through the MED-SUV Supersite.

  6. Vocabulary services to support scientific data interoperability

    Science.gov (United States)

    Cox, Simon; Mills, Katie; Tan, Florence

    2013-04-01

    Shared vocabularies are a core element in interoperable systems. Vocabularies need to be available at run-time, and where the vocabularies are shared by a distributed community this implies the use of web technology to provide vocabulary services. Given the ubiquity of vocabularies or classifiers in systems, vocabulary services are effectively the base of the interoperability stack. In contemporary knowledge organization systems, a vocabulary item is considered a concept, with the "terms" denoting it appearing as labels. The Simple Knowledge Organization System (SKOS) formalizes this as an RDF Schema (RDFS) application, with a bridge to formal logic in the Web Ontology Language (OWL). For maximum utility, a vocabulary should be made available through the following interfaces: * the vocabulary as a whole - at an ontology URI corresponding to a vocabulary document * each item in the vocabulary - at the item URI * summaries, subsets, and resources derived by transformation * through the standard RDF web API - i.e. a SPARQL endpoint * through a query form for human users. However, the vocabulary data model may be leveraged directly in a standard vocabulary API that uses the semantics provided by SKOS. SISSvoc3 [1] accomplishes this as a standard set of URI templates for a vocabulary. Any URI conforming to the template selects a vocabulary subset based on the SKOS properties, including labels (skos:prefLabel, skos:altLabel, rdfs:label) and a subset of the semantic relations (skos:broader, skos:narrower, etc). SISSvoc3 thus provides a RESTful SKOS API to query a vocabulary, while hiding the complexity of SPARQL. It has been implemented using the Linked Data API (LDA) [2], which connects to a SPARQL endpoint. By using LDA, we also get content negotiation, alternative views, paging, metadata and other functionality provided in a standard way. A number of vocabularies have been formalized in SKOS and deployed by CSIRO, the Australian Bureau of Meteorology (BOM) and their
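
    As a hedged illustration of the kind of SKOS lookup that a SISSvoc-style service wraps behind its URI templates, the sketch below issues the equivalent SPARQL query directly against an endpoint using the SPARQLWrapper library; the endpoint URL and the search string are placeholders, not actual CSIRO or BOM deployments.

```python
# Hedged sketch: the SKOS lookup a SISSvoc-style URI template resolves to, issued
# here directly against a SPARQL endpoint. Endpoint URL and search term are
# placeholders, not real deployments.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://example.org/sparql"   # placeholder vocabulary endpoint

QUERY = """
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?concept ?label WHERE {
  ?concept a skos:Concept ;
           skos:prefLabel ?label .
  FILTER(CONTAINS(LCASE(STR(?label)), "temperature"))
}
LIMIT 10
"""

def find_concepts(endpoint: str, query: str):
    client = SPARQLWrapper(endpoint)
    client.setQuery(query)
    client.setReturnFormat(JSON)
    results = client.query().convert()
    return [(b["concept"]["value"], b["label"]["value"])
            for b in results["results"]["bindings"]]

if __name__ == "__main__":
    for uri, label in find_concepts(ENDPOINT, QUERY):
        print(label, uri)
```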

  7. Aragonite coating solutions (ACS) based on artificial seawater

    Science.gov (United States)

    Tas, A. Cuneyt

    2015-03-01

    Aragonite (CaCO3, calcium carbonate) is an abundant biomaterial of marine life. It is the dominant inorganic phase of coral reefs, mollusc bivalve shells and the stalactites or stalagmites of geological sediments. Inorganic and initially precipitate-free aragonite coating solutions (ACS) of pH 7.4 were developed in this study to deposit monolayers of aragonite spherules or ooids on biomaterial (e.g., UHMWPE, ultrahigh molecular weight polyethylene) surfaces soaked in ACS at 30 °C. The ACS solutions of this study have been developed for the surface engineering of synthetic biomaterials. The abiotic ACS solutions, enriched with calcium and bicarbonate ions at different concentrations, essentially mimicked the artificial seawater composition and started to deposit aragonite after a long (4 h) incubation period at the tropical sea surface temperature of 30 °C. While numerous techniques for the solution deposition of calcium hydroxyapatite (Ca10(PO4)6(OH)2), of low thermodynamic solubility, on synthetic biomaterials have been demonstrated, procedures related to the solution-based surface deposition of high solubility aragonite remained uncommon. Monolayers of aragonite ooids deposited at 30 °C on UHMWPE substrates soaked in organic-free ACS solutions were found to possess nano-structures similar to the mortar-and-brick-type botryoids observed in biogenic marine shells. Samples were characterized using SEM, XRD, FTIR, ICP-AES and contact angle goniometry.

  8. Meeting the New Challenges of International Interoperability

    Science.gov (United States)

    2009-06-01

    ...achieved by issuing common equipment to partners • Interoperability is gained by continuously working to tie cultural, procedural, and technical and policy ... that attribute to improve information and knowledge by collective processes and cross-fertilization • Interoperability… it's Not Just for Geeks • Reasons

  9. Equipping the enterprise interoperability problem solver

    NARCIS (Netherlands)

    Oude Luttighuis, P.; Folmer, E.J.A.

    2011-01-01

    The maturity of the enterprise interoperability field does not match the importance attached to it by many, both in the public as well as the private community. A host of models, paradigms, designs, standards, methods, and instruments seems to be available, but many of them are only used in rather

  10. Equipping the Enterprise Interoperability Problem Solver

    NARCIS (Netherlands)

    Oude Luttighuis, Paul; Folmer, Erwin Johan Albert; Charalabidis, Yannis

    2010-01-01

    The maturity of the enterprise interoperability field does not match the importance attached to it by many, both in the public as well as the private community. A host of models, paradigms, designs, standards, methods, and instruments seems to be available, but many of them are only used in rather

  11. Semantic Service Modeling: Enabling System Interoperability.

    NARCIS (Netherlands)

    Pokraev, S.; Quartel, Dick; Steen, Maarten W.A.; Reichert, M.U.

    2006-01-01

    Interoperability is the capability of different systems to use each other’s services effectively. It is about sharing functionality and information between systems at different levels, e.g., between physical devices, software applications, business units within one organization, or between different

  12. An Interoperable Security Framework for Connected Healthcare

    NARCIS (Netherlands)

    Asim, M.; Petkovic, M.; Qu, M.; Wang, C.

    2011-01-01

    A connected and interoperable healthcare system promises to reduce the cost of healthcare delivery, increase its efficiency and enable consumers to better engage with clinicians and manage their care. However, at the same time it introduces new risks towards security and privacy of personal health

  13. Smart Grid Interoperability Maturity Model Beta Version

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Drummond, R.; Giroti, Tony; Houseman, Doug; Knight, Mark; Levinson, Alex; longcore, Wayne; Lowe, Randy; Mater, J.; Oliver, Terry V.; Slack, Phil; Tolk, Andreas; Montgomery, Austin

    2011-12-02

    The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.

  14. Aragonite coating solutions (ACS) based on artificial seawater

    Energy Technology Data Exchange (ETDEWEB)

    Tas, A. Cuneyt, E-mail: c_tas@hotmail.com

    2015-03-01

    Graphical abstract: - Highlights: • Developed completely inorganic solutions for the deposition of monolayers of aragonite spherules (or ooids). • Solutions mimicked the artificial seawater. • Biomimetic crystallization was performed at the tropical sea surface temperature of 30 °C. - Abstract: Aragonite (CaCO₃, calcium carbonate) is an abundant biomaterial of marine life. It is the dominant inorganic phase of coral reefs, mollusc bivalve shells and the stalactites or stalagmites of geological sediments. Inorganic and initially precipitate-free aragonite coating solutions (ACS) of pH 7.4 were developed in this study to deposit monolayers of aragonite spherules or ooids on biomaterial (e.g., UHMWPE, ultrahigh molecular weight polyethylene) surfaces soaked in ACS at 30 °C. The ACS solutions of this study have been developed for the surface engineering of synthetic biomaterials. The abiotic ACS solutions, enriched with calcium and bicarbonate ions at different concentrations, essentially mimicked the artificial seawater composition and started to deposit aragonite after a long (4 h) incubation period at the tropical sea surface temperature of 30 °C. While numerous techniques for the solution deposition of calcium hydroxyapatite (Ca₁₀(PO₄)₆(OH)₂), of low thermodynamic solubility, on synthetic biomaterials have been demonstrated, procedures related to the solution-based surface deposition of high-solubility aragonite remained uncommon. Monolayers of aragonite ooids deposited at 30 °C on UHMWPE substrates soaked in organic-free ACS solutions were found to possess nano-structures similar to the mortar-and-brick-type botryoids observed in biogenic marine shells. Samples were characterized using SEM, XRD, FTIR, ICP-AES and contact angle goniometry.

  15. Aragonite coating solutions (ACS) based on artificial seawater

    International Nuclear Information System (INIS)

    Tas, A. Cuneyt

    2015-01-01

    Graphical abstract: - Highlights: • Developed completely inorganic solutions for the deposition of monolayers of aragonite spherules (or ooids). • Solutions mimicked the artificial seawater. • Biomimetic crystallization was performed at the tropical sea surface temperature of 30 °C. - Abstract: Aragonite (CaCO₃, calcium carbonate) is an abundant biomaterial of marine life. It is the dominant inorganic phase of coral reefs, mollusc bivalve shells and the stalactites or stalagmites of geological sediments. Inorganic and initially precipitate-free aragonite coating solutions (ACS) of pH 7.4 were developed in this study to deposit monolayers of aragonite spherules or ooids on biomaterial (e.g., UHMWPE, ultrahigh molecular weight polyethylene) surfaces soaked in ACS at 30 °C. The ACS solutions of this study have been developed for the surface engineering of synthetic biomaterials. The abiotic ACS solutions, enriched with calcium and bicarbonate ions at different concentrations, essentially mimicked the artificial seawater composition and started to deposit aragonite after a long (4 h) incubation period at the tropical sea surface temperature of 30 °C. While numerous techniques for the solution deposition of calcium hydroxyapatite (Ca₁₀(PO₄)₆(OH)₂), of low thermodynamic solubility, on synthetic biomaterials have been demonstrated, procedures related to the solution-based surface deposition of high-solubility aragonite remained uncommon. Monolayers of aragonite ooids deposited at 30 °C on UHMWPE substrates soaked in organic-free ACS solutions were found to possess nano-structures similar to the mortar-and-brick-type botryoids observed in biogenic marine shells. Samples were characterized using SEM, XRD, FTIR, ICP-AES and contact angle goniometry.

  16. Aerosol hygroscopic growth parameterization based on a solute specific coefficient

    Science.gov (United States)

    Metzger, S.; Steil, B.; Xu, L.; Penner, J. E.; Lelieveld, J.

    2011-09-01

    Water is a main component of atmospheric aerosols and its amount depends on the particle chemical composition. We introduce a new parameterization for the aerosol hygroscopic growth factor (HGF), based on an empirical relation between water activity (aw) and solute molality (μs) through a single solute-specific coefficient νi. Three main advantages are: (1) wide applicability, (2) simplicity and (3) analytical nature. (1) Our approach considers the Kelvin effect and covers ideal solutions at large relative humidity (RH), including CCN activation, as well as concentrated solutions with high ionic strength at low RH such as the relative humidity of deliquescence (RHD). (2) A single νi coefficient suffices to parameterize the HGF for a wide range of particle sizes, from nanometer nucleation mode to micrometer coarse mode particles. (3) In contrast to previous methods, our analytical aw parameterization depends not only on a linear correction factor for the solute molality; instead, νi also appears in the exponent, in the form x·a^x. According to our findings, νi can be assumed constant for the entire aw range (0-1). Thus, the νi-based method is computationally efficient. In this work we focus on single-solute solutions, where νi is pre-determined with the bisection method from our analytical equations using RHD measurements and the saturation molality μs,sat. The computed aerosol HGF and supersaturation (Köhler theory) compare well with the results of the thermodynamic reference model E-AIM for the key compounds NaCl and (NH₄)₂SO₄ relevant for CCN modeling and calibration studies. The equations introduced here provide the basis of our revised gas-liquid-solid partitioning model, i.e. version 4 of the EQuilibrium Simplified Aerosol Model (EQSAM4), described in a companion paper.
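
    For orientation only (the νi relation itself is given in the cited paper), the standard quantities that such a parameterization targets are the hygroscopic growth factor and the Köhler saturation ratio, conventionally written as:

```latex
% General background (not the paper's specific nu_i relation): the growth factor
% and the Koehler saturation ratio that the parameterization is designed to reproduce.
\[
  \mathrm{HGF}(\mathrm{RH}) = \frac{D_{\mathrm{wet}}(\mathrm{RH})}{D_{\mathrm{dry}}},
  \qquad
  S = a_w \, \exp\!\left( \frac{4\,\sigma_{\mathrm{s}}\,M_w}{R\,T\,\rho_w\,D_{\mathrm{wet}}} \right),
\]
where $a_w$ is the water activity, $\sigma_{\mathrm{s}}$ the surface tension of the solution droplet,
$M_w$ and $\rho_w$ the molar mass and density of water, and $D_{\mathrm{wet}}$ the droplet diameter.
```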

  17. IT-Based Solutions to the Electoral System in Nigeria

    African Journals Online (AJOL)

    West African Journal of Industrial and Academic Research, Vol. 5, No. 1, December 2012, p. 127. IT-Based Solutions to the Electoral System in Nigeria ... be condemned in totality because it does no one any good. Election is akin to games ... performance of the voting process. However, this approach creates a synchronization ...

  18. Conformal transistor arrays based on solution-processed organic crystals.

    Science.gov (United States)

    Zhao, Xiaoli; Zhang, Bing; Tang, Qingxin; Ding, Xueyan; Wang, Shuya; Zhou, Yuying; Tong, Yanhong; Liu, Yichun

    2017-11-13

    Conformal transistor arrays based on solution-processed organic crystals, which can provide sensory and scanning features for monitoring, biofeedback, and tracking of physiological function, present one of the most promising technologies for future large-scale low-cost wearable and implantable electronics. However, it is still a huge challenge to integrate solution-processed organic crystals into conformal FETs owing to a generally existing swelling phenomenon of the elastic materials and the lack of the corresponding device fabrication technology. Here, we present a promising route to fabricate a conformal field-effect transistor (FET) array based on a solution-processed TIPS-pentacene single-crystal micro/nanowire array. By simply drop-casting the organic solution on an anti-solvent photolithography-compatible electrode with bottom-contact coplanar configuration, the transistor array can be formed and can conform onto uneven objects. Excellent electrical properties with device yield as high as 100%, field-effect mobility up to 0.79 cm² V⁻¹ s⁻¹, low threshold voltage, and good device uniformity are demonstrated. The results open up the capability of solution-processed organic crystals for conformal electronics, suggesting their substantial promise for next-generation wearable and implantable electronics.

  19. Effect of disinfectant solutions on a denture base acrylic resin.

    Science.gov (United States)

    Carvalho, Cristiane F; Vanderlei, Aleska D; Marocho, Susana M Salazar; Pereira, Sarina M B; Nogueira, Lafayette; Paes-Júnior, Tarcisio J Arruda

    2012-01-01

    The aim of this study was to evaluate the hardness, roughness and mass loss of an acrylic denture base resin after in vitro exposure to four disinfectant solutions. Forty specimens (Clássico, Brazil) were prepared and randomly assigned to 4 groups (n = 10) according to the disinfectant solution: G1: control, stored in distilled water at 37 °C; G2: 1% sodium hypochlorite; G3: 2% glutaraldehyde; G4: 4% chlorhexidine. Groups G2 to G4 were immersed for 60 minutes in the respective disinfectant solution. Measurements were carried out both before and after immersion in the solution. The surface was analyzed with a surface roughness tester (Surfcorder SE 1700 KOZAKALAB), a microdurometer FM-700 (Future Tech) and a scanning electron microscope (DSM 962-ZEISS). Loss of mass was determined with a digital weighing scale. After the disinfection procedures, values were analyzed statistically. The acrylic denture base resin may be vulnerable to surface changes after in vitro immersion in the disinfectant solutions studied.

  20. Impact of Business Interoperability on the Performance of Complex Cooperative Supply Chain Networks: A Case Study

    Directory of Open Access Journals (Sweden)

    Izunildo Cabral

    2018-01-01

    Full Text Available This paper proposes an agent-based model for evaluating the effect of business interoperability on the performance of cooperative supply chain networks. The model is based on insights from the Industrial Marketing and Purchasing network approach and the complex systems theory perspective. To demonstrate its applicability, an explanatory case study regarding a Portuguese reverse logistics cooperative supply chain network is presented. Face-to-face interviews and forms were used to collect data. The findings show that the establishment of appropriate levels of business interoperability has helped to reduce several non-value-added interaction processes and consequently improve the operational performance of the Valorpneu network. Regarding the research implications, this paper extends the current knowledge on business interoperability and an important problem in business: how business interoperability gaps in dyadic organizational relationships affect the network of companies that the two companies belong to—network effect. In terms of practical implications, managers can use the proposed model as a starting point to simulate complex interactions between supply chain network partners and understand better how the performance of their networks emerges from these interactions and from the adoption of different levels of business interoperability.

  1. Memristor-based memory: The sneak paths problem and solutions

    KAUST Repository

    Zidan, Mohammed A.

    2012-10-29

    In this paper, we investigate the read operation of memristor-based memories. We analyze the sneak paths problem and provide a noise margin metric to compare the various solutions proposed in the literature. We also analyze the power consumption associated with these solutions. Moreover, we study the effect of the aspect ratio of the memory array on the sneak paths. Finally, we introduce a new technique for solving the sneak paths problem by gating the memory cell using a three-terminal memistor device.
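
    The paper defines its own noise-margin metric; as a rough first-order illustration of why sneak paths corrupt reads in a selector-less crossbar, the sketch below compares the resistance seen at a selected high-resistance cell with and without a parallel three-cell sneak path (all resistance values are arbitrary assumptions).

```python
# First-order illustration (not the paper's metric): in a crossbar without
# selectors, reading a high-resistance cell can be masked by a parallel sneak
# path through three low-resistance cells. All values are arbitrary assumptions.
def parallel(r1: float, r2: float) -> float:
    return r1 * r2 / (r1 + r2)

R_LRS = 10e3      # low-resistance (ON) state, ohms
R_HRS = 1e6       # high-resistance (OFF) state, ohms

selected_off = R_HRS                  # cell we are trying to read as OFF
sneak_path = 3 * R_LRS                # shortest sneak path: three ON cells in series
effective = parallel(selected_off, sneak_path)

print(f"ideal OFF read : {selected_off / 1e3:8.1f} kOhm")
print(f"with sneak path: {effective / 1e3:8.1f} kOhm")
print(f"OFF/ON contrast collapses toward {effective / R_LRS:.1f}x "
      f"instead of {R_HRS / R_LRS:.0f}x")
```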

  2. Stability of subsystem solutions in agent-based models

    Science.gov (United States)

    Perc, Matjaž

    2018-01-01

    The fact that relatively simple entities, such as particles or neurons, or even ants or bees or humans, give rise to fascinatingly complex behaviour when interacting in large numbers is the hallmark of complex systems science. Agent-based models are frequently employed for modelling and obtaining a predictive understanding of complex systems. Since the sheer number of equations that describe the behaviour of an entire agent-based model often makes it impossible to solve such models exactly, Monte Carlo simulation methods must be used for the analysis. However, unlike pairwise interactions among particles that typically govern solid-state physics systems, interactions among agents that describe systems in biology, sociology or the humanities often involve group interactions, and they also involve a larger number of possible states even for the most simplified description of reality. This begets the question: when can we be certain that an observed simulation outcome of an agent-based model is actually stable and valid in the large system-size limit? The latter is key for the correct determination of phase transitions between different stable solutions, and for the understanding of the underlying microscopic processes that led to these phase transitions. We show that a satisfactory answer can only be obtained by means of a complete stability analysis of subsystem solutions. A subsystem solution can be formed by any subset of all possible agent states. The winner between two subsystem solutions can be determined by the average moving direction of the invasion front that separates them, yet it is crucial that the competing subsystem solutions are characterised by a proper composition and spatiotemporal structure before the competition starts. We use the spatial public goods game with diverse tolerance as an example, but the approach has relevance for a wide variety of agent-based models.
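
    As a schematic illustration of the invasion-front protocol described above (not the spatial public goods game with diverse tolerance studied in the paper), the sketch below lets two prepared "subsystem solutions" compete on a 1D lattice under Fermi-rule imitation and reports the average drift of the separating front; the payoff values and the noise parameter are arbitrary assumptions.

```python
# Schematic illustration (not the paper's public goods game): two candidate
# "subsystem solutions" A and B occupy the two halves of a 1D lattice and compete
# via Fermi-rule imitation. The average drift of the separating front over Monte
# Carlo steps reveals which solution invades the other. Payoffs and the noise
# parameter K are arbitrary assumptions.
import math
import random

L = 200                           # lattice size
K = 0.1                           # imitation noise
PAYOFF = {"A": 1.0, "B": 0.8}     # assumed constant payoffs per strategy

def fermi(p_focal: float, p_neigh: float, k: float = K) -> float:
    """Probability that the focal site adopts the neighbour's strategy."""
    return 1.0 / (1.0 + math.exp((p_focal - p_neigh) / k))

def front_position(lattice) -> int:
    """Index of the first site holding strategy B (the A/B interface)."""
    return next((i for i, s in enumerate(lattice) if s == "B"), len(lattice))

def run(steps: int = 200, seed: int = 1):
    random.seed(seed)
    lattice = ["A"] * (L // 2) + ["B"] * (L // 2)   # prepared subsystem solutions
    history = []
    for _ in range(steps):
        for _ in range(L):                          # one Monte Carlo step = L updates
            i = random.randrange(L)
            j = max(0, min(L - 1, i + random.choice((-1, 1))))
            if lattice[i] != lattice[j]:
                if random.random() < fermi(PAYOFF[lattice[i]], PAYOFF[lattice[j]]):
                    lattice[i] = lattice[j]
        history.append(front_position(lattice))
    return history

if __name__ == "__main__":
    h = run()
    drift = (h[-1] - h[0]) / len(h)
    print(f"average front drift per MC step: {drift:+.3f}  (positive => A invades B)")
```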

  3. Designing learning management system interoperability in semantic web

    Science.gov (United States)

    Anistyasari, Y.; Sarno, R.; Rochmawati, N.

    2018-01-01

    The extensive adoption of learning management systems (LMS) has set the focus on the interoperability requirement. Interoperability is the ability of different computer systems, applications or services to communicate, share and exchange data, information, and knowledge in a precise, effective and consistent way. Semantic web technology and the use of ontologies are able to provide the required computational semantics and interoperability for the automation of tasks in an LMS. The purpose of this study is to design learning management system interoperability in the semantic web, which has not yet been investigated deeply. Moodle is utilized to design the interoperability. Several database tables of Moodle are enhanced and some features are added. The semantic web interoperability is provided by exploiting an ontology of the content materials. The ontology is further utilized as a searching tool to match users' queries with available courses. It is concluded that LMS interoperability in the semantic web is feasible.

  4. Tool interoperability in SSE OI 2.0

    Science.gov (United States)

    Carmody, C. L.; Shotton, C. T.

    1988-01-01

    This paper presents a review of the concept and implementation of tool interoperability in the Space Station Software Support Environment (SSE) OI 2.0. By first providing a description of the SSE, the paper describes the problem at hand, that is, the nature of the SSE that gives rise to the requirement for interoperability--between SSE workstations and, hence, between the tools which reside on the workstations. Specifically, word processor and graphic tool interoperability are discussed. The concept for interoperability that is implemented in OI 2.0 is described, as is an overview of the implementation strategy. Some of the significant challenges that the development team had to overcome to bring about interoperability are described, perhaps as a checklist, or warning, to others who would bring about tool interoperability. Lastly, plans to extend tool interoperability to a third class of tools in OI 3.0 are described.

  5. Characterisation of Nature-Based Solutions for the Built Environment

    Directory of Open Access Journals (Sweden)

    Yangang Xing

    2017-01-01

    Full Text Available Nature has provided humankind with food, fuel, and shelter throughout evolutionary history. However, in contemporary cities, many natural landscapes have become degraded and replaced with impermeable hard surfaces (e.g., roads, paving, car parks and buildings). The reversal of this trend is dynamic, complex and still in its infancy. There are many facets of urban greening initiatives involving multiple benefits, sensitivities and limitations. The aim of this paper is to develop a characterisation method of nature-based solutions for designing and retrofitting in the built environment, and to facilitate knowledge transfer between disciplines and for design optimisation. Based on a review of the literature across disciplines, key characteristics could be organised into four groups: policy and community initiatives, multiple benefits assessment, topology, and design options. Challenges and opportunities for developing a characterisation framework to improve the use of nature-based solutions in the built environment are discussed.

  6. Surface phase transitions in cu-based solid solutions

    Science.gov (United States)

    Zhevnenko, S. N.; Chernyshikhin, S. V.

    2017-11-01

    We have measured the surface energy in two-component Cu-based systems in an H2 + Ar gas atmosphere. The experiments on solid Cu [Ag] and Cu [Co] solutions show the presence of phase transitions on the surfaces. Isotherms of the surface energy have singularities (a minimum in the case of copper solid solutions with silver and a maximum in the case of solid solutions with cobalt). In both cases, the surface phase transitions cause a deficiency of surface miscibility: formation of a monolayer (multilayer) (Cu-Ag) or of nanoscale particles (Cu-Co). At the same time, according to the volume phase diagrams, the concentration and temperature of the surface phase transitions correspond to the solid solution within the volume. The method permits determining the rate of diffusional creep in addition to the surface energy. The temperature and concentration dependence of the solid solutions' viscosity coefficient supports the fact of the surface phase transitions and provides insights into the diffusion properties of the transforming surfaces.

  7. The eXtensible ontology development (XOD) principles and tool implementation to support ontology interoperability.

    Science.gov (United States)

    He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison

    2018-01-12

    Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) support different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its associated four principles. These XOD principles reuse existing terms and semantic relations from reliable ontologies, develop and apply well-established ontology design patterns (ODPs), and involve community efforts to support new ontology development, promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications to support data to be Findable, Accessible, Interoperable and Reusable (i.e., FAIR).

  8. An interoperable standard system for the automatic generation and publication of the fire risk maps based on Fire Weather Index (FWI)

    Science.gov (United States)

    Julià Selvas, Núria; Ninyerola Casals, Miquel

    2015-04-01

    An automatic system to predict fire risk has been implemented for the Principality of Andorra, a small country located in the eastern Pyrenees mountain range, bordered by Catalonia and France; due to its location, its landscape is a set of rugged mountains with an average elevation of around 2000 meters. The system is based on the Fire Weather Index (FWI), which consists of different components, each measuring a different aspect of the fire danger, calculated from the values of the weather variables at midday. CENMA (Centre d'Estudis de la Neu i de la Muntanya d'Andorra) operates a network of around 10 automatic meteorological stations, located in different places, peaks and valleys, that measure weather data such as relative humidity, wind direction and speed, surface temperature, rainfall and snow cover every ten minutes; these data are sent daily and automatically to the implemented system, where they are processed to filter out incorrect measurements and to homogenize measurement units. The data are then used to calculate all components of the FWI at midday at the level of each station, creating a database with the values of the homogenized measurements and the FWI components for each weather station. In order to extend and model these data over the whole Andorran territory and to obtain a continuous map, an interpolation method based on multiple regression with spline interpolation of the residuals has been implemented. This interpolation considers the FWI data as well as other relevant predictors such as latitude, altitude, global solar radiation and distance to the sea. The obtained values (maps) are validated using leave-one-out cross-validation. The discrete and continuous maps are rendered as tiled raster maps and published in a web portal conforming to the Open Geospatial Consortium (OGC) Web Map Service (WMS) standard. Metadata and other reference maps (fuel maps, topographic maps, etc.) are also available from this geoportal.
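
    The interpolation step described above (a regression trend plus spline interpolation of the residuals) can be sketched as follows. This is a minimal illustration with invented station values and only two predictors, not the operational Andorran implementation.

    ```python
    # Sketch of regression-plus-residual-spline interpolation of a station-based
    # index (e.g., FWI) onto a grid cell; data and predictors are hypothetical.
    import numpy as np
    from scipy.interpolate import Rbf

    # Hypothetical station observations: FWI value plus predictors per station
    fwi = np.array([12.0, 18.5, 9.3, 22.1, 15.7])
    x = np.array([0.1, 0.4, 0.7, 0.2, 0.9])     # station coordinates
    y = np.array([0.3, 0.8, 0.1, 0.6, 0.5])
    elev = np.array([1200.0, 1800.0, 950.0, 2100.0, 1600.0])
    radiation = np.array([18.2, 15.1, 20.3, 13.8, 16.9])

    # 1) Multiple linear regression of FWI on the predictors
    X = np.column_stack([np.ones_like(elev), elev, radiation])
    coeffs, *_ = np.linalg.lstsq(X, fwi, rcond=None)
    residuals = fwi - X @ coeffs

    # 2) Thin-plate spline interpolation of the regression residuals in space
    rbf = Rbf(x, y, residuals, function="thin_plate")

    # 3) Combine the regression trend and the interpolated residual at a grid cell
    gx, gy, gelev, grad = 0.5, 0.5, 1500.0, 17.0
    trend = np.array([1.0, gelev, grad]) @ coeffs
    fwi_estimate = float(trend + rbf(gx, gy))
    print(f"estimated FWI at grid cell: {fwi_estimate:.1f}")
    ```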

  9. Semantic Integration for Marine Science Interoperability Using Web Technologies

    Science.gov (United States)

    Rueda, C.; Bermudez, L.; Graybeal, J.; Isenor, A. W.

    2008-12-01

    The Marine Metadata Interoperability Project, MMI (http://marinemetadata.org) promotes the exchange, integration, and use of marine data through enhanced data publishing, discovery, documentation, and accessibility. A key effort is the definition of an Architectural Framework and Operational Concept for Semantic Interoperability (http://marinemetadata.org/sfc), which is complemented by the development of tools that realize critical use cases in semantic interoperability. In this presentation, we describe a set of such Semantic Web tools that allow performing important interoperability tasks, ranging from the creation of controlled vocabularies and the mapping of terms across multiple ontologies, to the online registration, storage, and search services needed to work with the ontologies (http://mmisw.org). This set of services uses Web standards and technologies, including the Resource Description Framework (RDF), the Web Ontology Language (OWL), Web services, and toolkits for Rich Internet Application development. We will describe the following components: MMI Ontology Registry: The MMI Ontology Registry and Repository provides registry and storage services for ontologies. Entries in the registry are associated with projects defined by the registered users. Sophisticated search functions, for example according to metadata items and vocabulary terms, are also provided. Client applications can submit search requests using the W3C SPARQL Query Language for RDF. Voc2RDF: This component converts an ASCII comma-delimited set of terms and definitions into an RDF file. Voc2RDF facilitates the creation of controlled vocabularies by using a simple form-based user interface. Created vocabularies and their descriptive metadata can be submitted to the MMI Ontology Registry for versioning and community access. VINE: The Vocabulary Integration Environment component allows the user to map vocabulary terms across multiple ontologies. Various relationships can be established, for example
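
    The SPARQL search facility mentioned above can be exercised from a client along the following lines; the endpoint URL and the vocabulary property queried are placeholders, not the actual MMI service details.

    ```python
    # Sketch of a client-side SPARQL search against an ontology registry endpoint;
    # the endpoint URL and queried property are hypothetical placeholders.
    from SPARQLWrapper import SPARQLWrapper, JSON

    ENDPOINT = "http://example.org/registry/sparql"   # assumed registry endpoint

    sparql = SPARQLWrapper(ENDPOINT)
    sparql.setQuery("""
        PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
        SELECT ?term ?label WHERE {
            ?term skos:prefLabel ?label .
            FILTER(CONTAINS(LCASE(STR(?label)), "salinity"))
        } LIMIT 10
    """)
    sparql.setReturnFormat(JSON)

    results = sparql.query().convert()
    for binding in results["results"]["bindings"]:
        print(binding["term"]["value"], "-", binding["label"]["value"])
    ```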

  10. Interoperable Medical Instrument Networking and Access System with Security Considerations for Critical Care

    Directory of Open Access Journals (Sweden)

    Deniz Gurkan

    2010-01-01

    Full Text Available The recent influx of electronic medical records in the health care field, coupled with the need to provide continuous care to patients in the critical care environment, has driven the need for interoperability of medical devices. Open standards are needed to support flexible processes and interoperability of medical devices, especially in intensive care units. In this paper, we present an interoperable networking and access architecture based on the CAN protocol. Predictability of the delay of medical data reports is a desirable attribute that can be realized using a tightly-coupled system architecture. Our simulations of the network architecture demonstrate that a bounded delay for event reports offers predictability. In addition, we address security issues related to the storage of electronic medical records. We present a set of open source tools and tests to identify security breaches, and appropriate measures that can be implemented to be compliant with the HIPAA rules.
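
    To make the CAN-based reporting idea concrete, the following hedged sketch publishes a vital-sign frame on an in-process virtual CAN bus with python-can; the arbitration ID, payload layout and the use of the virtual interface are illustrative assumptions, not the architecture evaluated in the paper.

    ```python
    # Minimal sketch (not the paper's implementation): periodic vital-sign frames
    # on a CAN bus using python-can's in-process "virtual" interface.
    # The arbitration ID and payload layout are illustrative assumptions.
    import can

    HEART_RATE_ID = 0x1A0   # hypothetical CAN ID reserved for heart-rate reports

    sender = can.Bus(interface="virtual", channel="icu_demo")
    monitor = can.Bus(interface="virtual", channel="icu_demo")

    # Encode a heart-rate sample (bpm) and a simple sequence counter into 8 bytes
    heart_rate, seq = 72, 1
    msg = can.Message(arbitration_id=HEART_RATE_ID,
                      data=[heart_rate, seq, 0, 0, 0, 0, 0, 0],
                      is_extended_id=False)
    sender.send(msg)

    received = monitor.recv(timeout=1.0)
    if received is not None and received.arbitration_id == HEART_RATE_ID:
        print(f"heart rate report: {received.data[0]} bpm (seq {received.data[1]})")

    sender.shutdown()
    monitor.shutdown()
    ```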

  11. Comparison of Ring-Buffer-Based Packet Capture Solutions

    Energy Technology Data Exchange (ETDEWEB)

    Barker, Steven Andrew [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-10-01

    Traditional packet-capture solutions using commodity hardware incur a large amount of overhead as packets are copied multiple times by the operating system. This overhead slows sensor systems to a point where they are unable to keep up with high bandwidth traffic, resulting in dropped packets. Incomplete packet capture files hinder network monitoring and incident response efforts. While costly commercial hardware exists to capture high bandwidth traffic, several software-based approaches exist to improve packet capture performance using commodity hardware.
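
    The ring-buffer idea behind these solutions can be illustrated conceptually: a fixed-size buffer keeps the most recent packets and overwrites the oldest when full, so the capture path never blocks on a slow consumer. The sketch below is a pure-Python model of that behaviour, not one of the benchmarked capture tools.

    ```python
    # Illustrative ring-buffer sketch: a fixed-size buffer that keeps the most
    # recent captured packets so a slow consumer never blocks the capture path.
    from collections import deque

    class PacketRingBuffer:
        def __init__(self, capacity: int):
            # deque with maxlen silently discards the oldest entry when full,
            # mimicking how a ring buffer overwrites stale packets
            self._ring = deque(maxlen=capacity)
            self.dropped_overwrites = 0

        def capture(self, packet: bytes) -> None:
            if len(self._ring) == self._ring.maxlen:
                self.dropped_overwrites += 1   # oldest packet about to be overwritten
            self._ring.append(packet)

        def drain(self):
            while self._ring:
                yield self._ring.popleft()

    ring = PacketRingBuffer(capacity=4)
    for i in range(6):                          # simulate 6 captured frames
        ring.capture(f"frame-{i}".encode())

    print("overwritten:", ring.dropped_overwrites)
    print([p.decode() for p in ring.drain()])   # only the 4 most recent frames remain
    ```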

  12. Land based use of natural gas - distribution solutions

    International Nuclear Information System (INIS)

    Jordanger, Einar; Moelnvik, Mona J.; Owren, Geir; Einang, Per Magne; Grinden, Bjoern; Tangen, Grethe

    2002-05-01

    The report presents results from the project ''Landbasert bruk av naturgass - distribusjonsloesninger'' (Land-based use of natural gas - distribution solutions). It describes the aims of the project, the external political conditions for the use of natural gas, the environmental benefits of switching from petroleum and coal to natural gas, the Norwegian infrastructure, the optimisation of energy transport, the strategic consequences of the introduction of LNG, and the practical consequences of the Enova strategy.

  13. A Game Theory Based Solution for Security Challenges in CRNs

    Science.gov (United States)

    Poonam; Nagpal, Chander Kumar

    2018-03-01

    Cognitive radio networks (CRNs) are envisioned to drive the next generation of ad hoc wireless networks due to their ability to provide communications resilience in continuously changing environments through the use of dynamic spectrum access. Conventionally, CRNs depend upon the information gathered by other secondary users to ensure the accuracy of spectrum sensing, which makes them vulnerable to security attacks and leads to the need for security mechanisms such as cryptography and trust. However, a typical cryptography-based solution is not a viable security solution for CRNs owing to their limited resources. The effectiveness of trust-based approaches has always been in question due to the credibility of secondary trust sources. Game theory, with its ability to optimize in an environment of conflicting interests, can be quite a suitable tool to manage an ad hoc network in the presence of autonomous selfish, malevolent, malicious and attacker nodes. The literature contains several theoretical proposals for applying game theory in ad hoc networks without explicit or detailed implementation. This paper implements a game-theory-based solution in MATLAB 2015 to secure the CRN environment and compares the obtained results with the traditional approaches of trust and cryptography. The simulation results indicate that, as time progresses, the game-theoretic approach performs much better, with higher throughput, lower jitter and better identification of selfish/malicious nodes.
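
    A highly simplified sketch of the repeated-game view of cooperative spectrum sensing is given below: honest and false reports earn different payoffs and low-payoff nodes are flagged. The payoff values, threshold and majority-vote fusion rule are assumptions chosen for illustration, not the MATLAB model used in the paper.

    ```python
    # Illustrative repeated-game sketch of cooperative spectrum sensing: honest
    # and false reports earn different payoffs, and nodes whose cumulative payoff
    # falls below a threshold are flagged. All values here are assumptions.
    import random

    REWARD_HONEST, PENALTY_FALSE, FLAG_THRESHOLD = 1.0, -2.0, -3.0

    nodes = {"n1": 0.0, "n2": 0.0, "n3": 0.0}   # cumulative payoff per secondary user
    malicious = {"n3"}                           # n3 reports falsely with high probability

    random.seed(1)
    for _ in range(30):                          # 30 sensing rounds
        channel_busy = random.random() < 0.5     # true channel state
        reports = {}
        for node in nodes:
            lie = node in malicious and random.random() < 0.8
            reports[node] = (not channel_busy) if lie else channel_busy

        # Majority vote stands in for the fusion centre's decision
        decision = sum(reports.values()) > len(reports) / 2
        for node, report in reports.items():
            nodes[node] += REWARD_HONEST if report == decision else PENALTY_FALSE

    flagged = [n for n, payoff in nodes.items() if payoff < FLAG_THRESHOLD]
    print("cumulative payoffs:", nodes)
    print("flagged as selfish/malicious:", flagged)
    ```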

  14. Internet of Things Heterogeneous Interoperable Network Architecture Design

    DEFF Research Database (Denmark)

    Bhalerao, Dipashree M.

    2014-01-01

    The Internet of Things (IoT) state of the art indicates that there is no mature Internet of Things architecture available. The thesis contributes an abstract, generic IoT system reference architecture together with specifications. The novelties of the thesis are the proposed solutions and implementations.... It is proved that reducing data at the source results in huge vertical scalability and, indirectly, horizontal scalability as well. The second non-functional feature contributes a heterogeneous interoperable network architecture for constrained Things. To avoid an increasing number of gateways, a Wi-Fi access point combined with Bluetooth and Zigbee (the new access point is called BZ-Fi) is proposed. The co-existence of Wi-Fi, Bluetooth, and Zigbee network technologies results in interference. To reduce the interference, orthogonal frequency division multiplexing (OFDM) is proposed to be implemented in Bluetooth and Zigbee. The proposed...

  15. Interoperability of remote handling control system software modules at Divertor Test Platform 2 using middleware

    Energy Technology Data Exchange (ETDEWEB)

    Tuominen, Janne, E-mail: janne.m.tuominen@tut.fi [Tampere University of Technology, Department of Intelligent Hydraulics and Automation, Tampere (Finland); Rasi, Teemu; Mattila, Jouni [Tampere University of Technology, Department of Intelligent Hydraulics and Automation, Tampere (Finland); Siuko, Mikko [VTT, Technical Research Centre of Finland, Tampere (Finland); Esque, Salvador [F4E, Fusion for Energy, Torres Diagonal Litoral B3, Josep Pla2, 08019, Barcelona (Spain); Hamilton, David [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France)

    2013-10-15

    Highlights: ► The prototype DTP2 remote handling control system is a heterogeneous collection of subsystems, each realizing a functional area of responsibility. ► Middleware provides well-known, reusable solutions to problems, such as heterogeneity, interoperability, security and dependability. ► A middleware solution was selected and integrated with the DTP2 RH control system. The middleware was successfully used to integrate all relevant subsystems and functionality was demonstrated. -- Abstract: This paper focuses on the inter-subsystem communication channels in a prototype distributed remote handling control system at Divertor Test Platform 2 (DTP2). The subsystems are responsible for specific tasks and, over the years, their development has been carried out using various platforms and programming languages. The communication channels between subsystems have different priorities, e.g. very high messaging rate and deterministic timing or high reliability in terms of individual messages. Generally, a control system's communication infrastructure should provide interoperability, scalability, performance and maintainability. An attractive approach to accomplish this is to use a standardized and proven middleware implementation. The selection of a middleware can have a major cost impact in future integration efforts. In this paper we present development done at DTP2 using the Object Management Group's (OMG) standard specification for Data Distribution Service (DDS) for ensuring communications interoperability. DDS has gained a stable foothold especially in the military field. It lacks a centralized broker, thereby avoiding a single-point-of-failure. It also includes an extensive set of Quality of Service (QoS) policies. The standard defines a platform- and programming language independent model and an interoperability wire protocol that enables DDS vendor interoperability, allowing software developers to avoid vendor lock-in situations.

  16. C3I and Modelling and Simulation (M&S) Interoperability

    Science.gov (United States)

    2004-03-01

    applications within the RNLA [2]. The C3I Framework uses commercial off-the-shelf publish/subscribe services (Tibco Rendezvous) and a tailored information... support interoperability within their own domain. The C2WS system uses the ‘C3I Framework’ middleware, which is based on Tibco Rendezvous. The... simulation systems use the HLA interoperability standard. We have developed a ‘Tibco-HLA gateway’ to connect TIB/RV on one side to HLA on the other side (see

  17. The Joint Lessons Learned System and Interoperability

    Science.gov (United States)

    1989-06-02

    should not be artificially separated. That lesson would not be learned here. Another lesson which was learned, however, was that interservice... artificially high level of support masked the continuing rivalry between the Army and Air Force over mission priorities. In spite of Air Force... knowledge concerning joint interoperability issues and lessons learned activities. [Map 2: Central African Republic and neighbouring countries]

  18. Radiation chemical studies of purine bases in aqueous solutions

    International Nuclear Information System (INIS)

    Infante, H.; Castro, J.W.; Arroyo, I.; Giron, E.; Infante, G.A.

    1984-01-01

    Rate constants for the reactions of purine bases, such as xanthine and hypoxanthine, with the radiolytic species (e⁻aq and •OH) produced upon the action of ionizing radiation in aqueous solutions were determined using pulse radiolysis. The rate constants obtained were similar to those of other purine bases. Using radiochromatographic and spectroscopic techniques, destruction yields for these purine bases have been determined in the presence of suitable scavengers. The formation yields of several radiolytic products produced during the radiolysis of these compounds have been determined. Glycols and the 7- and 8-hydroxy derivatives of both xanthine and hypoxanthine are the major products of the hydroxyl radical reactions. Dihydropurine and aminopyrimidine derivatives are the major products of the hydrated electron reactions. The results of these investigations are compared with those for other purine and pyrimidine bases. A basic mechanism for the radiolysis of purine bases is postulated and discussed.

  19. THE Interoperability Challenge for the Geosciences: Stepping up from Interoperability between Disciplinary Siloes to Creating Transdisciplinary Data Platforms.

    Science.gov (United States)

    Wyborn, L. A.; Evans, B. J. K.; Trenham, C.; Druken, K. A.; Wang, J.

    2015-12-01

    The National Computational Infrastructure (NCI) at the Australian National University (ANU) has collocated over 10 PB of national and international data assets within a HPC facility to create the National Environmental Research Data Interoperability Platform (NERDIP). The data span a wide range of fields, from the earth systems and environment (climate, coasts, oceans, and geophysics) through to astronomy, bioinformatics, and the social sciences. These diverse data collections are collocated on a major data storage node that is linked to a petascale HPC and cloud facility. Users can search across all of the collections and either log in and access the data directly, or access the data via standards-based web services. These collocated petascale data collections are theoretically a massive resource for interdisciplinary science at scales and resolutions never hitherto possible. But once collocated, multiple barriers became apparent that make cross-domain data integration very difficult and often so time consuming that either less ambitious research goals are attempted or the project is abandoned. Incompatible content is only one half of the problem: other showstoppers are differing access models, licences and issues of ownership of derived products. Brokers can enable interdisciplinary research, but in reality are we just delaying the inevitable? A call to action is required to adopt a transdisciplinary approach at the conception of new multi-disciplinary systems, whereby those across all the scientific domains, the humanities, social sciences and beyond work together to create a unity of informatics platforms that interoperate horizontally across the multiple discipline boundaries, and also operate vertically to enable a diversity of people to access data, from high-end researchers to undergraduates, school students and the general public. Once we master such a transdisciplinary approach to our vast global information assets, we will then achieve

  20. Cross-domain Collaborative Research and People Interoperability: Beyond Knowledge Representation Frameworks

    Science.gov (United States)

    Fox, P. A.; Diviacco, P.; Busato, A.

    2016-12-01

    Geo-scientific research collaboration commonly faces complex systems where multiple skills and competences are needed at the same time. The efficacy of such collaboration among researchers then becomes of paramount importance. Multidisciplinary studies draw from domains that are far from each other, and researchers also need to understand how to extract the data they need and eventually produce something that can be used by others. The management of information and knowledge in this perspective is non-trivial. Interoperability is frequently sought in computer-to-computer environments, so as to overcome mismatches in vocabulary, data formats, coordinate reference systems and so on. Successful researcher collaboration also relies on interoperability of the people! Smaller, synchronous and face-to-face settings for researchers are known to enhance people interoperability. However, changing settings, whether geographically or temporally, or by increasing the team size, diversity, and expertise, requires people-computer-people-computer (...) interoperability. To date, knowledge representation frameworks have been proposed but not proven to be necessary and sufficient to achieve multi-way interoperability. In this contribution, we address the epistemology and sociology of science, advocating for a fluid perspective where science is mostly a social construct, conditioned by cognitive issues, especially cognitive bias. Bias cannot be obliterated; on the contrary, it must be carefully taken into consideration. Information-centric interfaces built from different perspectives and ways of thinking by actors with different points of view, approaches and aims are proposed as a means for enhancing people interoperability in computer-based settings. The contribution will provide details on the approach of augmenting and interfacing knowledge representation frameworks with the cognitive-conceptual frameworks for people that are needed to meet and exceed collaborative research goals in the 21st

  1. Ocean Data Interoperability Platform: developing a common global framework for marine data management

    Science.gov (United States)

    Glaves, Helen; Schaap, Dick

    2017-04-01

    elsewhere. To add a further layer of complexity, there are also global initiatives providing marine data infrastructures, e.g. IOC-IODE and POGO, as well as those with a wider remit that includes environmental data, e.g. GEOSS, COPERNICUS, etc. Ecosystem-level marine research requires a common framework for marine data management that supports the sharing of data across these regional and global data systems, and provides the user with access to the data available from these services via a single point of access. This framework must be based on existing data systems and established by developing interoperability between them. The Ocean Data Interoperability Platform (ODIP/ODIP II) project brings together the organisations responsible for maintaining selected regional data infrastructures, along with other relevant experts, in order to identify the common standards and best practice necessary to underpin this framework, and to evaluate the differences and commonalities between the regional data infrastructures so that interoperability can be established between them for the purposes of data sharing. This coordinated approach is being demonstrated and validated through the development of a series of prototype interoperability solutions that demonstrate the mechanisms and standards necessary to facilitate the sharing of marine data across these existing data infrastructures.

  2. Nature-based agricultural solutions: Scaling perennial grains across Africa.

    Science.gov (United States)

    Peter, Brad G; Mungai, Leah M; Messina, Joseph P; Snapp, Sieglinde S

    2017-11-01

    Modern plant breeding tends to focus on maximizing yield, with one of the most ubiquitous implementations being shorter-duration crop varieties. It is indisputable that these breeding efforts have resulted in greater yields in ideal circumstances; however, many farmed locations across Africa suffer from one or more conditions that limit the efficacy of modern short-duration hybrids. In view of global change and increased necessity for intensification, perennial grains and long-duration varieties offer a nature-based solution for improving farm productivity and smallholder livelihoods in suboptimal agricultural areas. Specific conditions where perennial grains should be considered include locations where biophysical and social constraints reduce agricultural system efficiency, and where conditions are optimal for crop growth. Using a time-series of remotely-sensed data, we locate the marginal agricultural lands of Africa, identifying suboptimal temperature and precipitation conditions for the dominant crop, i.e., maize, as well as optimal climate conditions for two perennial grains, pigeonpea and sorghum. We propose that perennial grains offer a lower impact, sustainable nature-based solution to this subset of climatic drivers of marginality. Using spatial analytic methods and satellite-derived climate information, we demonstrate the scalability of perennial pigeonpea and sorghum across Africa. As a nature-based solution, we argue that perennial grains offer smallholder farmers of marginal lands a sustainable solution for enhancing resilience and minimizing risk in confronting global change, while mitigating social and edaphic drivers of low and variable production. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  3. Interoperability in planetary research for geospatial data analysis

    Science.gov (United States)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Service (simple image maps), Web Map Tile Service (cached image tiles), Web Feature Service (feature streaming), Web Coverage Service (rich scientific data streaming), and Catalog Service for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or under investigation within the planetary geospatial community.
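
    As an example of the standards mentioned above, an OGC WMS GetMap request can be issued with nothing more than the standard query parameters; the service URL and layer name below are placeholders rather than a real planetary endpoint.

    ```python
    # Sketch of an OGC WMS 1.3.0 GetMap request using standard query parameters.
    # The service URL and layer name are placeholders, not a real planetary endpoint.
    import requests

    WMS_URL = "https://example.org/planetary/wms"   # hypothetical WMS endpoint

    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": "mars_mola_elevation",            # assumed layer name
        "STYLES": "",
        "CRS": "EPSG:4326",
        "BBOX": "-90,-180,90,180",                  # lat/lon axis order in WMS 1.3.0
        "WIDTH": "1024",
        "HEIGHT": "512",
        "FORMAT": "image/png",
    }

    response = requests.get(WMS_URL, params=params, timeout=30)
    response.raise_for_status()
    with open("mars_elevation.png", "wb") as f:
        f.write(response.content)
    ```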

  4. The Challenges of Interoperable Data Discovery

    Science.gov (United States)

    Meaux, Melanie F.

    2005-01-01

    The Global Change Master Directory (GCMD) assists the oceanographic community in data discovery and access through its online metadata directory. The directory also offers data holders a means to post and search their oceanographic data through the GCMD portals, i.e. online customized subset metadata directories. The Gulf of Maine Ocean Data Partnership (GoMODP) has expressed interest in using the GCMD portals to increase the visibility of their data holdings throughout the Gulf of Maine region and beyond. The purpose of the Gulf of Maine Ocean Data Partnership (GoMODP) is to "promote and coordinate the sharing, linking, electronic dissemination, and use of data on the Gulf of Maine region". The participants have decided that a "coordinated effort is needed to enable users throughout the Gulf of Maine region and beyond to discover and put to use the vast and growing quantities of data in their respective databases". GoMODP members have invited the GCMD to discuss further collaborations in view of this effort. This presentation will focus on the GCMD GoMODP Portal, demonstrating its content and use for data discovery, and will discuss the challenges of interoperable data discovery. Interoperability among metadata standards and vocabularies will be discussed. A short overview of the lessons learned at the Marine Metadata Interoperability (MMI) metadata workshop held in Boulder, Colorado on August 9-11, 2005 will be given.

  5. IHE cross-enterprise document sharing for imaging: interoperability testing software.

    Science.gov (United States)

    Noumeir, Rita; Renaud, Bérubé

    2010-09-21

    With the deployments of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. In this paper we describe a software that is used to test systems that are involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross Enterprise Document Sharing for imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the elected design solutions. EHR is being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for an easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specifications ambiguities, or to resolve implementations difficulties.

  6. IHE cross-enterprise document sharing for imaging: interoperability testing software

    Directory of Open Access Journals (Sweden)

    Renaud Bérubé

    2010-09-01

    Full Text Available Abstract Background With the deployments of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. Results In this paper we describe a software that is used to test systems that are involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross Enterprise Document Sharing for imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the elected design solutions. Conclusions EHR is being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for an easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specifications ambiguities, or to resolve implementations difficulties.

  7. MPEG-4 solutions for virtualizing RDP-based applications

    Science.gov (United States)

    Joveski, Bojan; Mitrea, Mihai; Ganji, Rama-Rao

    2014-02-01

    The present paper provides the proof of concept for the use of the MPEG-4 multimedia scene representations (BiFS and LASeR) as a virtualization tool for RDP-based applications (e.g. MS Windows applications). Two main applicative benefits are thus granted. First, any legacy application can be virtualized without additional programming effort. Second, heterogeneous mobile devices (different manufacturers, OS) can collaboratively enjoy full multimedia experiences. From the methodological point of view, the main novelty consists in (1) designing an architecture allowing the conversion of the RDP content into a semantic multimedia scene-graph and its subsequent rendering on the client and (2) providing the underlying scene-graph management and interactivity tools. Experiments consider 5 users and two RDP applications (MS Word and Internet Explorer), and benchmark our solution against two state-of-the-art technologies (VNC and FreeRDP). The visual quality is evaluated by six objective measures (e.g. PSNR < 37 dB, SSIM < 0.99). The network traffic evaluation shows that: (1) for text editing, the MPEG-based solutions outperform VNC by a factor of 1.8 while being 2 times heavier than FreeRDP; (2) for Internet browsing, the MPEG solutions outperform both VNC and FreeRDP by factors of 1.9 and 1.5, respectively. The average round-trip times (less than 40 ms) cope with real-time application constraints.

  8. Hybrid Fundamental Solution Based Finite Element Method: Theory and Applications

    Directory of Open Access Journals (Sweden)

    Changyong Cao

    2015-01-01

    Full Text Available An overview of the development of the hybrid fundamental solution based finite element method (HFS-FEM) and its application to engineering problems is presented in this paper. The framework and formulations of HFS-FEM for the potential problem, plane elasticity, three-dimensional elasticity, thermoelasticity, anisotropic elasticity, and plane piezoelectricity are presented. In this method, two independent assumed fields (intraelement field and auxiliary frame field) are employed. The formulations for all cases are derived from the modified variational functionals and the fundamental solutions to a given problem. The generation of elemental stiffness equations from the modified variational principle is also described. Typical numerical examples are given to demonstrate the validity and performance of the HFS-FEM. Finally, a brief summary of the approach is provided and future trends in this field are identified.

  9. Thermodynamics of carbon in nickel-based multicomponent solid solutions

    International Nuclear Information System (INIS)

    Bradley, D.J.

    1978-04-01

    The activity coefficient of carbon in nickel, nickel-titanium, nickel-titanium-chromium, nickel-titanium-molybdenum and nickel-titanium-molybdenum-chromium alloys has been measured at 900, 1100 and 1215 °C. The results indicate that carbon obeys Henry's Law over the range studied (0 to 2 at. percent). The literature for the nickel-carbon and iron-carbon systems is reviewed and corrected. For the activity of carbon in iron as a function of composition, a new relationship based on a re-evaluation of the thermodynamics of the CO/CO2 equilibrium is proposed. Calculations using this relationship reproduce the data to within 2.5 percent, but the accuracy of the calibrating standards used by many investigators to analyze for carbon is at best 5 percent. This explains the lack of agreement between the many precise sets of data. The values of the activity coefficient of carbon in the various solid solutions are used to calculate a set of parameters for the Kohler-Kaufman equation. The calculations indicate that binary interaction energies are not sufficient to describe the thermodynamics of carbon in some of the nickel-based solid solutions. The results of previous workers for carbon in nickel-iron alloys are completely described by the inclusion of ternary terms in the Kohler-Kaufman equation. Most of the carbon in solid solution at high temperatures in nickel and nickel-titanium alloys precipitates from solution on quenching in water. The precipitate is composed of very small particles (greater than 2.5 nm) of elemental carbon. The results of some preliminary thermomigration experiments are discussed and recommendations for further work are presented.
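
    For reference, the dilute-solution relations implied by this abstract are usually written as follows; the notation is the standard textbook form and may differ in detail from the thesis.

    ```latex
    % Henry's law behaviour of dissolved carbon in the dilute range
    a_{\mathrm{C}} = \gamma_{\mathrm{C}}\, x_{\mathrm{C}}, \qquad
    \gamma_{\mathrm{C}} \approx \text{const.} \quad (0 \le x_{\mathrm{C}} \lesssim 0.02)

    % Carbon activity fixed by the CO/CO2 (Boudouard) equilibrium,
    % 2\,\mathrm{CO} = \mathrm{C} + \mathrm{CO_2}
    a_{\mathrm{C}} = K(T)\, \frac{p_{\mathrm{CO}}^{2}}{p_{\mathrm{CO_2}}}
    ```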

  10. Thermodynamics of carbon in nickel-based multicomponent solid solutions

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, D. J.

    1978-04-01

    The activity coefficient of carbon in nickel, nickel-titanium, nickel-titanium-chromium, nickel-titanium-molybdenum and nickel-titanium-molybdenum-chromium alloys has been measured at 900, 1100 and 1215 °C. The results indicate that carbon obeys Henry's Law over the range studied (0 to 2 at. percent). The literature for the nickel-carbon and iron-carbon systems is reviewed and corrected. For the activity of carbon in iron as a function of composition, a new relationship based on a re-evaluation of the thermodynamics of the CO/CO2 equilibrium is proposed. Calculations using this relationship reproduce the data to within 2.5 percent, but the accuracy of the calibrating standards used by many investigators to analyze for carbon is at best 5 percent. This explains the lack of agreement between the many precise sets of data. The values of the activity coefficient of carbon in the various solid solutions are used to calculate a set of parameters for the Kohler-Kaufman equation. The calculations indicate that binary interaction energies are not sufficient to describe the thermodynamics of carbon in some of the nickel-based solid solutions. The results of previous workers for carbon in nickel-iron alloys are completely described by the inclusion of ternary terms in the Kohler-Kaufman equation. Most of the carbon in solid solution at high temperatures in nickel and nickel-titanium alloys precipitates from solution on quenching in water. The precipitate is composed of very small particles (greater than 2.5 nm) of elemental carbon. The results of some preliminary thermomigration experiments are discussed and recommendations for further work are presented.

  11. Governance of Interoperability in Intergovernmental Services - Towards an Empirical Taxonomy

    Directory of Open Access Journals (Sweden)

    Herbert Kubicek

    2008-12-01

    Full Text Available High-quality and convenient online delivery of governmental services often requires the seamless exchange of data between two or more government agencies. Smooth data exchange, in turn, requires interoperability of the databases and workflows in the agencies involved. Interoperability (IOP) is a complex issue covering purely technical aspects such as transmission protocols and data exchange formats, but also content-related semantic aspects such as identifiers and the meaning of codes, as well as organizational, contractual or legal issues. Starting from IOP frameworks which provide classifications of what has to be standardized, this paper, based on an ongoing research project, adopts a political and managerial view and tries to clarify the governance of achieving IOP, i.e. where and by whom IOP standards are developed and established and how they are put into operation. By analyzing 32 cases of successful implementation of IOP in E-Government services within the European Union, empirical indicators for different aspects of governance are proposed and applied to develop an empirical taxonomy of different types of IOP governance, which can be used for future comparative research regarding success factors, barriers, etc.

  12. PyMOOSE: interoperable scripting in Python for MOOSE

    Directory of Open Access Journals (Sweden)

    Subhasis Ray

    2008-12-01

    Full Text Available Python is emerging as a common scripting language for simulators. This opens up many possibilities for interoperability in the form of analysis, interfaces, and communications between simulators. We report the integration of Python scripting with the Multi-scale Object Oriented Simulation Environment (MOOSE). MOOSE is a general-purpose simulation system for compartmental neuronal models and for models of signaling pathways based on chemical kinetics. We show how the Python-scripting version of MOOSE, PyMOOSE, combines the power of a compiled simulator with the versatility and ease of use of Python. We illustrate this by using Python numerical libraries to analyze MOOSE output online, and by developing a GUI in Python/Qt for a MOOSE simulation. Finally, we build and run a composite neuronal/signaling model that uses both the NEURON and MOOSE numerical engines, and Python as a bridge between the two. Thus PyMOOSE has a high degree of interoperability with analysis routines, with graphical toolkits, and with other simulators.
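
    A minimal PyMOOSE-style script might look like the sketch below: one passive compartment, a table recording its membrane potential, and a short run. Field and message names follow common MOOSE usage but may vary between MOOSE versions, so treat them as assumptions.

    ```python
    # Minimal PyMOOSE-style sketch: build one passive compartment, record its
    # membrane potential, and run the simulation. Field and message names follow
    # common MOOSE usage but should be treated as assumptions here.
    import moose

    model = moose.Neutral('/model')
    data = moose.Neutral('/data')

    soma = moose.Compartment('/model/soma')
    soma.Cm = 1e-9        # membrane capacitance (F), illustrative value
    soma.Rm = 1e8         # membrane resistance (ohm)
    soma.Em = -0.065      # leak reversal potential (V)
    soma.initVm = -0.065  # initial membrane potential (V)
    soma.inject = 1e-10   # constant current injection (A)

    vm_table = moose.Table('/data/vm')
    moose.connect(vm_table, 'requestOut', soma, 'getVm')

    moose.reinit()
    moose.start(0.1)      # simulate 100 ms

    # vm_table.vector now holds the recorded Vm trace for analysis with numpy etc.
    print(len(vm_table.vector), "samples recorded")
    ```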

  13. Interoperability architecture for electric mobility

    NARCIS (Netherlands)

    Brand, Allard; Iacob, Maria Eugenia; van Sinderen, Marten J.; Chapurlat, V.

    2015-01-01

    The current architecture for electric mobility provides insufficient integration with the electricity system, since at this moment there is no possibility for influencing the charge process based on information from market parties such as the distribution system operator. Charging can neither be

  14. An Approach to Semantic Interoperability for Improved Capability Exchanges in Federations of Systems

    Science.gov (United States)

    Moschoglou, Georgios

    2013-01-01

    This study seeks an affirmative answer to the question whether a knowledge-based approach to system of systems interoperation using semantic web standards and technologies can provide the centralized control of the capability for exchanging data and services lacking in a federation of systems. Given the need to collect and share real-time…

  15. Telemedicine system interoperability architecture: concept description and architecture overview.

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard Layne, II

    2004-05-01

    In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.

  16. Interoperability technology assessment for joint C4ISR systems

    OpenAIRE

    Berzins, Valdis Andris; Luqi; Shultes, Bruce C.; Guo, Jiang; Allen, Jim; Cheng, Ngom; Gee, Karen; Nyugen, Tom; Stierna, Eria

    1999-01-01

    This study characterizes and assesses alternative approaches to software component interoperability in distributed environments typical of C4ISR systems. Interoperability is the ability of systems to provide services to and accept services from other systems, and to use the services so exchanged to enable them to operate effectively together. Candidate approaches in...

  17. SHIWA workflow interoperability solutions for neuroimaging data analysis

    NARCIS (Netherlands)

    Korkhov, Vladimir; Krefting, Dagmar; Montagnat, Johan; Truong Huu, Tram; Kukla, Tamas; Terstyanszky, Gabor; Manset, David; Caan, Matthan; Olabarriaga, Silvia

    2012-01-01

    Neuroimaging is a field that benefits from distributed computing infrastructures (DCIs) to perform data- and compute-intensive processing and analysis. Using grid workflow systems not only automates the processing pipelines, but also enables domain researchers to implement their expertise on how to

  18. A Survey on Smartphone-Based Crowdsensing Solutions

    Directory of Open Access Journals (Sweden)

    Willian Zamora

    2016-01-01

    Full Text Available In recent years, the widespread adoption of mobile phones, combined with the ever-increasing number of sensors that smartphones are equipped with, has greatly simplified the generalized adoption of crowdsensing solutions by reducing hardware requirements and costs to a minimum. These factors have led to an outstanding growth of crowdsensing proposals from both academia and industry. In this paper, we provide a survey of smartphone-based crowdsensing solutions that have emerged in the past few years, focusing on 64 works published in top-ranked journals and conferences. To properly analyze these previous works, we first define a reference framework based on how we classify the different proposals under study. The results of our survey show that there is still much heterogeneity in terms of technologies adopted and deployment approaches, although modular designs on both the client and server side seem to be dominant. Also, the preferred client platform is Android, while server platforms are typically web-based, and client-server communications mostly rely on XML or JSON over HTTP. The main pitfall detected concerns the performance evaluation of the different proposals, which typically fail to include a scalability analysis despite it being a critical issue when targeting very large communities of users.
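
    The dominant client-to-server pattern reported by the survey (sensor readings serialized as JSON and posted over HTTP) can be sketched as follows; the endpoint URL and payload schema are invented for illustration.

    ```python
    # Sketch of the client-to-server pattern most surveyed systems use:
    # sensor readings serialized as JSON and posted over HTTP.
    # The endpoint URL and payload schema are invented for illustration.
    import json
    import time
    import requests

    COLLECTOR_URL = "https://example.org/crowdsensing/api/readings"   # hypothetical

    reading = {
        "device_id": "phone-42",
        "timestamp": int(time.time()),
        "location": {"lat": 45.07, "lon": 7.69},
        "sensors": {"noise_db": 61.4, "battery_pct": 83},
    }

    response = requests.post(
        COLLECTOR_URL,
        data=json.dumps(reading),
        headers={"Content-Type": "application/json"},
        timeout=10,
    )
    print("server replied:", response.status_code)
    ```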

  19. A web services choreography scenario for interoperating bioinformatics applications.

    Science.gov (United States)

    de Knikker, Remko; Guo, Youjun; Li, Jin-Long; Kwan, Albert K H; Yip, Kevin Y; Cheung, David W; Cheung, Kei-Hoi

    2004-03-10

    Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: 1) the platforms on which the applications run are heterogeneous, 2) their web interface is not machine-friendly, 3) they use a non-standard format for data input and output, 4) they do not exploit standards to define application interface and message exchange, and 5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. To demonstrate the benefit of using web services over traditional web interfaces, we compare the two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH Keywords that correlates to the input and is grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate the capability of machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard coded Java application, Collaxa BPEL Server and Taverna Workbench. The Java program functions as a web services engine and interoperates with these web services using a web
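
    The document-style messaging referred to above simply means that the payload of each call is a self-contained XML document. The sketch below builds and posts such a request; the element names and the endpoint URL are invented, not the actual HAPI web service interface.

    ```python
    # Illustration of a document-style XML request of the kind such a workflow
    # exchanges: a list of microarray spot IDs posted to a web service endpoint.
    # Element names and the endpoint URL are invented, not the real HAPI interface.
    import xml.etree.ElementTree as ET
    import requests

    SERVICE_URL = "https://example.org/ws/hapi-like/analyze"   # hypothetical endpoint

    request_doc = ET.Element("AnalysisRequest")
    spot_ids = ET.SubElement(request_doc, "SpotIDs")
    for spot in ["spot_001", "spot_002", "spot_003"]:
        ET.SubElement(spot_ids, "SpotID").text = spot

    payload = ET.tostring(request_doc, encoding="utf-8", xml_declaration=True)
    response = requests.post(
        SERVICE_URL,
        data=payload,
        headers={"Content-Type": "text/xml"},
        timeout=30,
    )

    # Parse the (assumed) XML response into MeSH category / keyword pairs
    root = ET.fromstring(response.content)
    for kw in root.iter("Keyword"):
        print(kw.get("category"), "->", kw.text)
    ```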

  20. A web services choreography scenario for interoperating bioinformatics applications

    Directory of Open Access Journals (Sweden)

    Cheung David W

    2004-03-01

    Full Text Available Abstract Background Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: 1) the platforms on which the applications run are heterogeneous, 2) their web interface is not machine-friendly, 3) they use a non-standard format for data input and output, 4) they do not exploit standards to define application interface and message exchange, and 5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. Results To demonstrate the benefit of using web services over traditional web interfaces, we compare the two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH Keywords that correlates to the input and is grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate the capability of machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard coded Java application, Collaxa BPEL Server and Taverna Workbench. The Java program functions as a web services engine and interoperates

  1. A web services choreography scenario for interoperating bioinformatics applications

    Science.gov (United States)

    de Knikker, Remko; Guo, Youjun; Li, Jin-long; Kwan, Albert KH; Yip, Kevin Y; Cheung, David W; Cheung, Kei-Hoi

    2004-01-01

    Background Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: 1) the platforms on which the applications run are heterogeneous, 2) their web interface is not machine-friendly, 3) they use a non-standard format for data input and output, 4) they do not exploit standards to define application interface and message exchange, and 5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. Results To demonstrate the benefit of using web services over traditional web interfaces, we compare the two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH Keywords that correlates to the input and is grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate the capability of machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard coded Java application, Collaxa BPEL Server and Taverna Workbench. The Java program functions as a web services engine and interoperates with these web

  2. Adaptive solution of partial differential equations in multiwavelet bases

    International Nuclear Information System (INIS)

    Alpert, B.; Beylkin, G.; Gines, D.; Vozovoi, L.

    2002-01-01

    We construct multiresolution representations of derivative and exponential operators with linear boundary conditions in multiwavelet bases and use them to develop a simple, adaptive scheme for the solution of nonlinear, time-dependent partial differential equations. The emphasis on hierarchical representations of functions on intervals helps to address issues of both high-order approximation and efficient application of integral operators, and the lack of regularity of multiwavelets does not preclude their use in representing differential operators. Comparisons with finite difference, finite element, and spectral element methods are presented, as are numerical examples with the heat equation and Burgers' equation

  3. Bio-based lubricants for numerical solution of elastohydrodynamic lubrication

    Science.gov (United States)

    Cupu, Dedi Rosa Putra; Sheriff, Jamaluddin Md; Osman, Kahar

    2012-06-01

    This paper presents a programming code that provides a numerical solution of the elastohydrodynamic lubrication problem in line contacts, which is modeled as an infinite cylinder on a plane to represent the roller bearing application. In this simulation, vegetable oils are used as bio-based lubricants. Temperature is assumed to be constant at 40°C. The results show that the EHL pressure for all vegetable oils increases from the inlet to the center of the contact, then decreases slightly before rising to the peak pressure. The EHL film thickness profiles for all tested vegetable oils are almost flat in the contact region.
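
    For orientation, the isothermal line-contact EHL problem that such a code typically solves couples the Reynolds equation with a film-thickness equation containing the elastic deflection. The standard textbook form is given below; the exact formulation (and the prefactor convention for the reduced modulus E') used in the paper may differ.

    ```latex
    % Steady-state, isothermal Reynolds equation for a line contact
    \frac{\partial}{\partial x}\!\left(\frac{\rho h^{3}}{12\,\eta}\,\frac{\partial p}{\partial x}\right)
      = u_m\,\frac{\partial (\rho h)}{\partial x}

    % Film thickness: rigid gap of the equivalent cylinder plus elastic deflection
    % (the prefactor of the integral depends on how the reduced modulus E' is defined)
    h(x) = h_0 + \frac{x^{2}}{2R}
      - \frac{4}{\pi E'}\int_{-\infty}^{\infty} p(x')\,\ln\lvert x - x'\rvert \,\mathrm{d}x'
    ```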

  4. Investigation of samarium solubility in the magnesium based solid solution

    International Nuclear Information System (INIS)

    Rokhlin, L.L.; Padezhnova, E.M.; Guzej, L.S.

    1976-01-01

    Electrical resistance measurements and microscopic analysis were used to investigate the solubility of samarium in a magnesium-based solid solution. On the magnesium side, the Mg-Sm constitutional diagram is of the eutectic type, with a eutectic transformation temperature of 542 °C. Samarium is partly soluble in solid magnesium, the solubility decreasing with decreasing temperature. The maximum solubility of samarium in magnesium (at the eutectic transformation temperature) is 5.8% by mass (0.99 at.%). At 200 °C, the solubility of samarium in magnesium is 0.4% by mass (0.063 at.%).

  5. Professional SharePoint 2010 Cloud-Based Solutions

    CERN Document Server

    Fox, Steve; Stubbs, Paul; Follette, Donovan

    2011-01-01

    An authoritative guide to extending SharePoint's power with cloud-based services If you want to be part of the next major shift in the IT industry, you'll want this book. Melding two of the hottest trends in the industry—the widespread popularity of the SharePoint collaboration platform and the rapid rise of cloud computing—this practical guide shows developers how to extend their SharePoint solutions with the cloud's almost limitless capabilities. See how to get started, discover smart ways to leverage cloud data and services through Azure, start incorporating Twitter or LinkedIn

  6. Meeting People’s Needs in a Fully Interoperable Domotic Environment

    Directory of Open Access Journals (Sweden)

    Vittorio Miori

    2012-05-01

    Full Text Available The key idea underlying many Ambient Intelligence (AmI) projects and applications is context awareness, which is based mainly on their capacity to identify users and their locations. The actual computing capacity should remain in the background, in the periphery of our awareness, and should only move to the center if and when necessary. Computing thus becomes ‘invisible’, as it is embedded in the environment and everyday objects. The research project described herein aims to realize an Ambient Intelligence-based environment able to improve users’ quality of life by learning their habits and anticipating their needs. This environment is part of an adaptive, context-aware framework designed to make today’s incompatible heterogeneous domotic systems fully interoperable, not only for connecting sensors and actuators, but for providing comprehensive connections of devices to users. The solution is a middleware architecture based on open and widely recognized standards capable of abstracting the peculiarities of underlying heterogeneous technologies and enabling them to co-exist and interwork, without however eliminating their differences. At the highest level of this infrastructure, the Ambient Intelligence framework, integrated with the domotic sensors, can enable the system to recognize any unusual or dangerous situations and anticipate health problems or special user needs in a technological living environment, such as a house or a public space.
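
    The abstraction idea at the heart of such a middleware can be sketched as a set of per-technology adapters that translate native events into one common event model consumed by the Ambient Intelligence layer; the class and field names below are invented for illustration, not the project's actual middleware.

    ```python
    # Conceptual sketch of the abstraction-layer idea: per-technology adapters
    # translate native events into one common event model that the Ambient
    # Intelligence layer consumes. Names are invented, not the project's middleware.
    from dataclasses import dataclass

    @dataclass
    class DomoticEvent:
        """Technology-neutral event delivered to the AmI layer."""
        source_tech: str
        device: str
        kind: str
        value: float

    class KnxAdapter:
        def translate(self, raw: dict) -> DomoticEvent:
            # hypothetical KNX datagram fields
            return DomoticEvent("KNX", raw["group_address"], raw["dpt_name"], raw["value"])

    class ZigbeeAdapter:
        def translate(self, raw: dict) -> DomoticEvent:
            # hypothetical Zigbee attribute-report fields
            return DomoticEvent("Zigbee", raw["ieee_addr"], raw["attribute"], raw["reading"])

    def ami_handler(event: DomoticEvent) -> None:
        # The AmI layer reasons over uniform events regardless of the source bus
        if event.kind == "temperature" and event.value > 30.0:
            print(f"Possible anomaly from {event.device} ({event.source_tech}): {event.value} °C")

    for adapter, raw in [
        (KnxAdapter(), {"group_address": "1/2/3", "dpt_name": "temperature", "value": 31.5}),
        (ZigbeeAdapter(), {"ieee_addr": "00:17:88:01", "attribute": "temperature", "reading": 24.0}),
    ]:
        ami_handler(adapter.translate(raw))
    ```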

  7. Solution-Based Processing of Monodisperse Two-Dimensional Nanomaterials.

    Science.gov (United States)

    Kang, Joohoon; Sangwan, Vinod K; Wood, Joshua D; Hersam, Mark C

    2017-04-18

    Exfoliation of single-layer graphene from bulk graphite and the subsequent discovery of exotic physics and emergent phenomena in the atomically thin limit has motivated the isolation of other two-dimensional (2D) layered nanomaterials. Early work on isolated 2D nanomaterial flakes has revealed a broad range of unique physical and chemical properties with potential utility in diverse applications. For example, the electronic and optical properties of 2D nanomaterials depend strongly on atomic-scale variations in thickness, enabling enhanced performance in optoelectronic technologies such as light emitters, photodetectors, and photovoltaics. Much of the initial research on 2D nanomaterials has relied on micromechanical exfoliation, which yields high-quality 2D nanomaterial flakes that are suitable for fundamental studies but possesses limited scalability for real-world applications. In an effort to overcome this limitation, solution-processing methods for isolating large quantities of 2D nanomaterials have emerged. Importantly, solution processing results in 2D nanomaterial dispersions that are amenable to roll-to-roll fabrication methods that underlie low-cost manufacturing of thin-film transistors, transparent conductors, energy storage devices, and solar cells. Despite these advantages, solution-based exfoliation methods typically lack control over the lateral size and thickness of the resulting 2D nanomaterial flakes, resulting in polydisperse dispersions with heterogeneous properties. Therefore, post-exfoliation separation techniques are needed to achieve 2D nanomaterial dispersions with monodispersity in lateral size, thickness, and properties. In this Account, we survey the latest developments in solution-based separation methods that aim to produce monodisperse dispersions and thin films of emerging 2D nanomaterials such as graphene, boron nitride, transition metal dichalcogenides, and black phosphorus. First, we motivate the need for precise thickness

  8. Advanced solutions in combustion-based WtE technologies.

    Science.gov (United States)

    Martin, Johannes J E; Koralewska, Ralf; Wohlleben, Andreas

    2015-03-01

    Thermal treatment of waste by means of combustion in grate-based systems has gained world-wide acceptance as the preferred method for sustainable management and safe disposal of residual waste. In order to maintain this position and to address new challenges and/or priorities, these systems need to be further developed with a view to energy conservation, resource and climate protection and a reduction in the environmental impact in general. MARTIN GmbH has continuously investigated how the implementation of innovative concepts in essential parts of its grate-based Waste-to-Energy (WtE) combustion technology can be used to meet the above-mentioned requirements. As a result of these efforts, new "advanced solutions" were developed, four examples of which are shown in this article. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Two RFID-based solutions to enhance inpatient medication safety.

    Science.gov (United States)

    Chien, Hung-Yu; Yang, Chia-Chuan; Wu, Tzong-Chen; Lee, Chin-Feng

    2011-06-01

    Owing to the low cost and convenience of identifying an object without physical contact, Radio Frequency Identification (RFID) systems provide innovative, promising and efficient applications in many domains. An RFID grouping protocol is a protocol that allows an off-line verifier to collect and verify evidence that two or more tags are simultaneously present. Recently, Huang and Ku (J. Med. Syst, 2009) proposed an efficient grouping protocol to enhance medication safety for inpatients based on low-cost tags. However, the Huang-Ku scheme is not secure; an attacker can easily make up fake grouping records to cheat the verifier. This weakness would seriously endanger inpatient medication safety. This paper will show the weaknesses, and then propose two RFID-based solutions to enhance medication safety for two different scenarios. The proposed schemes are practical, secure and efficient for medication applications.
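    The grouping-proof idea behind such protocols can be illustrated with a toy sketch: each tag shares a secret key with the off-line verifier, and the collected evidence binds both tags to the same challenge, so a fabricated record fails verification. The Python sketch below is a generic illustration using HMAC, not the schemes proposed in the paper; the tag identifiers, keys and message format are invented for the example.

    ```python
    # Generic grouping-proof sketch (illustrative only, not the paper's protocols).
    # Each simulated tag shares a secret key with the off-line verifier; the reader collects
    # MACs computed over a common challenge and the partner tag's identifier, and the verifier
    # later recomputes them to confirm that both tags were present together.
    import hashlib
    import hmac
    import os

    TAG_KEYS = {"tagA": b"secret-key-A", "tagB": b"secret-key-B"}  # hypothetical pre-shared keys

    def tag_response(tag_id: str, challenge: bytes, partner_id: str) -> bytes:
        """What a tag would return: an HMAC over the challenge and its partner's identifier."""
        return hmac.new(TAG_KEYS[tag_id], challenge + partner_id.encode(), hashlib.sha256).digest()

    def build_grouping_proof() -> dict:
        challenge = os.urandom(16)  # reader-chosen nonce for this medication round
        return {
            "challenge": challenge,
            "tagA": tag_response("tagA", challenge, "tagB"),
            "tagB": tag_response("tagB", challenge, "tagA"),
        }

    def verify_grouping_proof(proof: dict) -> bool:
        """The off-line verifier recomputes both MACs; a forged record fails this check."""
        ok_a = hmac.compare_digest(proof["tagA"], tag_response("tagA", proof["challenge"], "tagB"))
        ok_b = hmac.compare_digest(proof["tagB"], tag_response("tagB", proof["challenge"], "tagA"))
        return ok_a and ok_b

    print(verify_grouping_proof(build_grouping_proof()))  # True for an honestly collected proof
    ```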

  10. Enterprise Interoperability - Proceedings of the 5th International IFIP Working Conference on Enterprise Interoperability, IWEI 2013

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Oude Luttighuis, P.H.W.M.; Folmer, Erwin Johan Albert; Bosems, S.; Unknown, [Unknown

    IWEI is an International IFIP Working Conference covering all aspects of enterprise interoperability with the purpose of achieving flexible cross-organizational collaboration through integrated support at business and technical levels. It provides a forum for discussing ideas and results among both

  11. Effect of base oil polarity on micro and nano friction behaviour of base oil +ZDDP solutions

    OpenAIRE

    Tomala, Agnieszka; Naveira Suarez, Aldara; Gebeshuber, Ilse-Christine; Pasaribu, Rihard

    2009-01-01

    Ball-on-disc tribo tests and atomic force microscopy (AFM) were used to analyze the effect of base oil polarity on the friction behaviour of steel-steel contacts lubricated with base oil + zinc dialkyldithiophosphate (ZDDP) solutions. Understanding the lubrication properties of the first chemisorbed layer of additives on work pieces yields important information for the optimization of lubrication in various solutions, in particular with regard to the type of additive and amount needed. To char...

  12. Inter-Operability of ESA Science Archives

    Science.gov (United States)

    Arviset, Christophe; Guainazzi, Matteo; Salama, Alberto; Dowson, John; Hernández, José; Osuna, Pedro; Venet, Aurèle

    ESA Science Archives for ISO and XMM-Newton have been developed by the Science Operations and Data Systems Division in Villafranca, Spain. By using an open 3-tier architecture (Data Products and Database, Business Logic, User Interface) together with Java and XML technology, inter-operability has been achieved from these archives to external archives (NED/SIMBAD, ADS, IRAS). Furthermore, that has allowed external archives (CDS, ADS, IRSA, HEASARC) to directly access ISO and XMM-Newton data without going through their standard user interfaces.

  13. RFID in libraries a step toward interoperability

    CERN Document Server

    Ayre, Lori Bowen

    2012-01-01

    The approval by The National Information Standards Organization (NISO) of a new standard for RFID in libraries is a big step toward interoperability among libraries and vendors. By following this set of practices and procedures, libraries can ensure that an RFID tag in one library can be used seamlessly by another, assuming both comply, even if they have different suppliers for tags, hardware, and software. In this issue of Library Technology Reports, Lori Bowen Ayre, an experienced implementer of automated materials handling systems, provides background on the evolution of the standard

  14. AliEn - EDG Interoperability in ALICE

    OpenAIRE

    Bagnasco, S.; Barbera, R.; Buncic, P.; Carminati, F.; Cerello, P.; Saiz, P.

    2003-01-01

    AliEn (ALICE Environment) is a GRID-like system for large scale job submission and distributed data management developed and used in the context of ALICE, the CERN LHC heavy-ion experiment. With the aim of exploiting upcoming Grid resources to run AliEn-managed jobs and store the produced data, the problem of AliEn-EDG interoperability was addressed and an interface was designed. One or more EDG (European Data Grid) User Interface machines run the AliEn software suite (Cluster Monitor, Stora...

  15. Interoperable PKI Data Distribution in Computational Grids

    Energy Technology Data Exchange (ETDEWEB)

    Pala, Massimiliano; Cholia, Shreyas; Rea, Scott A.; Smith, Sean W.

    2008-07-25

    One of the most successful working examples of virtual organizations, computational grids need authentication mechanisms that inter-operate across domain boundaries. Public Key Infrastructures (PKIs) provide sufficient flexibility to allow resource managers to securely grant access to their systems in such distributed environments. However, as PKIs grow and services are added to enhance both security and usability, users and applications must struggle to discover available resources, particularly when the Certification Authority (CA) is alien to the relying party. This article presents how to overcome these limitations of the current grid authentication model by integrating the PKI Resource Query Protocol (PRQP) into the Grid Security Infrastructure (GSI).

  16. Web services for distributed and interoperable hydro-information systems

    Science.gov (United States)

    Horak, J.; Orlik, A.; Stromsky, J.

    2008-03-01

    Web services support the integration and interoperability of Web-based applications and enable machine-to-machine interaction. The concepts of web services and open distributed architecture were applied to the development of T-DSS, the prototype customised for web based hydro-information systems. T-DSS provides mapping services, database related services and access to remote components, with special emphasis placed on the output flexibility (e.g. multilingualism), where SOAP web services are mainly used for communication. The remote components are represented above all by remote data and mapping services (e.g. meteorological predictions), modelling and analytical systems (currently HEC-HMS, MODFLOW and additional utilities), which support decision making in water management.
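    As a rough illustration of how a client could consume such SOAP services, the sketch below uses the zeep library against a hypothetical WSDL; the endpoint URL, operation name and parameters are placeholders rather than the actual T-DSS interfaces, and only the calling pattern is meant to be instructive.

    ```python
    # Minimal SOAP client sketch with zeep; the WSDL URL and operation are hypothetical.
    from zeep import Client

    WSDL_URL = "https://example.org/tdss/services/forecast?wsdl"  # placeholder endpoint

    client = Client(WSDL_URL)
    client.wsdl.dump()  # print the services, ports and operations advertised by the WSDL

    # Hypothetical operation: request a meteorological prediction for one station and ask
    # for an English response (the abstract stresses multilingual output flexibility).
    result = client.service.GetForecast(stationId="ST-001", language="en")
    print(result)
    ```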

  17. CAD Services: an Industry Standard Interface for Mechanical CAD Interoperability

    Science.gov (United States)

    Claus, Russell; Weitzer, Ilan

    2002-01-01

    Most organizations seek to design and develop new products in increasingly shorter time periods. At the same time, increased performance demands require a team-based multidisciplinary design process that may span several organizations. One approach to meet these demands is to use 'Geometry Centric' design. In this approach, design engineers team their efforts through one united representation of the design that is usually captured in a CAD system. Standards-based interfaces are critical to provide uniform, simple, distributed services that enable the 'Geometry Centric' design approach. This paper describes an industry-wide effort, under the Object Management Group's (OMG) Manufacturing Domain Task Force, to define interfaces that enable the interoperability of CAD, Computer Aided Manufacturing (CAM), and Computer Aided Engineering (CAE) tools. This critical link to enable 'Geometry Centric' design is called CAD Services V1.0. This paper discusses the features of this standard and proposed application.

  18. The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications

    Directory of Open Access Journals (Sweden)

    Katayama Toshiaki

    2011-08-01

    Full Text Available Abstract Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in (i) a workflow to annotate 100,000 sequences from an invertebrate species; (ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; (iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; (iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: (i) the absence of several useful data or analysis functions in the Web service "space"; (ii) the lack of documentation of methods; (iii) lack of

  19. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things

    Science.gov (United States)

    Jazayeri, Mohammad Ali; Liang, Steve H. L.; Huang, Chih-Yuan

    2015-01-01

    Recently, researchers are focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their proprietary protocols based on their targeted applications. Consequently, IoT becomes heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step to communicate with various IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standard-based solutions to better understand the feasibility of applying existing standards to the IoT vision. PMID:26402683

  20. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things.

    Science.gov (United States)

    Jazayeri, Mohammad Ali; Liang, Steve H L; Huang, Chih-Yuan

    2015-09-22

    Recently, researchers are focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their proprietary protocols based on their targeted applications. Consequently, IoT becomes heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step to communicate with various IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standard-based solutions to better understand the feasibility of applying existing standards to the IoT vision.
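    For readers unfamiliar with the OGC SensorThings API mentioned in the two records above, the sketch below shows the typical read pattern over its REST resource model (Things, Datastreams, Observations) using OData-style query options; the base URL is a placeholder standing in for any SensorThings v1.0 endpoint.

    ```python
    # Read observations through the SensorThings data model: Things -> Datastreams -> Observations.
    # The base URL is a placeholder; the resource paths and $expand/$top/$orderby options are
    # part of the SensorThings v1.0 specification.
    import requests

    BASE = "https://example.org/SensorThingsService/v1.0"  # hypothetical server

    # List a few Things together with their Datastreams.
    things = requests.get(f"{BASE}/Things", params={"$expand": "Datastreams", "$top": 5}).json()
    for thing in things.get("value", []):
        print(thing["name"])
        for ds in thing.get("Datastreams", []):
            # Fetch the most recent observation of each datastream.
            obs_url = f"{BASE}/Datastreams({ds['@iot.id']})/Observations"
            latest = requests.get(obs_url, params={"$top": 1, "$orderby": "phenomenonTime desc"}).json()
            for obs in latest.get("value", []):
                print("  ", ds["name"], obs["phenomenonTime"], obs["result"])
    ```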

  1. SOLUTIONING

    Directory of Open Access Journals (Sweden)

    Maria de Hoyos Guajardo, Ph.D. Candidate, M.Sc., B.Eng.

    2004-11-01

    Full Text Available The theory that is presented below aims to conceptualise how a group of undergraduate students tackle non-routine mathematical problems during a problem-solving course. The aim of the course is to allow students to experience mathematics as a creative process and to reflect on their own experience. During the course, students are required to produce a written ‘rubric’ of their work, i.e., to document their thoughts as they occur as well as their emotions during the process. These ‘rubrics’ were used as the main source of data. Students’ problem-solving processes can be explained as a three-stage process that has been called ‘solutioning’. This process is presented in the six sections below. The first three refer to a common area of concern that can be called ‘generating knowledge’. In this way, generating knowledge also includes issues related to ‘key ideas’ and ‘gaining understanding’. The third and the fourth sections refer to ‘generating’ and ‘validating a solution’, respectively. Finally, once solutions are generated and validated, students usually try to improve them further before presenting them as final results. Thus, the last section deals with ‘improving a solution’. Although not all students go through all of the stages, it may be said that ‘solutioning’ considers students’ main concerns as they tackle non-routine mathematical problems.

  2. Design and study of geosciences data share platform :platform framework, data interoperability, share approach

    Science.gov (United States)

    Lu, H.; Yi, D.

    2010-12-01

    Deep exploration is one of the important approaches in geoscience research. Since the 1980s such exploration has been under way and has produced a large amount of data. Researchers usually integrate data from both space exploration and deep exploration to study geological structures and represent the Earth’s subsurface, and then analyze and interpret the integrated data. Because the data come from different exploration methods, they are heterogeneous, and data access and integration have remained a persistent and confusing issue for researchers. The problem of data sharing and interaction had to be solved during the development of the SinoProbe research project. Drawing on well-known domestic and overseas exploration projects and geoscience data platforms, this work explores a solution for data sharing and interaction. Based on SOA, we present a deep-exploration data-sharing framework comprising three levels: the data level handles data storage and the integration of heterogeneous data; the middle level provides geophysics, geochemistry and other data services by means of Web services and supports the composition of applications through GIS middleware and the Eclipse RCP; the interaction level gives professional and non-professional users access to data of different accuracy. The framework adopts the GeoSciML data interaction approach. GeoSciML is a geoscience information markup language built as an application of the OpenGIS Consortium’s (OGC) Geography Markup Language (GML). It maps heterogeneous data into a single Earth frame and enables interoperation. In this article we discuss how heterogeneous data are integrated and shared in the SinoProbe project.
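    To give a flavour of the GML-based exchange that GeoSciML builds on, the sketch below parses a tiny, simplified feature document with namespace-aware ElementTree calls; the inline document and the gsml namespace URI are invented stand-ins for illustration, not the real GeoSciML schema.

    ```python
    # Toy namespace-aware parsing of a GML-style feature collection (simplified placeholder,
    # not actual GeoSciML); only the parsing pattern is the point.
    import xml.etree.ElementTree as ET

    DOC = """
    <wfs:FeatureCollection xmlns:wfs="http://www.opengis.net/wfs"
                           xmlns:gml="http://www.opengis.net/gml"
                           xmlns:gsml="http://example.org/geosciml-like">
      <gml:featureMember>
        <gsml:GeologicUnit>
          <gml:name>Example granite body</gml:name>
          <gsml:observationMethod>deep seismic reflection</gsml:observationMethod>
        </gsml:GeologicUnit>
      </gml:featureMember>
    </wfs:FeatureCollection>
    """

    NS = {
        "gml": "http://www.opengis.net/gml",
        "gsml": "http://example.org/geosciml-like",
    }

    root = ET.fromstring(DOC)
    for unit in root.findall(".//gsml:GeologicUnit", NS):
        name = unit.findtext("gml:name", namespaces=NS)
        method = unit.findtext("gsml:observationMethod", namespaces=NS)
        print(name, "-", method)
    ```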

  3. A Collaborative System Software Solution for Modeling Business Flows Based on Automated Semantic Web Service Composition

    Directory of Open Access Journals (Sweden)

    Ion SMEUREANU

    2009-01-01

    Full Text Available Nowadays, business interoperability is one of the key factors for assuring competitive advantage for the participant business partners. In order to implement business cooperation, scalable, distributed and portable collaborative systems have to be implemented. This article presents some of the most widely used technologies in this field. Furthermore, it presents a software application architecture based on the Business Process Modeling Notation standard and automated semantic web service coupling for modeling business flow in a collaborative manner. The main business processes will be represented in a single, hierarchic flow diagram. Each element of the diagram will represent calls to semantic web services. The business logic (the business rules and constraints) will be structured with the help of OWL (Ontology Web Language). Moreover, OWL will also be used to create the semantic web service specifications.
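    As a concrete illustration of structuring such business logic with OWL, the sketch below builds a miniature ontology with the rdflib library and queries it with SPARQL; the ontology namespace, classes and property are invented for the example and are not taken from the article.

    ```python
    # Miniature OWL vocabulary built and queried with rdflib (illustrative names only).
    from rdflib import Graph, Literal, Namespace, RDF, RDFS
    from rdflib.namespace import OWL

    EX = Namespace("http://example.org/business-flow#")  # hypothetical ontology namespace
    g = Graph()
    g.bind("ex", EX)

    # Two classes and one property standing in for business-rule concepts.
    g.add((EX.BusinessProcess, RDF.type, OWL.Class))
    g.add((EX.WebServiceCall, RDF.type, OWL.Class))
    g.add((EX.invokes, RDF.type, OWL.ObjectProperty))
    g.add((EX.invokes, RDFS.domain, EX.BusinessProcess))
    g.add((EX.invokes, RDFS.range, EX.WebServiceCall))

    # One instance: an order-handling process that invokes a billing service.
    g.add((EX.OrderHandling, RDF.type, EX.BusinessProcess))
    g.add((EX.BillingService, RDF.type, EX.WebServiceCall))
    g.add((EX.BillingService, RDFS.label, Literal("Billing web service")))
    g.add((EX.OrderHandling, EX.invokes, EX.BillingService))

    # SPARQL: which services does each process invoke?
    query = """
        SELECT ?process ?service WHERE {
            ?process a ex:BusinessProcess ;
                     ex:invokes ?service .
        }
    """
    for row in g.query(query, initNs={"ex": EX}):
        print(row.process, "->", row.service)
    ```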

  4. Semantic and Syntactic Object Correlation in the Object-Oriented Method for Interoperability

    National Research Council Canada - National Science Library

    Shedd, Stephen

    2002-01-01

    In today’s military, interoperability is not a luxury; it is a necessity. Unfortunately, differences in data representation between various systems greatly complicate the task of achieving interoperability...

  5. Establishing Interoperability of a Blog Archive through Linked Open Data

    DEFF Research Database (Denmark)

    Kalb, Hendrik; Lazaridou, Paraskevi; Trier, Matthias

    2013-01-01

    on archived data. However, interoperability among BlogForever archives, as well as with other digital libraries, is necessary in order to avoid silos of data. In this paper, we reveal some of our efforts to establish interoperability through the application of Linked Open Data....
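    The Linked Open Data access pattern referred to above can be illustrated with the SPARQLWrapper library; the query below runs against the public DBpedia endpoint purely as an example of the mechanics, since the archive's own endpoint and vocabulary are not specified in the record.

    ```python
    # Example SPARQL request against a public Linked Open Data endpoint (DBpedia); an
    # interoperable blog archive exposing RDF could be queried with the same pattern.
    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")
    sparql.setQuery("""
        PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
        SELECT ?label WHERE {
            <http://dbpedia.org/resource/Blog> rdfs:label ?label .
            FILTER (lang(?label) = "en")
        }
    """)
    sparql.setReturnFormat(JSON)

    results = sparql.query().convert()
    for binding in results["results"]["bindings"]:
        print(binding["label"]["value"])
    ```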

  6. Reference architecture for interoperability testing of Electric Vehicle charging

    NARCIS (Netherlands)

    Lehfuss, F.; Nohrer, M.; Werkmany, E.; Lopezz, J.A.; Zabalaz, E.

    2015-01-01

    This paper presents a reference architecture for interoperability testing of electric vehicles as well as their support equipment with the smart grid and the e-Mobility environment. Pan-European Electric Vehicle (EV)-charging is currently problematic as there are compliance and interoperability

  7. Interactive test tool for interoperable C-ITS development

    NARCIS (Netherlands)

    Voronov, A.; Englund, C.; Bengtsson, H.H.; Chen, L.; Ploeg, J.; Jongh, J.F.C.M. de; Sluis, H.J.D. van de

    2015-01-01

    This paper presents the architecture of an Interactive Test Tool (ITT) for interoperability testing of Cooperative Intelligent Transport Systems (C-ITS). Cooperative systems are developed by different manufacturers at different locations, which makes interoperability testing a tedious task. Up until

  8. A maturity model for interoperability in eHealth

    NARCIS (Netherlands)

    van Velsen, Lex Stefan; Oude Nijeweme-d'Hollosy, Wendeline; Hermens, Hermanus J.

    2016-01-01

    Interoperability, the ability of different technological applications to exchange data, is viewed by many as an important goal for eHealth, as it can save money and improve the quality of care and patient safety. However, creating an interoperable infrastructure for eHealth is a difficult task. In

  9. Connected Lighting System Interoperability Study Part 1: Application Programming Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Gaidon, Clement [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Poplawski, Michael [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-10-31

    First in a series of studies that focuses on interoperability as realized by the use of Application Programming Interfaces (APIs), explores the diversity of such interfaces in several connected lighting systems; characterizes the extent of interoperability that they provide; and illustrates challenges, limitations, and tradeoffs that were encountered during this exploration.

  10. Interoperability of Demand Response Resources Demonstration in NY

    Energy Technology Data Exchange (ETDEWEB)

    Wellington, Andre

    2014-03-31

    The Interoperability of Demand Response Resources Demonstration in NY (Interoperability Project) was awarded to Con Edison in 2009. The objective of the project was to develop and demonstrate methodologies to enhance the ability of customer sited Demand Response resources to integrate more effectively with electric delivery companies and regional transmission organizations.

  11. Evolving Interoperable Data Systems Through Regional Collaborations

    Science.gov (United States)

    Howard, M. K.

    2008-12-01

    The Gulf of Mexico Coastal Ocean Observing System (GCOOS) is a federation of independent sub-regional observing systems. Most of these systems were in operation long before the Integrated Ocean Observing System (IOOS) Data Management and Communications (DMAC) guidelines were established. Hence, each local data management system evolved independently and interoperability was never a consideration. Achieving the goal of building an automated and largely unattended machine-to-machine interoperable data system for the region has proven to be more than a resource and technological challenge. Challenges also fall within the organizational and cultural realms. In 2008 NOAA funds were used to build the first instance of a GCOOS regional data portal and to harmonize the local data management systems of ten principal sub-regional data providers. Early efforts were focused on regional data catalogs, adoption of a common vocabulary for parameters, and deploying data service access points using common interfaces. This was done in full partnership between the data providers and the portal builders with the intent that local data providers remain independent nodes capable of participating in the vision of IOOS on their own. The data portal serves the region primarily as a central point for the fusion of data and products.

  12. Open Health Tools: Tooling for Interoperable Healthcare

    Directory of Open Access Journals (Sweden)

    Skip McGaughey

    2008-11-01

    Full Text Available The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will "raise the interoperability bar" as a result of having tools that just work. To achieve these lofty goals, careful consideration must be made to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT's impact to these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase "code is king" underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  13. Trust and Privacy Solutions Based on Holistic Service Requirements

    Science.gov (United States)

    Sánchez Alcón, José Antonio; López, Lourdes; Martínez, José-Fernán; Rubio Cifuentes, Gregorio

    2015-01-01

    The products and services designed for Smart Cities provide the necessary tools to improve the management of modern cities in a more efficient way. These tools need to gather citizens’ information about their activity, preferences, habits, etc., opening up the possibility of tracking them. Thus, privacy and security policies must be developed in order to satisfy and manage the legislative heterogeneity surrounding the services provided and comply with the laws of the country where they are provided. This paper presents one of the possible solutions to manage this heterogeneity, bearing in mind these types of networks, such as Wireless Sensor Networks, have important resource limitations. A knowledge and ontology management system is proposed to facilitate the collaboration between the business, legal and technological areas. This will ease the implementation of adequate specific security and privacy policies for a given service. All these security and privacy policies are based on the information provided by the deployed platforms and by expert system processing. PMID:26712752

  14. Upon a Home Assistant Solution Based on Raspberry Pi Platform

    Directory of Open Access Journals (Sweden)

    Alexandru Florentin IFTIMIE

    2017-01-01

    Full Text Available Our ongoing research on the Internet of Things (IoT) has been focused on a project aimed at creating a proof of concept for a distributed system capable of controlling common devices found in a house such as TVs, air conditioning units, and other electrical devices. In order to automate these devices, the system integrates various sensors and actuators and, depending on the user’s needs and creativity in conceiving and implementing new commands, the system is able to handle and execute the respective commands in a safe and secure manner. This paper presents our current research results on a personal home assistant solution designed and built around the Raspberry Pi V3 platform. The distributed, client-server approach enables users to control home electric and electronic devices from an Android based mobile application.
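    A hypothetical sketch of the server side of such a system is shown below (it is not the paper's actual design): a small Flask API running on the Raspberry Pi exposes a relay-controlled device that the Android client can switch over HTTP. The GPIO pin number, route and device name are placeholders.

    ```python
    # Hypothetical home-assistant endpoint: Flask + RPi.GPIO on a Raspberry Pi.
    # Pin, route and device name are placeholders for illustration.
    from flask import Flask, jsonify
    import RPi.GPIO as GPIO

    RELAY_PIN = 17  # hypothetical BCM pin driving a relay

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(RELAY_PIN, GPIO.OUT, initial=GPIO.LOW)

    app = Flask(__name__)

    @app.route("/device/lamp/<state>", methods=["POST"])
    def switch_lamp(state: str):
        """Turn the relay on or off depending on the requested state."""
        GPIO.output(RELAY_PIN, GPIO.HIGH if state == "on" else GPIO.LOW)
        return jsonify({"device": "lamp", "state": state})

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=5000)  # reachable by the Android client on the LAN
    ```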

  15. Common business objects: Demonstrating interoperability in the oil and gas industry

    International Nuclear Information System (INIS)

    McLellan, S.G.; Abusalbi, N.; Brown, J.; Quinlivan, W.F.

    1997-01-01

    The PetroTechnical Open Software Corp. (POSC) was organized in 1990 to define technical methods to make it easier to design interoperable data solutions for oil and gas companies. When POSC rolls out seed implementations, oilfield service members must validate them, correct any errors or ambiguities, and champion these corrections into the original specifications before full integration into POSC-compliant, commercial products. Organizations like POSC are assuming a new role of promoting formation of projects where E and P companies and vendors jointly test their pieces of the migration puzzle on small subsets of the whole problem. The authors describe three such joint projects. While confirming the value of such open cross-company cooperation, these cases also help to redefine interoperability in terms of business objects that will be common across oilfield companies, their applications, access software, data, or data stores

  16. Gastric Outlet Obstruction Palliation: A Novel Stent-Based Solution

    Directory of Open Access Journals (Sweden)

    Natasha M. Rueth

    2010-06-01

    Full Text Available Gastric outlet obstruction (GOO after esophagectomy is a morbid outcome and significantly hinders quality of life for end-stage esophageal cancer patients. In the pre-stent era, palliation consisted of chemotherapy, radiation, tumor ablation, or stricture dilation. In the current era, palliative stenting has emerged as an additional tool; however, migration and tumor ingrowth are ongoing challenges. To mitigate these challenges, we developed a novel, hybrid, stent-based approach for the palliative management of GOO. We present a patient with esophageal cancer diagnosed with recurrent, metastatic disease 1 year after esophagectomy. She developed dehydration and intractable emesis, which significantly interfered with her quality of life. For palliation, we dilated the stenosis and proceeded with our stent-based solution. Using a combined endoscopic and fluoroscopic approach, we placed a 12-mm silicone salivary bypass tube across the pylorus, where it kinked slightly because of local tumor biology. To bridge this defect and ensure luminal patency, we placed a nitinol tracheobronchial stent through the silicone stent. Clinically, the patient had immediate relief from her pre-operative symptoms and was discharged home on a liquid diet. In conclusion, GOO and malignant dysphagia after esophagectomy are significant challenges for patients with end-stage disease. Palliative stenting is a viable option, but migration and tumor ingrowth are common complications. The hybrid approach presented here provides a unique solution to these potential pitfalls. The flared silicone tube minimized the chance of migration and impaired tumor ingrowth. The nitinol stent aided with patency and overcame the challenges of the soft tube. This novel strategy achieved palliation, describing another endoscopic option in the treatment of malignant GOO.

  17. Thermodynamics of dilute aqueous solutions of imidazolium based ionic liquids

    International Nuclear Information System (INIS)

    Singh, Tejwant; Kumar, Arvind

    2011-01-01

    Research highlights: → The thermodynamic behaviour of aqueous imidazolium ILs has been investigated. → Volumetric and ultrasonic results indicated the hydrophobic hydration of ILs. → Viscometric studies revealed the studied ionic liquids to be water-structure makers. → Hydration number increased with increasing alkyl chain length of the cation. - Abstract: Experimental measurements of density ρ, speed of sound u, and viscosity η of aqueous solutions of various 1-alkyl-3-methylimidazolium based ionic liquids (ILs) have been performed in the dilute concentration regime at 298.15 K to gain insight into the hydration behaviour of ILs. The investigated ILs are based on the 1-alkyl-3-methylimidazolium cation, [Cnmim], with [BF4]-, [Cl]-, [C1OSO3]-, and [C8OSO3]- as anions, where n = 4 or 8. Several thermodynamic parameters such as the apparent molar volume φV, isentropic compressibility βs, and viscosity B-coefficients have been derived from the experimental data. The limiting value of the apparent molar volume has been discussed in terms of the intrinsic molar volume (Vint), molar electrostriction volume (Velec), molar disordered volume (Vdis), and cage volume (Vcage). Viscosity B-coefficients have been used to quantify the kosmotropic or chaotropic nature of the ILs. Hydration numbers of the ILs obtained using electrostriction volume, isentropic compressibility, viscosity, and differential scanning calorimetry have been found to be comparable within the experimental error. Hydrophobic hydration has been found to play a more important role in the hydration of the ILs than hydration due to hydrogen bonding and electrostriction. Limiting molar properties, hydration numbers, and B-coefficients have been discussed in terms of the alkyl chain length of the cation and the nature of the anion.
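    For reference, the derived quantities named above are usually obtained from the measured density and speed of sound with the standard working relations below (written for densities in g·cm⁻³ and molality m in mol·kg⁻¹); these are textbook definitions, not equations quoted from the paper.

    ```latex
    \[
      \phi_V \;=\; \frac{M}{\rho} \;-\; \frac{1000\,(\rho - \rho_0)}{m\,\rho\,\rho_0},
      \qquad
      \beta_s \;=\; \frac{1}{u^{2}\rho},
    \]
    ```

    where M is the molar mass of the solute, m the molality, ρ and ρ₀ the densities of the solution and of pure water, and u the speed of sound in the solution (the Newton-Laplace relation for βs).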

  18. PERSPECTIVES ON INTEROPERABILITY INTEGRATION WITHIN NATO DEFENSE PLANNING PROCESS

    Directory of Open Access Journals (Sweden)

    Florian CIOCAN

    2011-01-01

    Full Text Available Interoperability is not a new area of effort at NATO level. In fact, interoperability and more specifically standardization, has been a key element of the Alliance’s approach to fielding forces for decades. But as the security and operational environment has been in a continuous change, the need to face the new threats and the current involvement in challenging operations in Afghanistan and elsewhere alongside with the necessity to interoperate at lower and lower levels of command with an increasing number of nations, including non-NATO ISAF partners, NGOs, and other organizations, have made the task even more challenging. In this respect Interoperability Integration within NATO Defense Planning Process will facilitate the timely identification, development and delivery of required forces and capabilities that are interoperable and adequately prepared, equipped, trained and supported to undertake the Alliance’s full spectrum of missions.

  19. Nature-based solutions for resilient landscapes and cities.

    Science.gov (United States)

    Lafortezza, Raffaele; Chen, Jiquan; van den Bosch, Cecil Konijnendijk; Randrup, Thomas B

    2017-12-04

    Nature-based solutions (NBS) are increasingly applied to guide the design of resilient landscapes and cities to enable them to reach economic development goals with beneficial outcomes for the environment and society. The NBS concept is closely related to other concepts including sustainability, resilience, ecosystem services, coupled human and environment, and green (blue) infrastructure; however, NBS represent a more efficient and cost-effective approach to development than traditional approaches. The European Commission is actively engaged in investing in NBS as a driver in developing ecosystem services-based approaches throughout Europe and the world. The pool of knowledge and expertise presented in this Special Issue of Environmental Research highlights the applications of NBS as 'living' and adaptable tools to boost the capacity of landscapes and cities to face today's critical environmental, economic and societal challenges. Based on the literature and papers of this Special Issue, we propose five specific challenges for the future of NBS. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Trust Model to Enhance Security and Interoperability of Cloud Environment

    Science.gov (United States)

    Li, Wenjuan; Ping, Lingdi

    Trust is one of the most important means to improve security and enable interoperability of current heterogeneous independent cloud platforms. This paper first analyzed several trust models used in large and distributed environments and then introduced a novel cloud trust model to solve security issues in cross-clouds environments, in which cloud customers can choose services from different providers and resources in heterogeneous domains can cooperate. The model is domain-based. It divides one cloud provider's resource nodes into the same domain and sets a trust agent. It distinguishes two different roles, cloud customer and cloud server, and designs different strategies for them. In our model, trust recommendation is treated as one type of cloud service, just like computation or storage. The model achieves both identity authentication and behavior authentication. The results of emulation experiments show that the proposed model can efficiently and safely construct trust relationships in cross-clouds environments.
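    A generic illustration of the kind of trust combination such models rely on is sketched below; the weighting of direct experience against agent recommendations and the acceptance threshold are arbitrary placeholders and do not reproduce the paper's actual model.

    ```python
    # Generic trust-combination sketch (illustrative only, not the paper's model): mix
    # first-hand trust in a resource node with recommendations relayed by a trust agent
    # from another domain, then decide whether to use the node.
    def combined_trust(direct: float, recommendations: list[float], alpha: float = 0.6) -> float:
        """Weighted mix of direct trust and averaged recommended trust (all values in [0, 1])."""
        recommended = sum(recommendations) / len(recommendations) if recommendations else 0.0
        return alpha * direct + (1.0 - alpha) * recommended

    def accept_node(direct: float, recommendations: list[float], threshold: float = 0.7) -> bool:
        """Accept a node in a foreign domain only if its combined trust clears the threshold."""
        return combined_trust(direct, recommendations) >= threshold

    # A node used successfully before, with mixed third-party recommendations.
    print(accept_node(direct=0.9, recommendations=[0.8, 0.6, 0.7]))  # True
    # An unknown node backed only by weak recommendations.
    print(accept_node(direct=0.2, recommendations=[0.5, 0.4]))       # False
    ```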

  1. Adaptation of interoperability standards for cross domain usage

    Science.gov (United States)

    Essendorfer, B.; Kerth, Christian; Zaschke, Christian

    2017-05-01

    As globalization affects most aspects of modern life, challenges of quick and flexible data sharing apply to many different domains. To protect a nation's security, for example, one has to look well beyond borders and understand economical, ecological, cultural as well as historical influences. Most of the time information is produced and stored digitally, and one of the biggest challenges is to retrieve relevant, readable information applicable to a specific problem out of a large data stock at the right time. These challenges to enable data sharing across national, organizational and systems borders are known to other domains (e.g., ecology or medicine) as well. Solutions like specific standards have been worked on for the specific problems. The question is: what can the different domains learn from each other and do we have solutions when we need to interlink the information produced in these domains? A known problem is to make civil security data available to the military domain and vice versa in collaborative operations. But what happens if an environmental crisis leads to the need to quickly cooperate with civil or military security in order to save lives? How can we achieve interoperability in such complex scenarios? The paper introduces an approach to adapt standards from one domain to another and outlines problems that have to be overcome and limitations that may apply.

  2. Optimisation-Based Solution Methods for Set Partitioning Models

    DEFF Research Database (Denmark)

    Rasmussen, Matias Sevel

    The scheduling of crew, i.e. the construction of work schedules for crew members, is often not a trivial task, but a complex puzzle. The task is complicated by rules, restrictions, and preferences. Therefore, manual solutions as well as solutions from standard software packages are not always sufficient with respect to solution quality and solution time. Enhancement of the overall solution quality as well as the solution time can be of vital importance to many organisations. The fields of operations research and mathematical optimisation deal with mathematical modelling of difficult scheduling problems (among other topics). The fields also deal with the development of sophisticated solution methods for these mathematical models. This thesis describes the set partitioning model which has been widely used for modelling crew scheduling problems. Integer properties for the set partitioning model are shown...
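    For readers outside operations research, the set partitioning model mentioned above has the standard form below: choose a minimum-cost set of candidate schedules (columns) so that every duty (row) is covered exactly once. This is the textbook formulation, not notation taken from the thesis itself.

    ```latex
    \begin{align*}
      \min_{x}\quad & \sum_{j \in J} c_j x_j \\
      \text{s.t.}\quad & \sum_{j \in J} a_{ij} x_j = 1 \qquad \forall i \in I, \\
      & x_j \in \{0, 1\} \qquad\;\;\, \forall j \in J,
    \end{align*}
    ```

    where a_ij = 1 if candidate schedule j covers duty i (and 0 otherwise) and c_j is the cost of schedule j.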

  3. Agile Management and Interoperability Testing of SDN/NFV‐Enriched 5G Core Networks

    Directory of Open Access Journals (Sweden)

    Taesang Choi

    2018-02-01

    Full Text Available In the fifth generation (5G era, the radio internet protocol capacity is expected to reach 20 Gb/s per sector, and ultralarge content traffic will travel across a faster wireless/wireline access network and packet core network. Moreover, the massive and mission‐critical Internet of Things is the main differentiator of 5G services. These types of real‐time and large‐bandwidth‐consuming services require a radio latency of less than 1 ms and an end‐to‐end latency of less than a few milliseconds. By distributing 5G core nodes closer to cell sites, the backhaul traffic volume and latency can be significantly reduced by having mobile devices download content immediately from a closer content server. In this paper, we propose a novel solution based on software‐defined network and network function virtualization technologies in order to achieve agile management of 5G core network functionalities with a proof‐of‐concept implementation targeted for the PyeongChang Winter Olympics and describe the results of interoperability testing experiences between two core networks.

  4. Analysis of Android Device-Based Solutions for Fall Detection.

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-07-23

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  5. Secure Interoperable Open Smart Grid Demonstration Project

    Energy Technology Data Exchange (ETDEWEB)

    Magee, Thoman [Consolidated Edison Company Of New York, Inc., NY (United States)

    2014-12-28

    The Consolidated Edison, Inc., of New York (Con Edison) Secure Interoperable Open Smart Grid Demonstration Project (SGDP), sponsored by the United States (US) Department of Energy (DOE), demonstrated that the reliability, efficiency, and flexibility of the grid can be improved through a combination of enhanced monitoring and control capabilities using systems and resources that interoperate within a secure services framework. The project demonstrated the capability to shift, balance, and reduce load where and when needed in response to system contingencies or emergencies by leveraging controllable field assets. The range of field assets includes curtailable customer loads, distributed generation (DG), battery storage, electric vehicle (EV) charging stations, building management systems (BMS), home area networks (HANs), high-voltage monitoring, and advanced metering infrastructure (AMI). The SGDP enables the seamless integration and control of these field assets through a common, cyber-secure, interoperable control platform, which integrates a number of existing legacy control and data systems, as well as new smart grid (SG) systems and applications. By integrating advanced technologies for monitoring and control, the SGDP helps target and reduce peak load growth, improves the reliability and efficiency of Con Edison’s grid, and increases the ability to accommodate the growing use of distributed resources. Con Edison is dedicated to lowering costs, improving reliability and customer service, and reducing its impact on the environment for its customers. These objectives also align with the policy objectives of New York State as a whole. To help meet these objectives, Con Edison’s long-term vision for the distribution grid relies on the successful integration and control of a growing penetration of distributed resources, including demand response (DR) resources, battery storage units, and DG. For example, Con Edison is expecting significant long-term growth of DG

  6. Developing data interoperability using standards: A wheat community use case [version 2; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Esther Dzale Yeumo

    2017-12-01

    Full Text Available In this article, we present a joint effort of the wheat research community, along with data and ontology experts, to develop wheat data interoperability guidelines. Interoperability is the ability of two or more systems and devices to cooperate and exchange data, and interpret that shared information. Interoperability is a growing concern to the wheat scientific community, and agriculture in general, as the need to interpret the deluge of data obtained through high-throughput technologies grows. Agreeing on common data formats, metadata, and vocabulary standards is an important step to obtain the required data interoperability level in order to add value by encouraging data sharing, and subsequently facilitate the extraction of new information from existing and new datasets. During a period of more than 18 months, the RDA Wheat Data Interoperability Working Group (WDI-WG) surveyed the wheat research community about the use of data standards, then discussed and selected a set of recommendations based on consensual criteria. The recommendations promote standards for data types identified by the wheat research community as the most important for the coming years: nucleotide sequence variants, genome annotations, phenotypes, germplasm data, gene expression experiments, and physical maps. For each of these data types, the guidelines recommend best practices in terms of use of data formats, metadata standards and ontologies. In addition to the best practices, the guidelines provide examples of tools and implementations that are likely to facilitate the adoption of the recommendations. To maximize the adoption of the recommendations, the WDI-WG used a community-driven approach that involved the wheat research community from the start, took into account their needs and practices, and provided them with a framework to keep the recommendations up to date. We also report this approach’s potential to be generalizable to other (agricultural) domains.

  7. Middleware Interoperability for Robotics: A ROS-YARP Framework

    Directory of Open Access Journals (Sweden)

    Plinio Moreno

    2016-10-01

    Full Text Available Middlewares are fundamental tools for progress in research and applications in robotics. They enable the integration of multiple heterogeneous sensing and actuation devices, as well as providing general purpose modules for key robotics functions (kinematics, navigation, planning). However, no existing middleware yet provides a complete set of functionalities for all robotics applications, and many robots may need to rely on more than one framework. This paper focuses on the interoperability between two of the most prevalent middleware in robotics: YARP and ROS. Interoperability between middlewares should ideally allow users to execute existing software without the necessity of: (i) changing the existing code, and (ii) writing hand-coded "bridges" for each use-case. We propose a framework enabling the communication between existing YARP modules and ROS nodes for robotics applications in an automated way. Our approach generates the "bridging gap" code from a configuration file, connecting YARP ports and ROS topics through code-generated YARP Bottles. The configuration file must describe: (i) the sender entities, (ii) the way to group and convert the information read from the sender, (iii) the structure of the output message and (iv) the receiving entity. Our choice for the many inputs to one output is the most common use-case in robotics applications, where examples include filtering, decision making and visualization. We support YARP/ROS and ROS/YARP sender/receiver configurations, which are demonstrated in a humanoid on wheels robot that uses YARP for upper body motor control and visual perception, and ROS for mobile base control and navigation algorithms.

  8. Using ontologies to improve semantic interoperability in health data

    Directory of Open Access Journals (Sweden)

    Harshana Liyanage

    2015-07-01

    Full Text Available The present-day health data ecosystem comprises a wide array of complex heterogeneous data sources. A wide range of clinical, health care, social and other clinically relevant information are stored in these data sources. These data exist either as structured data or as free-text. These data are generally individual person-based records, but social care data are generally case-based and less formal data sources may be shared by groups. The structured data may be organised in a proprietary way or be coded using one-of-many coding, classification or terminologies that have often evolved in isolation and were designed to meet the needs of the context in which they were developed. This has resulted in a wide range of semantic interoperability issues that make the integration of data held on these different systems challenging. We present semantic interoperability challenges and describe a classification of these. We propose a four-step process and a toolkit for those wishing to work more ontologically, progressing from the identification and specification of concepts to validating a final ontology. The four steps are: (1) the identification and specification of data sources; (2) the conceptualisation of semantic meaning; (3) defining to what extent routine data can be used as a measure of the process or outcome of care required in a particular study or audit and (4) the formalisation and validation of the final ontology. The toolkit is an extension of a previous schema created to formalise the development of ontologies related to chronic disease management. The extensions are focused on facilitating rapid building of ontologies for time-critical research studies.

  9. Interoperability science cases with the CDPP tools

    Science.gov (United States)

    Nathanaël, J.; Cecconi, B.; André, N.; Bouchemit, M.; Gangloff, M.; Budnik, E.; Jacquey, C.; Pitout, F.; Durand, J.; Rouillard, A.; Lavraud, B.; Genot, V. N.; Popescu, D.; Beigbeder, L.; Toniutti, J. P.; Caussarieu, S.

    2017-12-01

    Data exchange protocols are never as efficient as when they are invisible to the end user, who is then able to discover data, to cross-compare observations and modeled data and finally to perform in-depth analysis. Over the years these protocols, including SAMP from IVOA, EPN-TAP from the Europlanet 2020 RI community, backed by standard web-services, have been deployed in tools designed by the French Centre de Données de la Physique des Plasmas (CDPP) including AMDA, the Propagation Tool, 3DView, ... . This presentation will focus on science cases which show the capability of interoperability in the planetary and heliophysics contexts, involving both CDPP and companion tools. Europlanet 2020 RI has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 654208.
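    As a sketch of how such data discovery looks from a client's side, the snippet below uses the pyvo library to run an ADQL query against a TAP service; the service URL and the schema.epn_core table name are placeholders, and the query is only illustrative of the EPN-TAP access pattern.

    ```python
    # Hypothetical EPN-TAP query via pyvo; the TAP endpoint and table name are placeholders.
    import pyvo

    service = pyvo.dal.TAPService("https://example.org/tap")  # hypothetical TAP endpoint

    # EPN-TAP services conventionally expose their records in a table named <schema>.epn_core;
    # "amda.epn_core" is used here purely as an example name.
    query = """
        SELECT TOP 10 granule_uid, dataproduct_type, target_name, time_min, time_max
        FROM amda.epn_core
        WHERE target_name = 'Mars'
    """
    results = service.search(query)
    for row in results:
        print(row["granule_uid"], row["dataproduct_type"], row["target_name"])
    ```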

  10. Solving Interoperability in Translational Health. Perspectives of Students from the International Partnership in Health Informatics Education (IPHIE) 2016 Master Class.

    Science.gov (United States)

    Turner, Anne M; Facelli, Julio C; Jaspers, Monique; Wetter, Thomas; Pfeifer, Daniel; Gatewood, Laël Cranmer; Adam, Terry; Li, Yu-Chuan; Lin, Ming-Chin; Evans, R Scott; Beukenhorst, Anna; van Mens, Hugo Johan Theodoore; Tensen, Esmee; Bock, Christian; Fendrich, Laura; Seitz, Peter; Suleder, Julian; Aldelkhyyel, Ranyah; Bridgeman, Kent; Hu, Zhen; Sattler, Aaron; Guo, Shin-Yi; Mohaimenul, Islam Md Mohaimenul; Anggraini Ningrum, Dina Nur; Tung, Hsin-Ru; Bian, Jiantano; Plasek, Joseph M; Rommel, Casey; Burke, Juandalyn; Sohih, Harkirat

    2017-06-20

    In the summer of 2016 an international group of biomedical and health informatics faculty and graduate students gathered for the 16th meeting of the International Partnership in Health Informatics Education (IPHIE) masterclass at the University of Utah campus in Salt Lake City, Utah. This international biomedical and health informatics workshop was created to share knowledge and explore issues in biomedical health informatics (BHI). The goal of this paper is to summarize the discussions of biomedical and health informatics graduate students who were asked to define interoperability, and make critical observations to gather insight on how to improve biomedical education. Students were assigned to one of four groups and asked to define interoperability and explore potential solutions to current problems of interoperability in health care. We summarize here the student reports on the importance and possible solutions to the "interoperability problem" in biomedical informatics. Reports are provided from each of the four groups of highly qualified graduate students from leading BHI programs in the US, Europe and Asia. International workshops such as IPHIE provide a unique opportunity for graduate student learning and knowledge sharing. BHI faculty are encouraged to incorporate into their curriculum opportunities to exercise and strengthen student critical thinking to prepare our students for solving health informatics problems in the future.

  11. Exact angular momentum projection based on cranked HFB solution

    Energy Technology Data Exchange (ETDEWEB)

    Enami, Kenichi; Tanabe, Kosai; Yosinaga, Naotaka [Saitama Univ., Urawa (Japan). Dept. of Physics

    1998-03-01

    Exact angular momentum projection of cranked HFB solutions is carried out. It is reconfirmed from this calculation that cranked HFB solutions reproduce the intrinsic structure of a deformed nucleus. The result also indicates that the energy correction from projection is important for further investigation of nuclear structure. (author)
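    For context, exact angular momentum projection is conventionally written with the operator below acting on the intrinsic (here cranked HFB) state; this is the standard textbook expression, not the paper's specific numerical implementation.

    ```latex
    \[
      \hat{P}^{J}_{MK} \;=\; \frac{2J+1}{8\pi^{2}} \int \mathrm{d}\Omega \;
      D^{J\,*}_{MK}(\Omega)\, \hat{R}(\Omega),
      \qquad
      \hat{R}(\Omega) \;=\; e^{-i\alpha \hat{J}_z}\, e^{-i\beta \hat{J}_y}\, e^{-i\gamma \hat{J}_z},
    \]
    ```

    so that states of good angular momentum are built as superpositions of the projected intrinsic state, |Ψ^J_M⟩ = Σ_K g_K P^J_MK |Φ⟩.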

  12. Radiation effects on viscosimetry of protein based solutions

    International Nuclear Information System (INIS)

    Sabato, S.F.; Lacroix, M.

    2002-01-01

    Due to their good functional properties allied to their excellent nutritional value, milk protein isolates and soy protein concentrates have gained growing interest. These proteins could have their structural properties improved when some treatments are applied, such as gamma irradiation, alone or in the presence of other compounds such as a plasticizer. In this work, solutions of those proteins were mixed with a generally recognized as safe plasticizer, glycerol. These mixtures (8% protein (w/v) base) at two ratios, 1:1 and 2:1 (protein:glycerol), were submitted to a gamma irradiation treatment (60Co) at doses of 0, 5, 15 and 25 kGy, and their rheological performance was studied. As the irradiation dose increased, viscosity decreased significantly (p<0.05) for the soy/glycerol and calcium caseinate/glycerol mixtures. The sodium caseinate/glycerol mixture showed a trend toward aggregation of macromolecules at a dose of 5 kGy, while the apparent viscosity of dispersions containing whey/glycerol remained almost constant as the irradiation dose increased. In the case of soy protein isolate and sodium caseinate, the 2:1 mixture showed a significantly higher viscosity (p<0.05) than the 1:1 mixture

  13. Two RFID-based solutions for secure inpatient medication administration.

    Science.gov (United States)

    Yen, Yi-Chung; Lo, Nai-Wei; Wu, Tzong-Chen

    2012-10-01

    Medication error can easily cause serious health damage to inpatients in hospital. Consequently, the whole society has to spend huge amount of extra resources for additional therapies and medication on those affected inpatients. In order to prevent medication errors, secure inpatient medication administration system is required in a hospital. Using RFID technology, such administration system provides automated medication verification for inpatient's medicine doses and generates corresponding medication evidence, which may be audited later for medical dispute. Recently, Peris-Lopez et al. (Int. J. Med. Inform., 2011) proposed an IS-RFID system to enhance inpatient medication safety. Nevertheless, IS-RFID system does not detect the denial of proof attack efficiently and the generated medication evidence cannot defend against counterfeit evidence generated from the hospital. That is, the hospital possesses enough privilege from the design of IS-RFID system to modify generated medication evidence whenever it is necessary. Hence, we design two lightweight RFID-based solutions for secure inpatient medication administration, one for online verification environment and the other for offline validation situation, to achieve system security on evidence generation and provide early detection on denial of proof attack.

  14. A SOA-Based Solution to Monitor Vaccination Coverage Among HIV-Infected Patients in Liguria.

    Science.gov (United States)

    Giannini, Barbara; Gazzarata, Roberta; Sticchi, Laura; Giacomini, Mauro

    2016-01-01

    Vaccination in HIV-infected patients constitutes an essential tool in the prevention of the most common infectious diseases. The Ligurian Vaccination in HIV Program is a proposed vaccination schedule specifically dedicated to this risk group. Selective strategies are proposed within this program, employing ICT (Information and Communication Technology) tools to identify this susceptible target group, to monitor immunization coverage over time and to manage failures and defaulting. The proposal is to connect an immunization registry system to an existing regional platform that allows clinical data re-use among several medical structures, in order to manage the vaccination process completely. This architecture will adopt a Service Oriented Architecture (SOA) approach and standard HSSP (Health Services Specification Program) interfaces to support interoperability. According to the presented solution, vaccination administration information retrieved from the immunization registry will be structured according to the specifications of the immunization section of the HL7 (Health Level 7) CCD (Continuity of Care Document) document. Immunization coverage will be evaluated through the continuous monitoring of serology and antibody titers gathered from the hospital LIS (Laboratory Information System), structured into an HL7 Version 3 (v3) Clinical Document Architecture Release 2 (CDA R2).
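
    To give a feel for how an immunization administration could be serialized for exchange, the sketch below builds a minimal XML fragment loosely modelled on a CDA/CCD immunization entry; element names, the vaccine code and the date are simplified placeholders, not the exact template required by the regional platform described in the record.

        import xml.etree.ElementTree as ET

        def immunization_entry(vaccine_code: str, display_name: str, date_yyyymmdd: str) -> str:
            # Minimal, illustrative structure only: a real CDA R2 / CCD entry also carries
            # templateIds, code-system OIDs, ids and several other mandatory elements.
            sa = ET.Element("substanceAdministration", classCode="SBADM", moodCode="EVN")
            ET.SubElement(sa, "effectiveTime", value=date_yyyymmdd)
            consumable = ET.SubElement(sa, "consumable")
            product = ET.SubElement(consumable, "manufacturedProduct")
            material = ET.SubElement(product, "manufacturedMaterial")
            ET.SubElement(material, "code", code=vaccine_code, displayName=display_name)
            return ET.tostring(sa, encoding="unicode")

        print(immunization_entry("EX-01", "Example vaccine", "20160115"))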

  15. A School with Solutions: Implementing a Solution-Focused/Adlerian-Based Comprehensive School Counseling Program.

    Science.gov (United States)

    LaFountain, Rebecca M.; Garner, Nadine E.

    This book explains how counselors can integrate the theories of solution focused and Adlerian counseling into a comprehensive developmental counseling curriculum. Following an introduction in Chapter 1, Chapter 2 explains how support needs to be developed among the staff to implement a comprehensive school program. The comprehensive developmental…

  16. Solution-Processed Smart Window Platforms Based on Plasmonic Electrochromics

    KAUST Repository

    Abbas, Sara

    2018-04-30

    Electrochromic smart windows offer a viable route to reducing the energy consumption of buildings, which represents about 30% of worldwide energy consumption. Smart windows are far more compelling than current static windows in that they can dynamically modulate the solar spectrum depending on climate and lighting conditions, or simply to meet personal preferences. The latest generation of smart windows relies on nominally transparent metal oxide nanocrystal materials whose chromism can be electrochemically controlled using the plasmonic effect. Plasmonic electrochromic materials selectively control the near infrared (NIR) region of the solar spectrum, responsible for solar heat, without affecting the visible transparency. This is in contrast to conventional electrochromic materials, which block both the visible and the NIR, and it enables electrochromic devices to reduce the energy consumption of a building or a greenhouse in warm climate regions through enhancements of both visible lighting and heat blocking. Despite this edge, the technology can still benefit from important developments, including low-cost solution-based manufacturing on flexible substrates while maintaining durability and coloration efficiency, demonstration of independent control of the NIR and visible spectra, and demonstration of self-powering capabilities. This thesis is focused on developing low-temperature, all-solution-processed plasmonic electrochromic devices and dual-band electrochromic devices. We demonstrate new device fabrication approaches in terms of materials and processes which enhance electrochromic performance while maintaining low processing temperatures. Scalable fabrication methods are used to highlight compatibility with high-throughput, continuous roll-to-roll fabrication on flexible substrates. In addition, a dual-band plasmonic electrochromic device was developed by combining the plasmonic layer with a conventional electrochromic ion storage layer. This enables

  17. Model-driven approach to enterprise interoperability at the technical service level

    NARCIS (Netherlands)

    Khadka, Ravi; Sapkota, Brahmananda; Ferreira Pires, Luis; van Sinderen, Marten J.; Jansen, Slinger

    2013-01-01

    Enterprise Interoperability is the ability of enterprises to interoperate in order to achieve their business goals. Although the purpose of enterprise interoperability is determined at the business level, the use of technical (IT) services to support business services implies that interoperability

  18. Segment-based Eyring-Wilson viscosity model for polymer solutions

    International Nuclear Information System (INIS)

    Sadeghi, Rahmat

    2005-01-01

    A theory-based model is presented for correlating the viscosity of polymer solutions; it is based on the segment-based Eyring mixture viscosity model combined with the segment-based Wilson model for describing deviations from ideality. The model has been applied to several polymer solutions, and the results show that it is reliable both for correlation and for prediction of the viscosity of polymer solutions at different molar masses and temperatures of the polymer.
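
    For orientation, this family of models starts from the Eyring mixture-viscosity relation, commonly quoted (in LaTeX) as

        \ln(\eta V) = \sum_{i} x_{i} \ln(\eta_{i} V_{i}) + \frac{\Delta G^{*E}}{RT},

    where \eta and V are the viscosity and molar volume of the mixture, \eta_{i} and V_{i} those of pure component i, and \Delta G^{*E} the excess molar Gibbs energy of activation for flow; the segment-based variant in the record replaces mole fractions x_{i} by segment fractions and models \Delta G^{*E} with a Wilson-type expression (the exact working equations are given in the paper).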

  19. A Cultural Framework for the Interoperability of C2 Systems

    National Research Council Canada - National Science Library

    Slay, Jill

    2002-01-01

    In considering some of the difficulties experienced in coalition operations, it becomes apparent that attention is needed in establishing a cultural framework for the interoperability of personnel (the human agents...

  20. CCSDS SM and C Mission Operations Interoperability Prototype

    Science.gov (United States)

    Lucord, Steven A.

    2010-01-01

    This slide presentation reviews the prototype of the Spacecraft Monitor and Control (SM&C) Operations for interoperability with other space agencies. This particular prototype involves the German Space Agency (DLR) in testing the ideas for interagency coordination.

  1. Model-driven development of service compositions for enterprise interoperability

    NARCIS (Netherlands)

    Khadka, Ravi; Sapkota, Brahmananda; Ferreira Pires, Luis; Jansen, Slinger; van Sinderen, Marten J.; Johnson, Pontus

    2011-01-01

    Service-Oriented Architecture (SOA) has emerged as an architectural style to foster enterprise interoperability, as it claims to facilitate the flexible composition of loosely coupled enterprise applications and thus alleviates the heterogeneity problem among enterprises. Meanwhile, Model-Driven

  2. An interoperability architecture for the health information exchange in Rwanda

    CSIR Research Space (South Africa)

    Crichton, R

    2012-08-01

    Full Text Available of an architecture to support: interoperability between existing health information systems already in use in the country; incremental extension into a fully integrated national health information system without substantial reengineering; and scaling, from a single...

  3. Patterns in Standards and Technologies for Economic Information Systems Interoperability

    Directory of Open Access Journals (Sweden)

    Vasile Irimia

    2012-06-01

    Full Text Available This paper presents results from a review of the current standards used for collaboration between economic information systems, including web services and service oriented architecture, EDI, ebXML framework, RosettaNet framework, cXML, xCBL, UBL, BPMN, BPEL, WS-CDL, ASN.1, and others. Standards have a key role in promoting economic information system interoperability, and thus enable collaboration. Analyzing the current standards, technologies and applications used for economic information systems interoperability has revealed a common pattern that runs through all of them. From this pattern we construct a basic model of interoperability around which we relate and judge all standards, technologies and applications for economic information systems interoperability.

  4. Improving NATO's Interoperability Through U.S. Precision Weapons

    National Research Council Canada - National Science Library

    Westhauser, Todd

    1998-01-01

    .... This paper compares and contrasts two U.S. advanced precision weapons capabilities, the Paveway LGBs using buddy-lasing tactics and the JDAM, against the criteria of training, cost, interoperability...

  5. Positive train control interoperability and networking research : final report.

    Science.gov (United States)

    2015-12-01

    This document describes the initial development of an ITC PTC Shared Network (IPSN), a hosted environment to support the distribution, configuration management, and IT governance of Interoperable Train Control (ITC) Positive Train Control (PTC) s...

  6. Semantic Interoperability, E-Health and Australian Health Statistics.

    Science.gov (United States)

    Goodenough, Sally

    2009-06-01

    E-health implementation in Australia will depend upon interoperable computer systems to share information and data across the health sector. Semantic interoperability, which preserves the meaning of information and data when it is shared or re-purposed, is critical for safe clinical care, and also for any re-use of the information or data for other purposes. One such re-use is for national health statistics. Usable statistics rely on comparable and consistent data, and current practice is to use agreed national data standards to achieve this. The standardisation and interoperability needed to support e-health should also provide strong support for national health statistics. This report discusses some of the semantic interoperability issues involved in moving from the current data supply process for national health statistics to an e-health-enabled future.

  7. Conical intersections of free energy surfaces in solution: Effect of electron correlation on a protonated Schiff base in methanol solution

    International Nuclear Information System (INIS)

    Mori, Toshifumi; Nakano, Katsuhiro; Kato, Shigeki

    2010-01-01

    The minimum energy conical intersection (MECI) optimization method taking account of the dynamic electron correlation effect [T. Mori and S. Kato, Chem. Phys. Lett. 476, 97 (2009)] is extended to locate the MECI of nonequilibrium free energy surfaces in solution. A multistate electronic perturbation theory is introduced into the nonequilibrium free energy formula, which is defined as a function of solute and solvation coordinates. The analytical free energy gradient and interstate coupling vectors are derived and applied to locate MECIs in solution. The present method is applied to study the cis-trans photoisomerization reaction of a protonated Schiff base molecule (PSB3) in methanol (MeOH) solution. It is found that the effect of dynamic electron correlation largely lowers the energy of the S1 state. We also show that the solvation effect strongly stabilizes the MECI obtained by twisting the terminal C=N bond, making it accessible in MeOH solution, whereas this conical intersection is found to be unstable in the gas phase. The present study indicates that both electron correlation and solvation effects are important in the photoisomerization reaction of PSB3. The effect of the counterion is also examined and seems to be rather small in solution. The structures of the free energy surfaces around the MECIs are also discussed.

  8. A novel wound rinsing solution based on nano colloidal silver

    Directory of Open Access Journals (Sweden)

    Soheila Kordestani

    2014-10-01

    Full Text Available Objective(s): The present study aimed to investigate the antiseptic properties of a colloidal nano silver wound rinsing solution in inhibiting a wide range of pathogens, including bacteria, viruses and fungi, present in chronic and acute wounds. Materials and Methods: The wound rinsing solution, named SilvoSept®, was prepared using a colloidal nano silver suspension. Physicochemical properties, effectiveness against microorganisms including Staphylococcus aureus ATCC 6538P, Pseudomonas aeruginosa ATCC 9027, Escherichia coli ATCC 8739, Candida albicans ATCC 10231, Aspergillus niger ATCC 16404, MRSA, Mycobacterium spp., HSV-1 and H1N1, and biocompatibility were tested according to the relevant standards. Results: An X-ray diffraction (XRD) scan was performed on the sample and verified a single phase of silver particles in the compound. The size of the silver particles in the solution, measured by the dynamic light scattering (DLS) technique, ranged from 80 to 90 nm. Transmission electron microscopy (TEM) revealed the spherical shape and smooth surface of the silver nanoparticles. SilvoSept® reduced the initial count of 10⁷ CFU/mL of Staphylococcus aureus ATCC 6538P, Pseudomonas aeruginosa ATCC 9027, Escherichia coli ATCC 8739, Candida albicans ATCC 10231, Aspergillus niger ATCC 16404, MRSA and Mycobacterium spp. by 5 log. Further assessments of the SilvoSept solution showed a significant inhibition of the replication of HSV-1 and H1N1. The biocompatibility studies showed that the solution was non-allergenic, non-irritant and non-cytotoxic. Conclusion: The findings of the present study showed that SilvoSept® wound rinsing solution containing nano silver particles is an effective antiseptic solution against a wide spectrum of microorganisms. This compound can be a suitable candidate for wound irrigation.

  9. Interoperability, Enterprise Architectures, and IT Governance in Government

    OpenAIRE

    Scholl, Hans; Kubicek, Herbert; Cimander, Ralf

    2011-01-01

    Part 4: Architecture, Security and Interoperability; International audience; Government represents a unique, and also uniquely complex, environment for interoperation of information systems as well as for integration of workflows and processes across governmental levels and branches. While private-sector organizations by and large have the capacity to implement “enterprise architectures” in a relatively straightforward fashion, for notable reasons governments do not enjoy such luxury. For thi...

  10. Ensuring Sustainable Data Interoperability Across the Natural and Social Sciences

    Science.gov (United States)

    Downs, R. R.; Chen, R. S.

    2015-12-01

    Both the natural and social science data communities are attempting to address the long-term sustainability of their data infrastructures in rapidly changing research, technological, and policy environments. Many parts of these communities are also considering how to improve the interoperability and integration of their data and systems across natural, social, health, and other domains. However, these efforts have generally been undertaken in parallel, with little thought about how different sustainability approaches may impact long-term interoperability from scientific, legal, or economic perspectives, or vice versa, i.e., how improved interoperability could enhance—or threaten—infrastructure sustainability. Scientific progress depends substantially on the ability to learn from the legacy of previous work available for current and future scientists to study, often by integrating disparate data not previously assembled. Digital data are less likely than scientific publications to be usable in the future unless they are managed by science-oriented repositories that can support long-term data access with the documentation and services needed for future interoperability. We summarize recent discussions in the social and natural science communities on emerging approaches to sustainability and relevant interoperability activities, including efforts by the Belmont Forum E-Infrastructures project to address global change data infrastructure needs; the Group on Earth Observations to further implement data sharing and improve data management across diverse societal benefit areas; and the Research Data Alliance to develop legal interoperability principles and guidelines and to address challenges faced by domain repositories. We also examine emerging needs for data interoperability in the context of the post-2015 development agenda and the expected set of Sustainable Development Goals (SDGs), which set ambitious targets for sustainable development, poverty reduction, and

  11. GEOSS interoperability for Weather, Ocean and Water

    Science.gov (United States)

    Richardson, David; Nyenhuis, Michael; Zsoter, Ervin; Pappenberger, Florian

    2013-04-01

    "Understanding the Earth system — its weather, climate, oceans, atmosphere, water, land, geodynamics, natural resources, ecosystems, and natural and human-induced hazards — is crucial to enhancing human health, safety and welfare, alleviating human suffering including poverty, protecting the global environment, reducing disaster losses, and achieving sustainable development. Observations of the Earth system constitute critical input for advancing this understanding." With this in mind, the Group on Earth Observations (GEO) started implementing the Global Earth Observation System of Systems (GEOSS). GEOWOW, short for "GEOSS interoperability for Weather, Ocean and Water", is supporting this objective. GEOWOW's main challenge is to improve Earth observation data discovery, accessibility and exploitability, and to evolve GEOSS in terms of interoperability, standardization and functionality. One of the main goals behind the GEOWOW project is to demonstrate the value of the TIGGE archive in interdisciplinary applications, providing a vast amount of useful and easily accessible information to the users through the GEO Common Infrastructure (GCI). GEOWOW aims at developing funcionalities that will allow easy discovery, access and use of TIGGE archive data and of in-situ observations, e.g. from the Global Runoff Data Centre (GRDC), to support applications such as river discharge forecasting.TIGGE (THORPEX Interactive Grand Global Ensemble) is a key component of THORPEX: a World Weather Research Programme to accelerate the improvements in the accuracy of 1-day to 2 week high-impact weather forecasts for the benefit of humanity. The TIGGE archive consists of ensemble weather forecast data from ten global NWP centres, starting from October 2006, which has been made available for scientific research. The TIGGE archive has been used to analyse hydro-meteorological forecasts of flooding in Europe as well as in China. In general the analysis has been favourable in terms of

  12. The advanced microgrid. Integration and interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Bower, Ward Isaac [Ward Bower Innovations, LLC, Albuquerque, NM (United States); Ton, Dan T. [U.S. Dept. of Energy, Washington, DC (United States); Guttromson, Ross [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Glover, Steven F [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stamp, Jason Edwin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bhatnagar, Dhruv [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Reilly, Jim [Reily Associates, Pittston, PA (United States)

    2014-02-01

    This white paper focuses on "advanced microgrids," but sections do, out of necessity, reference today's commercially available systems and installations in order to clearly distinguish the differences and advances. Advanced microgrids have been identified as a necessary part of the modern electrical grid through two DOE microgrid workshops, the National Institute of Standards and Technology Smart Grid Interoperability Panel, and other related sources. With their grid-interconnectivity advantages, advanced microgrids will improve system energy efficiency and reliability and provide enabling technologies for grid-independence to end-user sites. One popular definition that has evolved and is used in multiple references is that a microgrid is a group of interconnected loads and distributed-energy resources within clearly defined electrical boundaries that acts as a single controllable entity with respect to the grid. A microgrid can connect to and disconnect from the grid, enabling it to operate in either grid-connected or island mode. Further, an advanced microgrid can then be loosely defined as a dynamic microgrid.

  13. Ocean Data Interoperability Platform (ODIP): using regional data systems for global ocean research

    Science.gov (United States)

    Schaap, D.; Thijsse, P.; Glaves, H.

    2017-12-01

    Ocean acidification, loss of coral reefs, and sustainable exploitation of the marine environment are just a few of the challenges researchers around the world are currently attempting to understand and address. However, studies of these ecosystem-level challenges are impossible unless researchers can discover and re-use the large volumes of interoperable multidisciplinary data that are currently only accessible through regional and global data systems that serve discrete, and often discipline-specific, user communities. The plethora of marine data systems currently in existence also use different standards, technologies and best practices, making re-use of the data problematic for those engaged in interdisciplinary marine research. The Ocean Data Interoperability Platform (ODIP) is responding to this growing demand for discoverable, accessible and reusable data by establishing the foundations for a common global framework for marine data management. But creation of such an infrastructure is a major undertaking, and one that needs to be achieved in part by establishing different levels of interoperability across existing regional and global marine e-infrastructures. Workshops organised by ODIP II facilitate dialogue between selected regional and global marine data systems in an effort to identify potential solutions that integrate these marine e-infrastructures. The outcomes of these discussions have formed the basis for a number of prototype development tasks that aim to demonstrate effective sharing of data across multiple data systems, and allow users to access data from more than one system through a single access point. The ODIP II project is currently developing four prototype solutions that are establishing interoperability between selected regional marine data management infrastructures in Europe, the USA, Canada and Australia, and with the global POGO, IODE Ocean Data Portal (ODP) and GEOSS systems. The potential impact of implementing these solutions for

  14. NASA and Industry Benefits of ACTS High Speed Network Interoperability Experiments

    Science.gov (United States)

    Zernic, M. J.; Beering, D. R.; Brooks, D. E.

    2000-01-01

    This paper provides synopses of the design, implementation, and results of key high-data-rate communications experiments utilizing the technologies of NASA's Advanced Communications Technology Satellite (ACTS). Specifically, the network protocol and interoperability performance aspects will be highlighted. The objectives of these key experiments will be discussed in their relevant context for NASA missions, as well as for the wider communications industry. Discussion of the experiment implementation will highlight the technical aspects of hybrid network connectivity, a variety of high-speed interoperability architectures, a variety of network node platforms, protocol layers, internet-based applications, and new work focused on distinguishing between link errors and congestion. In addition, this paper describes the impact of leveraging government-industry partnerships to achieve technical progress and forge synergistic relationships. These relationships will be the key to success as NASA seeks to combine commercially available technology with its own internal technology developments to realize more robust and cost-effective communications for space operations.

  15. Combining Archetypes with Fast Health Interoperability Resources in Future-proof Health Information Systems.

    Science.gov (United States)

    Bosca, Diego; Moner, David; Maldonado, Jose Alberto; Robles, Montserrat

    2015-01-01

    Messaging standards, and specifically HL7 v2, are heavily used for the communication and interoperability of Health Information Systems. HL7 FHIR was created as an evolution of the messaging standards to achieve semantic interoperability. FHIR is somewhat similar to other approaches, such as the dual model methodology, as both are based on the precise modeling of clinical information. In this paper, we demonstrate how we can apply the dual model methodology to standards like FHIR. We show the usefulness of this approach for data transformation between FHIR and other specifications such as HL7 CDA, EN ISO 13606, and openEHR. We also discuss the advantages and disadvantages of defining archetypes over FHIR, and the consequences and outcomes of this approach. Finally, we exemplify this approach by creating a testing data server that supports both FHIR resources and archetypes.
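
    As a small illustration of the FHIR side of such a testing data server (the base URL below is a placeholder, not the authors' server), retrieving a resource over FHIR's RESTful interface can be as simple as:

        import requests

        FHIR_BASE = "https://example.org/fhir"  # placeholder endpoint, not the authors' server

        def get_resource(resource_type: str, resource_id: str) -> dict:
            # FHIR's RESTful "read" interaction: GET [base]/[type]/[id]
            response = requests.get(f"{FHIR_BASE}/{resource_type}/{resource_id}",
                                    headers={"Accept": "application/fhir+json"},
                                    timeout=10)
            response.raise_for_status()
            return response.json()

        patient = get_resource("Patient", "example")
        print(patient.get("resourceType"), patient.get("id"))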

  16. Interoperability Matter: Levels of Data Sharing, Starting from a 3d Information Modelling

    Science.gov (United States)

    Tommasi, C.; Achille, C.

    2017-02-01

    Nowadays, the adoption of BIM processes in the AEC (Architecture, Engineering and Construction) industry means being oriented towards synergistic workflows, based on information tools capable of realizing the virtual model of the building. The target of this article is to discuss the interoperability issue, approaching the subject through a theoretical part and also a practical example, in order to show how these notions are applicable in real situations. In particular, the case study analysed belongs to the Cultural Heritage field, where it is possible to encounter some difficulties - both in the modelling and sharing phases - due to the complexity of shapes and elements. Focusing on the interoperability between different software packages, the questions are: What kinds of information can I share, and how much? Given that this process also leads to a standardization of the modelled parts, is there the possibility of an accuracy loss?

  17. A cellular-based solution for radio communications in MOUT

    NARCIS (Netherlands)

    Overduin, R.

    2005-01-01

    A short-term and potentially cost-effective solution is proposed for tactical radio communications in Military Operations in Urban Terrain (MOUT) for the Royal Netherlands Army (RNLA). Measurements and computer simulations presented show that on average, outdoor ranges in MOUT as attainable with

  18. LED-based Photometric Stereo: Modeling, Calibration and Numerical Solutions

    DEFF Research Database (Denmark)

    Quéau, Yvain; Durix, Bastien; Wu, Tao

    2018-01-01

    We conduct a thorough study of photometric stereo under nearby point light source illumination, from modeling to numerical solution, through calibration. In the classical formulation of photometric stereo, the luminous fluxes are assumed to be directional, which is very difficult to achieve in pr...
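
    For context on the modelling part summarized in this (truncated) record, the classical directional-light Lambertian model and one commonly used nearby point-source variant with inverse-square attenuation can be written as follows; the paper's exact LED model may differ in its details.

        I(p) = \rho(p)\,\max\{0,\; n(p)\cdot s\} \quad\text{(classical directional light)},
        I(p) = \rho(p)\,\Phi\,\frac{\max\{0,\; n(p)\cdot u(p)\}}{\lVert x_{s}-x(p)\rVert^{2}},\qquad u(p)=\frac{x_{s}-x(p)}{\lVert x_{s}-x(p)\rVert} \quad\text{(nearby LED)},

    where \rho is the albedo, n the unit surface normal, x_{s} the LED position and \Phi its intensity.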

  19. An analytical solution based on mobility and multicriteria ...

    Indian Academy of Sciences (India)

    C AMALI

    2017-10-14

    ... selection algorithm is needed to select the target network for maximizing the end user satisfaction. The existing works do not .... then a handover is performed to the target network for high-velocity users. If the network with ...... pricing scheme is considered to provide solutions for the challenges behind the ...

  20. A Solution Generator Algorithm for Decision Making based Automated Negotiation in the Construction Domain

    Directory of Open Access Journals (Sweden)

    Arazi Idrus

    2017-12-01

    Full Text Available In this paper, we present our work in progress on a proposed framework for automated negotiation in the construction domain. The proposed framework enables software agents to conduct negotiations and autonomously make value-based decisions. The framework consists of three main components, which are the solution generator algorithm, the negotiation algorithm, and the conflict resolution algorithm. This paper extends the discussion on the solution generator algorithm, which enables software agents to generate solutions and rank them from the 1st to the nth solution for the negotiation stage of the operation. The solution generator algorithm consists of three steps, which are: review solutions, rank solutions, and form ranked solutions. For validation purposes, we present a scenario that utilizes the proposed algorithm to rank solutions. The validation shows that the algorithm is promising; however, it also highlights the conflict between different parties that needs further negotiation action.
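
    A minimal sketch of the three steps named in the record (review, rank, form ranked solutions), assuming a simple constraint check plus additive weighted scoring that the paper does not necessarily use:

        from typing import Dict, List, Tuple

        Solution = Dict[str, float]  # e.g. {"cost": ..., "duration": ...} (attributes are assumptions)

        def review_solutions(solutions: List[Solution], limits: Dict[str, float]) -> List[Solution]:
            # Step 1: keep only solutions that satisfy every hard constraint (upper bounds here).
            return [s for s in solutions if all(s[k] <= v for k, v in limits.items())]

        def rank_solutions(solutions: List[Solution], weights: Dict[str, float]) -> List[Tuple[float, Solution]]:
            # Step 2: score each feasible solution; lower weighted sum is better in this toy scheme.
            scored = [(sum(weights[k] * s[k] for k in weights), s) for s in solutions]
            return sorted(scored, key=lambda pair: pair[0])

        def form_ranked_solutions(scored: List[Tuple[float, Solution]]) -> List[Dict[str, object]]:
            # Step 3: attach ranks 1..n, ready to be handed to the negotiation algorithm.
            return [{"rank": i + 1, "score": score, "solution": s}
                    for i, (score, s) in enumerate(scored)]

        candidates = [{"cost": 100.0, "duration": 30.0}, {"cost": 80.0, "duration": 45.0}]
        feasible = review_solutions(candidates, {"cost": 120.0, "duration": 60.0})
        print(form_ranked_solutions(rank_solutions(feasible, {"cost": 0.6, "duration": 0.4})))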

  1. A Working Framework for Enabling International Science Data System Interoperability

    Science.gov (United States)

    Hughes, J. Steven; Hardman, Sean; Crichton, Daniel J.; Martinez, Santa; Law, Emily; Gordon, Mitchell K.

    2016-07-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework that leverages ISO level reference models for metadata registries and digital archives. This framework provides multi-level governance, evolves independent of the implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation is captured in an ontology through a process of knowledge acquisition. Discipline experts in the role of stewards at the common, discipline, and project levels work to design and populate the ontology model. The result is a formal and consistent knowledge base that provides requirements for data representation, integrity, provenance, context, identification, and relationship. The contents of the knowledge base are translated and written to files in suitable formats to configure system software and services, provide user documentation, validate input, and support data analytics. This presentation will provide an overview of the framework, present a use case that has been adopted by an entire science discipline at the international level, and share some important lessons learned.

  2. Integrated Water Hazards Engineering Based on Mapping, Nature-Based and Technical Solutions

    Science.gov (United States)

    Halbac-Cotoara-Zamfir, Rares; Herban, Sorin; Stolte, Jannes; Bozan, Csaba

    2017-10-01

    Climate change is expected to alter average temperature and precipitation values and to increase the variability of precipitation events, which may lead to even more intense and frequent water hazards. Water hazards engineering is the branch of engineering concerned with the application of scientific and engineering principles for the protection of human populations from the effects of water hazards; the protection of environments, both local and global, from the potentially deleterious effects of water hazards; and the improvement of environmental quality for mitigating the negative effects of water hazards. An integrated approach to water hazards engineering based on mapping, nature-based and technical solutions will constitute a feasible way of adapting to the challenges generated by climate change worldwide. This paper discusses this concept and provides some examples from several European countries.

  3. A method for calculating the acid-base equilibria in aqueous and nonaqueous electrolyte solutions

    Science.gov (United States)

    Tanganov, B. B.; Alekseeva, I. A.

    2017-06-01

    Concentrations of particles in acid-base equilibria in aqueous and nonaqueous solutions of electrolytes are calculated on the basis of logarithmic charts, activity coefficients, and equilibrium constants.
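
    As a worked illustration of the kind of equilibrium computation involved, the sketch below solves the single weak-acid case in water, neglecting autoprotolysis; the general aqueous and nonaqueous treatment in the record is considerably more elaborate.

        import math

        def weak_acid_pH(c_total: float, Ka: float) -> float:
            # Mass and charge balance for HA <-> H+ + A- give
            # [H+]^2 + Ka*[H+] - Ka*C = 0, and we take the positive root.
            h = (-Ka + math.sqrt(Ka * Ka + 4.0 * Ka * c_total)) / 2.0
            return -math.log10(h)

        # 0.10 M acetic acid (Ka ~ 1.8e-5) -> pH close to 2.9
        print(round(weak_acid_pH(0.10, 1.8e-5), 2))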

  4. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran.

    Science.gov (United States)

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E

    2014-01-01

    Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) is an essential factor in utilizing immunization CDS systems. Service oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical in the different systems. CDS-generated reports were in accordance with immunization guidelines, and the calculations for next visit times were accurate. Interoperability is rare or nonexistent between IIS. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information.

  5. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability

    Science.gov (United States)

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-01-01

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve humans' daily life. In general, IoT devices provide two main capabilities: sensing and tasking capabilities. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. However, currently, IoT devices created by different manufacturers follow different proprietary protocols and are locked in many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research aims at proposing an interoperable solution called tasking capability description that allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, which is a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner. PMID:27589759

  6. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability.

    Science.gov (United States)

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-08-31

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve humans' daily life. In general, IoT devices provide two main capabilities: sensing and tasking capabilities. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. However, currently, IoT devices created by different manufacturers follow different proprietary protocols and are locked in many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research aims at proposing an interoperable solution called tasking capability description that allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, which is a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner.
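
    A hedged sketch of what issuing a task through a SensorThings-style tasking endpoint could look like; the base URL, entity identifier and parameter names below are illustrative assumptions, not the exact payload defined by the Extended SensorThings API in these two records.

        import requests

        BASE = "https://example.org/SensorThings/v1.1"  # hypothetical service root

        def create_task(tasking_capability_id: int, parameters: dict) -> dict:
            # Tasks are created by POSTing the tasking parameters together with a reference
            # to the TaskingCapability that describes the controllable device.
            body = {
                "taskingParameters": parameters,
                "TaskingCapability": {"@iot.id": tasking_capability_id},
            }
            resp = requests.post(f"{BASE}/Tasks", json=body, timeout=10)
            resp.raise_for_status()
            return resp.json()

        # e.g. switch a (hypothetical) networked lamp on at 50% brightness
        print(create_task(1, {"power": "on", "brightness": 50}))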

  7. A New IMS Based Inter-working Solution

    Science.gov (United States)

    Zhu, Zhongwen; Brunner, Richard

    With the evolution of third-generation networks, more and more multimedia services are developed and deployed. Any new service to be deployed in an IMS network is required to inter-work with existing Internet communities or legacy terminal users in order to be appreciated by the end users, who are the main drivers of a service's success. The challenge for inter-working between IMS (IP Multimedia Subsystem) and non-IMS networks is "how to handle the recipient's address". This is because each network has its own routable address schema. For instance, the address of a Google Talk user is xmpp:xyz@google.com, which is not routable in an IMS network. Hereafter a new Inter-working (IW) solution between IMS and non-IMS networks is proposed for multimedia services including Instant Messaging, Chat, and File Transfer. It is an end-to-end solution built on the IMS infrastructure. The Public Service Identity (PSI) defined in the 3GPP (3rd Generation Partnership Project) standards is used to allow terminal clients to locate this IW service. When sending the SIP (Session Initiation Protocol) request for multimedia services, the terminal includes the recipient's address in the payload instead of the "Request-URI" header. In the network, the proposed solution provides the mapping rules between different networks in the MM-IW (Multimedia IW). The detailed technical description and the corresponding use cases are presented. A comparison with other alternatives is made, and the benefits of the proposed solution are highlighted.
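
    To make the addressing trick concrete, the sketch below assembles an illustrative SIP MESSAGE in which the Request-URI points at the inter-working Public Service Identity while the non-routable foreign address travels in the payload; header names follow RFC 3261, but the exact body format used by the proposed MM-IW is an assumption, and mandatory headers are abbreviated.

        def build_iw_message(psi: str, sender: str, foreign_recipient: str, text: str) -> str:
            # The IMS core routes on the Request-URI (the PSI of the inter-working service);
            # the foreign address (e.g. an XMPP JID) is carried in the body.
            # Mandatory SIP headers such as Via, Call-ID, CSeq and Max-Forwards are omitted here.
            body = f"to: {foreign_recipient}\r\ncontent: {text}\r\n"
            return (f"MESSAGE {psi} SIP/2.0\r\n"
                    f"From: <{sender}>;tag=abc123\r\n"
                    f"To: <{psi}>\r\n"
                    "Content-Type: text/plain\r\n"
                    f"Content-Length: {len(body)}\r\n"
                    "\r\n"
                    f"{body}")

        print(build_iw_message("sip:im-iw@ims.example.net", "sip:alice@ims.example.net",
                               "xmpp:xyz@google.com", "hello"))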

  8. Distributed GIS Systems, Open Specifications and Interoperability: How do They Relate to the Sustainable Management of Natural Resources?

    Science.gov (United States)

    Rafael Moreno-Sanchez

    2006-01-01

    The aim of this paper is to provide a conceptual framework for the session: "The role of web-based Geographic Information Systems in supporting sustainable management." The concepts of sustainability, sustainable forest management, Web Services, Distributed Geographic Information Systems, interoperability, Open Specifications, and Open Source Software are defined...

  9. Advances in a Distributed Approach for Ocean Model Data Interoperability

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2014-03-01

    Full Text Available An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.

  10. Advances in a distributed approach for ocean model data interoperability

    Science.gov (United States)

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
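
    In practice, the CF/OPeNDAP plumbing described in these two records means a remote model dataset can be opened much like a local file; a minimal sketch is shown below, where the URL and the variable name "temp" are placeholders for any CF-compliant OPeNDAP endpoint.

        import xarray as xr

        # Placeholder OPeNDAP URL; any CF-compliant endpoint is opened the same way.
        URL = "https://example.org/thredds/dodsC/ocean_model/output.nc"

        ds = xr.open_dataset(URL)             # lazy remote access, no full download
        print(ds.attrs.get("Conventions"))    # e.g. "CF-1.6" for a compliant dataset
        sst = ds["temp"].isel(time=-1)        # subset the most recent time step ("temp" is assumed)
        print(float(sst.mean()))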

  11. An Interoperable System toward Cardiac Risk Stratification from ECG Monitoring

    Science.gov (United States)

    Mora-Jiménez, Inmaculada; Ramos-López, Javier; Quintanilla Fernández, Teresa; García-García, Antonio; Díez-Mazuela, Daniel; García-Alberola, Arcadi

    2018-01-01

    Many indices have been proposed for cardiovascular risk stratification from electrocardiogram signal processing, still with limited use in clinical practice. We created a system integrating the clinical definition of cardiac risk subdomains from ECGs and the use of diverse signal processing techniques. Three subdomains were defined from the joint analysis of the technical and clinical viewpoints. One subdomain was devoted to demographic and clinical data. The other two subdomains were intended to obtain widely defined risk indices from ECG monitoring: a simple-domain (heart rate turbulence (HRT)), and a complex-domain (heart rate variability (HRV)). Data provided by the three subdomains allowed for the generation of alerts with different intensity and nature, as well as for the grouping and scrutinization of patients according to the established processing and risk-thresholding criteria. The implemented system was tested by connecting data from real-world in-hospital electronic health records and ECG monitoring by considering standards for syntactic (HL7 messages) and semantic interoperability (archetypes based on CEN/ISO EN13606 and SNOMED-CT). The system was able to provide risk indices and to generate alerts in the health records to support decision-making. Overall, the system allows for the agile interaction of research and clinical practice in the Holter-ECG-based cardiac risk domain. PMID:29494497

  12. An Interoperable System toward Cardiac Risk Stratification from ECG Monitoring

    Directory of Open Access Journals (Sweden)

    Cristina Soguero-Ruiz

    2018-03-01

    Full Text Available Many indices have been proposed for cardiovascular risk stratification from electrocardiogram signal processing, still with limited use in clinical practice. We created a system integrating the clinical definition of cardiac risk subdomains from ECGs and the use of diverse signal processing techniques. Three subdomains were defined from the joint analysis of the technical and clinical viewpoints. One subdomain was devoted to demographic and clinical data. The other two subdomains were intended to obtain widely defined risk indices from ECG monitoring: a simple-domain (heart rate turbulence (HRT)), and a complex-domain (heart rate variability (HRV)). Data provided by the three subdomains allowed for the generation of alerts with different intensity and nature, as well as for the grouping and scrutinization of patients according to the established processing and risk-thresholding criteria. The implemented system was tested by connecting data from real-world in-hospital electronic health records and ECG monitoring by considering standards for syntactic (HL7 messages) and semantic interoperability (archetypes based on CEN/ISO EN13606 and SNOMED-CT). The system was able to provide risk indices and to generate alerts in the health records to support decision-making. Overall, the system allows for the agile interaction of research and clinical practice in the Holter-ECG-based cardiac risk domain.
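
    To illustrate the sort of complex-domain index mentioned in these two records, a minimal computation of two standard time-domain HRV measures (SDNN and RMSSD) from a series of RR intervals might look like this; it is a generic sketch, not the system's actual processing pipeline.

        import numpy as np

        def hrv_time_domain(rr_ms: np.ndarray) -> dict:
            # SDNN: standard deviation of RR (NN) intervals, in milliseconds.
            # RMSSD: root mean square of successive RR-interval differences.
            diffs = np.diff(rr_ms)
            return {
                "SDNN": float(np.std(rr_ms, ddof=1)),
                "RMSSD": float(np.sqrt(np.mean(diffs ** 2))),
            }

        rr = np.array([812.0, 805.0, 798.0, 820.0, 833.0, 810.0, 795.0])  # toy RR series (ms)
        print(hrv_time_domain(rr))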

  13. Interoperable Solar Data and Metadata via LISIRD 3

    Science.gov (United States)

    Wilson, A.; Lindholm, D. M.; Pankratz, C. K.; Snow, M. A.; Woods, T. N.

    2015-12-01

    LISIRD 3 is a major upgrade of the LASP Interactive Solar Irradiance Data Center (LISIRD), which serves several dozen space-based solar irradiance and related data products to the public. Through interactive plots, LISIRD 3 provides data browsing supported by data subsetting and aggregation. Because it incorporates a semantically enabled metadata repository, LISIRD 3 presents users with current, vetted, consistent information about the datasets offered. Users can now also search for datasets based on metadata fields such as dataset type and/or spectral or temporal range. This semantic database enables metadata browsing, so users can discover the relationships between datasets, instruments, spacecraft, mission and PI. The database also enables the creation and publication of metadata records in a variety of formats, such as SPASE or ISO, making these datasets more discoverable. The database also enables the possibility of a public SPARQL endpoint, making the metadata browsable in an automated fashion. LISIRD 3's data access middleware, LaTiS, provides dynamic, on-demand reformatting of data and timestamps, subsetting and aggregation, and other server-side functionality via a RESTful, OPeNDAP-compliant API, enabling interoperability between LASP datasets and many common tools. LISIRD 3's templated front-end design, coupled with the uniform data interface offered by LaTiS, allows easy integration of new datasets. Consequently the number and variety of datasets offered by LISIRD has grown to encompass several dozen, with many more to come. This poster will discuss the design and implementation of LISIRD 3, including the tools used, capabilities enabled, and issues encountered.
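
    Because LaTiS exposes an OPeNDAP-style RESTful interface, a dataset can also be pulled programmatically; the sketch below assumes a hypothetical dataset name and a CSV output suffix, so treat the exact URL layout as illustrative rather than the service's documented form.

        import csv
        import io
        import requests

        BASE = "https://lasp.colorado.edu/lisird/latis/dap"  # assumed service layout
        DATASET = "example_irradiance"                       # hypothetical dataset identifier

        # Project time and irradiance as CSV, constrained to one month (OPeNDAP-style query).
        url = f"{BASE}/{DATASET}.csv?time,irradiance&time>=2015-01-01&time<2015-02-01"
        rows = list(csv.reader(io.StringIO(requests.get(url, timeout=30).text)))
        print(rows[0])               # header row
        print(len(rows) - 1, "data records")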

  14. Lactated Ringer-based storage solutions are equally well suited for the storage of fresh osteochondral allografts as cell culture medium-based storage solutions.

    Science.gov (United States)

    Harb, Afif; von Horn, Alexander; Gocalek, Kornelia; Schäck, Luisa Marilena; Clausen, Jan; Krettek, Christian; Noack, Sandra; Neunaber, Claudia

    2017-07-01

    Due to the rising interest in Europe in treating large cartilage defects with osteochondral allografts, research aims to find a suitable solution for the long-term storage of osteochondral allografts. This is further encouraged by the fact that legal restrictions currently limit the use of ingredients from animal or human sources that are used in other regions of the world (e.g. in the USA). Therefore, the aim of this study was A) to analyze whether a Lactated Ringer (LR) based solution is as efficient as a Dulbecco modified Eagle's minimal essential medium (DMEM) in maintaining chondrocyte viability and B) at which storage temperature (4°C vs. 37°C) chondrocyte survival of the osteochondral allograft is optimally sustained. 300 cartilage grafts were collected from the knees of ten one-year-old Black Head German Sheep. The grafts were stored in four different storage solutions (one of them DMEM-based, the other three based on Lactated Ringer solution), at two different temperatures (4 and 37°C), for 14 and 56 days. At both points in time, chondrocyte survival as well as death rate, glycosaminoglycan (GAG) content, and hydroxyproline (HP) concentration were measured and compared between the grafts stored in the different solutions and at the different temperatures. Independent of the storage solution tested, chondrocyte survival rates were higher when grafts were stored at 4°C compared to 37°C, both after short-term (14 days) and long-term storage (56 days). At no point in time did the DMEM-based solution show superior chondrocyte survival compared to the Lactated Ringer-based solutions. GAG and HP content were comparable across all time points, temperatures and solutions. LR-based solutions that contain only substances approved in Germany may be just as efficient for storing grafts as the DMEM-based gold standard used in the USA. Moreover, in the present experiment, storage of osteochondral allografts at 4°C was superior to storage at 37°C.

  15. Interoperability challenges for the Sustainable Management of seagrass meadows (Invited)

    Science.gov (United States)

    Nativi, S.; Pastres, R.; Bigagli, L.; Venier, C.; Zucchetta, M.; Santoro, M.

    2013-12-01

    Seagrass meadows (marine angiosperm plants) occupy less than 0.2% of the global ocean surface, yet annually store about 10-18% of the so-called 'Blue Carbon', i.e. the carbon stored in coastal vegetated areas. Recent literature estimates that the flux to the long-term carbon sink in seagrasses represents 10-20% of seagrasses' global average production. Such figures can be translated into economic benefits, taking into account that a ton of carbon dioxide in Europe is priced at around 15 € in the carbon market. This means that the organic carbon retained in seagrass sediments in the Mediterranean is worth 138 - 1128 billion €, which represents 6-23 € per square meter. This is 9-35 times more than one square meter of tropical forest soil (0.66 € per square meter), or 5-17 times when considering both the above- and belowground compartments in tropical forests. According to the most conservative estimates, about 10% of the Mediterranean meadows have been lost during the last century. In the framework of the GEOSS (Global Earth Observation System of Systems) initiative, the MEDINA project (funded by the European Commission and coordinated by the University of Ca'Foscari in Venice) prepared a showcase as part of the GEOSS Architecture Interoperability Pilot - phase 6 (AIP-6). This showcase aims at providing a tool for the sustainable management of seagrass meadows along the Mediterranean coastline. The application is based on an interoperability framework providing a set of brokerage services to easily ingest and run a Habitat Suitability model (a model predicting the probability that a given site provides a suitable habitat for the development of a seagrass meadow, and the expected average coverage). The presentation discusses this framework, explaining how the input data are discovered, accessed and processed to feed the model (developed in the MEDINA project). Furthermore, the brokerage framework provides the necessary services to run the model and visualize results

  16. Promoting Savings at Tax Time through a Video-Based Solution-Focused Brief Coaching Intervention

    OpenAIRE

    Lance Palmer; Teri Pichot; Irina Kunovskaya

    2016-01-01

    Solution-focused brief coaching, based on solution-focused brief therapy, is a well-established practice model and is used widely to help individuals progress toward desired outcomes in a variety of settings. This paper presents the findings of a pilot study that examined the impact of a video-based solution-focused brief coaching intervention delivered in conjunction with income tax preparation services at a Volunteer Income Tax Assistance location (n = 212). Individuals receiving tax prepa...

  17. Solution of wave-like equation based on Haar wavelet

    Directory of Open Access Journals (Sweden)

    Naresh Berwal

    2012-11-01

    Full Text Available Wavelet transform and wavelet analysis are powerful mathematical tools for many problems, and wavelets can also be applied in numerical analysis. In this paper, we apply the Haar wavelet method to solve a wave-like equation with known initial and boundary conditions. The fundamental idea of the Haar wavelet method is to convert the differential equation into a group of algebraic equations involving a finite number of variables. The results and graphs show that the proposed method agrees well with the exact solution.
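
    For completeness, the Haar family underlying the method (standard definition, not repeated in the record) is

        \psi(t) = \begin{cases} 1, & 0 \le t < \tfrac{1}{2},\\ -1, & \tfrac{1}{2} \le t < 1,\\ 0, & \text{otherwise}, \end{cases}
        \qquad h_{i}(t) = \psi\!\left(2^{j} t - k\right),\quad i = 2^{j} + k + 1,\; j \ge 0,\; 0 \le k < 2^{j},

    together with the scaling function h_{1}(t) = 1 for t \in [0,1). Expanding the highest derivative of the unknown in this basis is what converts the wave-like partial differential equation into the algebraic system mentioned above.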

  18. Interoperability Guidelines for Lithuanian E-Learning Management Systems

    Directory of Open Access Journals (Sweden)

    Eugenijus Kurilovas

    2011-08-01

    Full Text Available Purpose – the paper aims to analyse e-learning content and repositories along with the problems of learning organisation interoperability. The main objective of the paper is to analyse scientific research results and the newest international experience in the area and to provide interoperability guidelines and recommendations for the implementation of appropriate Lithuanian state programmes. The learning content and repositories recommendations are designed for the implementation of the Lithuanian education portal project as well as the Lithuanian Virtual University (LVU) programme's information services' (LABT / eLABa) and e-learning services' (LieDM) sub-programmes. The whole education institution recommendations are designed for the maintenance and development of the LVU programme's management services' (LieMSIS) system. Design/methodology/approach – methods used for the general analysis of the proposed interoperability guidelines (recommendations) were bibliographic research and comparative analysis of Lithuanian and foreign scientific works published in periodicals and of large-scale EU-funded interoperability projects' deliverables. System analysis and comparative analysis methods were used in order to formulate and analyse systems' interoperability guidelines and recommendations. The author employed the experimental research method while working in the appropriate EU-funded interoperability projects to form the guidelines (recommendations). In order to summarize the results, the evaluative research method was used. Findings – the international guidelines and recommendations presented in the paper could be suitable for implementation while developing Lithuanian state education information systems such as the Lithuanian education portal, the Lithuanian academic libraries' (eLABa) system, the Lithuanian distance learning system (LieDM), and the Lithuanian universities' management system (LieMSIS). Research limitations/implications – the paper

  19. Interoperability Guidelines for Lithuanian E-Learning Management Systems

    Directory of Open Access Journals (Sweden)

    Eugenijus Kurilovas

    2013-08-01

    Full Text Available Purpose – the paper aims to analyse e-learning content and repositories along with the problems of learning organisation interoperability. The main objective of the paper is to analyse scientific research results and the newest international experience in the area and to provide interoperability guidelines and recommendations for the implementation of appropriate Lithuanian state programmes. The learning content and repositories recommendations are designed for the implementation of the Lithuanian education portal project as well as the Lithuanian Virtual University (LVU) programme's information services' (LABT / eLABa) and e-learning services' (LieDM) sub-programmes. The whole education institution recommendations are designed for the maintenance and development of the LVU programme's management services' (LieMSIS) system. Design/methodology/approach – methods used for the general analysis of the proposed interoperability guidelines (recommendations) were bibliographic research and comparative analysis of Lithuanian and foreign scientific works published in periodicals and of large-scale EU-funded interoperability projects' deliverables. System analysis and comparative analysis methods were used in order to formulate and analyse systems' interoperability guidelines and recommendations. The author employed the experimental research method while working in the appropriate EU-funded interoperability projects to form the guidelines (recommendations). In order to summarize the results, the evaluative research method was used. Findings – the international guidelines and recommendations presented in the paper could be suitable for implementation while developing Lithuanian state education information systems such as the Lithuanian education portal, the Lithuanian academic libraries' (eLABa) system, the Lithuanian distance learning system (LieDM), and the Lithuanian universities' management system (LieMSIS). Research limitations/implications – the paper

  20. A software solution to enable roaming between satellite and terrestrial 3G networks

    Science.gov (United States)

    Davies, Phil; Hartwell, Gareth; Shave, Nick

    2002-07-01

    As the number and type of communications systems in use around the world multiplies, the need for gateways that allow these systems to interoperate increases. This is particularly true for global satellite systems, which potentially need to interoperate with systems complying with widely different standards around the world. The ETSI-proposed S-UMTS system and its (significant) bandwidth allocation are intended to provide an important overlay to future terrestrial 3rd Generation networks. S-UMTS is specifically targeted at providing services in remote regions or, potentially just as important, when out of coverage of the subscriber's home network or of a network with a suitable roaming agreement. Satellite/terrestrial interoperability is thus a key issue in giving subscribers the standard and diversity of services offered by terrestrial systems plus the broad coverage offered by satellite systems. There are considerable technical challenges involved in achieving true interoperability between networks in the 2.5 and 3G world, and therefore in achieving the true vision of S-UMTS. One of the key technical issues is inter-working between different standards, which will continue into the 3G period due to the large number of different types of network (GSM, IS-41, PDC, CDMA, satellite, etc.) employed throughout the world. Packet data roaming onto satellite networks is set to become an important new capability for business users in support of the global mobile office as GPRS and 3G services become more widespread. When out of terrestrial network coverage, or when standard call or data transfer charges are needed globally, roaming onto satellite networks provides a very attractive solution. Using the Inmarsat system as an example, this paper addresses how interoperability between satellite and terrestrial networks can be assured through the use of standard computer equipment running software-based gateways.

  1. Enhancing Science Teaching through Performing Marbling Art Using Basic Solutions and Base Indicators

    Science.gov (United States)

    Çil, Emine; Çelik, Kevser; Maçin, Tuba; Demirbas, Gülay; Gökçimen, Özlem

    2014-01-01

    Basic solutions are an indispensable part of our daily life. Basic solutions are commonly used in industries such as the textile industry, oil refineries, the fertilizer industry, and pharmaceutical products. Most cleaning agents, such as soap, detergent, and bleach, and some of our foods, such as chocolate and eggs, include bases. Bases are the…

  2. Supply Chain-based Solution to Prevent Fuel Tax Evasion

    Energy Technology Data Exchange (ETDEWEB)

    Franzese, Oscar [ORNL; Capps, Gary J [ORNL; Daugherty, Michael [United States Department of Transportation (USDOT), Federal Highway Administration (FHWA); Siekmann, Adam [ORNL; Lascurain, Mary Beth [ORNL; Barker, Alan M [ORNL

    2016-01-01

    The primary source of funding for the United States transportation system is derived from motor fuel and other highway use taxes. The loss of revenue attributed to fuel-tax evasion has been assessed at roughly $1 billion per year, or approximately 25% of the total tax collected. Any solution that addresses this problem needs to include not only the tax-collection agencies and auditors, but also the carriers transporting oil products and the carriers' customers. This paper presents a system developed by the Oak Ridge National Laboratory for the Federal Highway Administration which has the potential to reduce or eliminate many fuel-tax evasion schemes. The solution balances the needs of tax auditors with those of the fuel-hauling companies and their customers. The technology was deployed and successfully tested during an eight-month period on a real-world fuel-hauling fleet. Day-to-day operations of the fleet were minimally affected by their interaction with this system. The results of that test are discussed in this paper.

  3. Personal health records: is rapid adoption hindering interoperability?

    Science.gov (United States)

    Studeny, Jana; Coustasse, Alberto

    2014-01-01

    The establishment of the Meaningful Use criteria has created a critical need for robust interoperability of health records. A universal definition of a personal health record (PHR) has not been agreed upon. Standardized code sets have been built for specific entities, but integration between them has not been supported. The purpose of this research study was to explore the hindrance and promotion of interoperability standards in relationship to PHRs to describe interoperability progress in this area. The study was conducted following the basic principles of a systematic review, with 61 articles used in the study. Lagging interoperability has stemmed from slow adoption by patients, creation of disparate systems due to rapid development to meet requirements for the Meaningful Use stages, and rapid early development of PHRs prior to the mandate for integration among multiple systems. Findings of this study suggest that deadlines for implementation to capture Meaningful Use incentive payments are supporting the creation of PHR data silos, thereby hindering the goal of high-level interoperability.

  4. Achieving Interoperability in GEOSS - How Close Are We?

    Science.gov (United States)

    Arctur, D. K.; Khalsa, S. S.; Browdy, S. F.

    2010-12-01

    A primary goal of the Global Earth Observation System of Systems (GEOSS) is improving the interoperability between the observational, modelling, data assimilation, and prediction systems contributed by member countries. The GEOSS Common Infrastructure (GCI) comprises the elements designed to enable discovery of and access to these diverse data and information sources. But to what degree can the mechanisms for accessing these data, and the data themselves, be considered interoperable? Will the separate efforts by Communities of Practice within GEO to build their own portals, such as for Energy, Biodiversity, and Air Quality, lead to fragmentation or synergy? What communication and leadership do we need with these communities to improve interoperability both within and across them? The Standards and Interoperability Forum (SIF) of GEO's Architecture and Data Committee has assessed progress towards achieving the goal of global interoperability and made recommendations regarding the evolution of the architecture and overall data strategy to ensure fulfillment of the GEOSS vision. This presentation will highlight the results of this study and directions for further work.

  5. Evaluating Sustainability Models for Interoperability through Brokering Software

    Science.gov (United States)

    Pearlman, Jay; Benedict, Karl; Best, Mairi; Fyfe, Sue; Jacobs, Cliff; Michener, William; Nativi, Stefano; Powers, Lindsay; Turner, Andrew

    2016-04-01

    Sustainability of software and research support systems is an element of innovation that is not often discussed. Yet, sustainment is essential if we expect research communities to make the time investment to learn and adopt new technologies. As the Research Data Alliance (RDA) is developing new approaches to interoperability, the question of uptake and sustainability is important. Brokering software sustainability is one of the areas that is being addressed in RDA. The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding source, implementation frameworks and challenges, and policy and legal considerations. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis that suggest that hybrid funding models present the most likely avenue to long term sustainability.

  6. Solution or suspension - Does it matter for lipid based systems?

    DEFF Research Database (Denmark)

    Larsen, A T; Holm, R; Müllertz, A

    2017-01-01

    In this study, the potential of co-administering an aqueous suspension with a placebo lipid vehicle, i.e. chase dosing, was investigated in rats relative to the aqueous suspension alone or a solution of the drug in the lipid vehicle. The lipid investigated in the present study was Labrafil M2125CS... and the three poorly soluble model compounds evaluated were danazol, cinnarizine and halofantrine. For cinnarizine and danazol, the oral bioavailability in rats after chase dosing, or after dosing the compound dissolved in Labrafil M2125CS, was similar and significantly higher than for the aqueous suspension... or a lower solubility in the colloidal structures formed during digestion, but other mechanisms may also be involved. The study thereby supported the potential of chase dosing as a dosing regimen in situations where it is beneficial to have a drug in the solid state, e.g. due to chemical stability...

  7. Mutagenicity of irradiated solutions of nucleic acid bases and nucleosides in Salmonella typhimurium

    International Nuclear Information System (INIS)

    Wilmer, J.; Schubert, J.

    1981-01-01

    Solutions of nucleic acid bases, nucleosides and a nucleotide, saturated with either N2, N2O or O2, were irradiated and tested for mutagenicity towards Salmonella typhimurium, with and without pre-incubation. Irradiated solutions of the nucleic acid bases were all non-mutagenic. Irradiated solutions of the nucleosides showed mutagenicity in S. typhimurium TA100 (pre-incubation assay). Generally, the mutagenicity followed the order: N2O > N2 > O2. The results show that the formation of mutagenic radiolytic products is initiated mainly by radical attack; in irradiated solutions of the nucleotide thymidine-5'-monophosphate, no mutagenicity could be detected. (orig.)

  8. Operational Interoperability Challenges on the Example of GEOSS and WIS

    Science.gov (United States)

    Heene, M.; Buesselberg, T.; Schroeder, D.; Brotzer, A.; Nativi, S.

    2015-12-01

    The following poster highlights the operational interoperability challenges using the example of the Global Earth Observation System of Systems (GEOSS) and the World Meteorological Organization Information System (WIS). At the heart of both systems is a catalogue of earth observation data, products and services, but with different metadata management concepts. While WIS applies strong governance, with its own metadata profile for hundreds of thousands of metadata records, GEOSS has adopted a more open approach for its ten million records. Furthermore, the development of WIS - as an operational system - follows a roadmap with committed downwards compatibility, while the GEOSS development process is more agile. The poster discusses how interoperability can be achieved despite these different metadata management concepts, and how a proxy concept helps to couple two systems that follow different development methodologies. Furthermore, the poster highlights the importance of monitoring and backup concepts as a verification method for operational interoperability.

  9. Combination of graph heuristics in producing initial solution of curriculum based course timetabling problem

    Science.gov (United States)

    Wahid, Juliana; Hussin, Naimah Mohd

    2016-08-01

    The construction of a population of initial solutions is a crucial task in population-based metaheuristic approaches for solving the curriculum-based university course timetabling problem, because it can affect both the convergence speed and the quality of the final solution. This paper explores combinations of graph heuristics in the construction approach for the curriculum-based course timetabling problem to produce a population of initial solutions. The graph heuristics were set as single heuristics and as combinations of two heuristics. In addition, several ways of assigning courses to rooms and timeslots were implemented. All heuristic settings were then tested on the same curriculum-based course timetabling problem instances and compared with each other in terms of the number of initial solutions produced. The results show that the combination of saturation degree followed by largest degree produces the highest number of initial solutions. The results from this study can be used in the improvement phase of algorithms that operate on a population of initial solutions.
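
    As an illustration of the ordering reported above, the following minimal Python sketch orders unscheduled courses by saturation degree, breaking ties by largest degree, and then assigns timeslots greedily. The data structures and the clash-handling rule are assumptions made for the sketch, not the authors' implementation.

      # Illustrative ordering by saturation degree, ties broken by largest degree.
      def order_courses(conflicts, assigned_slots):
          """conflicts: dict course -> set of conflicting courses.
          assigned_slots: dict course -> timeslot for already scheduled courses."""
          unscheduled = [c for c in conflicts if c not in assigned_slots]

          def saturation(c):
              # number of distinct timeslots already used by conflicting courses
              return len({assigned_slots[n] for n in conflicts[c] if n in assigned_slots})

          def degree(c):
              return len(conflicts[c])

          # highest saturation first; ties broken by largest degree
          return sorted(unscheduled, key=lambda c: (-saturation(c), -degree(c)))

      def greedy_initial_solution(conflicts, n_slots):
          """Builds one (possibly partial) initial assignment greedily."""
          assigned = {}
          while len(assigned) < len(conflicts):
              course = order_courses(conflicts, assigned)[0]
              used = {assigned[n] for n in conflicts[course] if n in assigned}
              free = [s for s in range(n_slots) if s not in used]
              # -1 marks an unresolvable clash to be repaired in a later phase;
              # room assignment, also studied in the paper, is omitted here
              assigned[course] = free[0] if free else -1
          return assigned

      if __name__ == "__main__":
          conflicts = {"C1": {"C2", "C3"}, "C2": {"C1"}, "C3": {"C1"}}
          print(greedy_initial_solution(conflicts, n_slots=3))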

  10. Business intelligence and capacity planning: web-based solutions.

    Science.gov (United States)

    James, Roger

    2010-07-01

    Income (activity) and expenditure (costs) form the basis of a modern hospital's 'business intelligence'. However, clinical engagement in business intelligence is patchy. This article describes the principles of business intelligence and outlines some recent developments using web-based applications.

  11. Location-based solutions in the Experience centre

    DEFF Research Database (Denmark)

    Witzner Hansen, Dan; Alapetite, Alexandre Philippe Bernard; Holdgaard, Nanna

    2008-01-01

    In this paper we present a prototype system for location-based guiding. A user survey has been conducted and the observations are used to support design choices. The prototype allows for both indoor and outdoor navigation at and in the vicinity of the experience centre NaturBornholm1 in Denmark...... using a combination of Bluetooth, GPS and QRcodes. Bluetooth and GPS are used for location-based information and QR-codes are used to convey user preferences....

  12. Location-based solutions in the Experience centre

    DEFF Research Database (Denmark)

    Hansen, Dan Witzner; Alapetite, Alexandre; Holdgaard, Nanna

    In this paper we present a prototype system for location-based guiding. A user survey has been conducted and the observations are used to support design choices. The prototype allows for both indoor and outdoor navigation at and in the vicinity of the experience centre NaturBornholm1 in Denmark...... using a combination of Bluetooth, GPS and QR-codes. Bluetooth and GPS are used for location-based information and QR-codes are used to convey user preferences....

  13. 75 FR 66752 - Smart Grid Interoperability Standards; Notice of Technical Conference

    Science.gov (United States)

    2010-10-29

    ... Energy Regulatory Commission Smart Grid Interoperability Standards; Notice of Technical Conference... regulatory authorities that also are considering the adoption of Smart Grid Interoperability Standards.../FERC Collaborative on Smart Response (Collaborative), in the International D Ballroom at the Omni Hotel...

  14. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  15. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  16. Report on the IFIP WG5.8 International Workshop on Enterprise Interoperability (IWEI 2008)

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Johnson, P.; Kutvonen, L.

    2008-01-01

    Enterprise interoperability is a growing research topic, rooted in various sub-disciplines from computer science and business management. Enterprise interoperability addresses intra- and inter-organizational collaboration and is characterized by the objective of aligning business level and

  17. Requirements for and barriers towards interoperable ehealth technology in primary care

    NARCIS (Netherlands)

    Oude Nijeweme-d'Hollosy, Wendeline; van Velsen, Lex Stefan; Huygens, Martine; Hermens, Hermanus J.

    Despite eHealth technology's rapid growth, eHealth applications are rarely embedded within primary care, mostly because systems lack interoperability. This article identifies requirements for, and barriers towards, interoperable eHealth technology from healthcare professionals' perspective -- the

  18. Exploring NASA GES DISC Data with Interoperable Services

    Science.gov (United States)

    Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey

    2015-01-01

    Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services. Open-standard and interoperable services: improve data discoverability, accessibility, and usability with metadata, catalogue and portal standards; achieve data, information and knowledge sharing across applications with standardized interfaces and protocols. Open Geospatial Consortium (OGC) data services and specifications: Web Coverage Service (WCS) -- data; Web Map Service (WMS) -- pictures of data; Web Map Tile Service (WMTS) -- pictures of data tiles; Styled Layer Descriptors (SLD) -- rendered styles.
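
    As a generic illustration of how such OGC services are typically consumed, the Python sketch below requests a map image from a WMS endpoint using the OWSLib library; the service URL and layer choice are placeholders, not verified GES DISC endpoints.

      # Hedged example: fetching a map image from an OGC WMS endpoint with OWSLib.
      # The URL below is a placeholder, not an actual GES DISC service address.
      from owslib.wms import WebMapService

      wms = WebMapService("https://example.gov/wms", version="1.1.1")  # hypothetical endpoint
      print(list(wms.contents))                 # layers advertised by the service

      img = wms.getmap(
          layers=[list(wms.contents)[0]],       # first advertised layer, for illustration
          srs="EPSG:4326",
          bbox=(-180, -90, 180, 90),
          size=(1024, 512),
          format="image/png",
          transparent=True,
      )
      with open("map.png", "wb") as f:
          f.write(img.read())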

  19. Interoperation of World-Wide Production e-Science Infrastructures

    CERN Document Server

    Riedel, M; Soddemann, T; Field, L; Navarro, JP; Casey, J; Litmaath, M; Baud, J; Koblitz, B; Catlett, C; Skow, D; Wang, S; Saeki, Y; Sato, H; Matsuoka, S; Geddes, N

    Many production Grid and e-Science infrastructures have begun to offer services to end-users during the past several years with an increasing number of scientific applications that require access to a wide variety of resources and services in multiple Grids. Therefore, the Grid Interoperation Now—Community Group of the Open Grid Forum—organizes and manages interoperation efforts among those production Grid infrastructures to reach the goal of a world-wide Grid vision on a technical level in the near future. This contribution highlights fundamental approaches of the group and discusses open standards in the context of production e-Science infrastructures.

  20. Improved semantic interoperability for content reuse through knowledge organization systems

    Directory of Open Access Journals (Sweden)

    José Antonio Moreiro González

    2012-04-01

    Full Text Available The Knowledge Organization Systems (KOS) are resources designed to improve knowledge interoperability, management and retrieval. As web resources grow, the lack of KOS becomes evident, with a consequent impact on resource interoperability. KOS are, by definition, complicated and costly tools, as much in their creation as in their management. The reuse of similar organizational structures is therefore a necessary element in this context. The paper analyses experiences of KOS reuse and indicates how the new standards bear on this aspect.

  1. A method for valuing architecture-based business transformation and measuring the value of solutions architecture

    OpenAIRE

    Slot, R.G.

    2010-01-01

    Enterprise and Solution Architecture are key in today’s business environment. It is surprising, then, that the foundation and business case for these activities are largely nonexistent; the financial value of these activities for the business is largely undetermined. To determine the business value of enterprise and solution architecture, this thesis shows how to measure and quantify, in business terms, the value of enterprise-architecture-based business transformation and the value of solution architecture.

  2. The impact of interoperability of electronic health records on ambulatory physician practices: a discrete-event simulation study

    Directory of Open Access Journals (Sweden)

    Yuan Zhou

    2014-02-01

    Full Text Available Background The effect of health information technology (HIT) on efficiency and workload among clinical and nonclinical staff has been debated, with conflicting evidence about whether electronic health records (EHRs) increase or decrease effort. None of this research to date, however, examines the effect of interoperability quantitatively using discrete-event simulation techniques. Objective To estimate the impact of EHR systems with various levels of interoperability on the day-to-day tasks and operations of ambulatory physician offices. Methods Interviews and observations were used to collect workflow data from 12 adult primary and specialty practices. A discrete-event simulation model was constructed to represent patient flows and the clinical and administrative tasks of physicians and staff members. Results High levels of EHR interoperability were associated with reduced time spent by providers on four tasks: preparing lab reports, requesting lab orders, prescribing medications, and writing referrals. The implementation of an EHR was associated with less time spent by administrators but more time spent by physicians, compared with time spent at paper-based practices. In addition, the presence of EHRs and of interoperability did not significantly affect the time usage of registered nurses or the total visit time and waiting time of patients. Conclusion This paper suggests that the impact of using HIT on the work efficiency of clinical and nonclinical staff varies; overall, however, it appears to improve time efficiency more for administrators than for physicians and nurses.
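
    As a generic illustration of the discrete-event simulation approach mentioned above, the following SimPy sketch models a single physician serving arriving patients, with an extra manual lab-report step when the EHR is not interoperable. All task times and the workflow itself are invented for the sketch and are not the study's model.

      # Toy discrete-event simulation of patient visits (illustrative parameters only).
      import random
      import simpy

      VISIT_MINUTES = 15      # assumed mean face-to-face time
      LAB_TASK_MINUTES = 4    # assumed manual lab-report handling without interoperability
      ARRIVAL_GAP = 20        # assumed mean minutes between arrivals

      def patient(env, name, physician, interoperable):
          arrive = env.now
          with physician.request() as req:
              yield req                                           # wait for the physician
              yield env.timeout(random.expovariate(1.0 / VISIT_MINUTES))
              if not interoperable:                               # extra manual step
                  yield env.timeout(LAB_TASK_MINUTES)
          print(f"{name}: total time {env.now - arrive:.1f} min")

      def arrivals(env, physician, interoperable):
          i = 0
          while True:
              yield env.timeout(random.expovariate(1.0 / ARRIVAL_GAP))
              i += 1
              env.process(patient(env, f"patient{i}", physician, interoperable))

      random.seed(42)
      env = simpy.Environment()
      physician = simpy.Resource(env, capacity=1)
      env.process(arrivals(env, physician, interoperable=True))
      env.run(until=8 * 60)    # one simulated 8-hour clinic day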

  3. Solution of the weighted symmetric similarity transformations based on quaternions

    Science.gov (United States)

    Mercan, H.; Akyilmaz, O.; Aydin, C.

    2017-12-01

    A new method based on the Gauss-Helmert model of adjustment is presented for the solution of similarity transformations, either 3D or 2D, in the frame of the errors-in-variables (EIV) model. The EIV model assumes that all the variables in the mathematical model are contaminated by random errors. The total least squares estimation technique may be used to solve the EIV model. Accounting for the heteroscedastic uncertainty in both the target and the source coordinates, which is the more common and general case in practice, leads to a more realistic estimation of the transformation parameters. The presented algorithm can handle heteroscedastic transformation problems, i.e., the positions of both the target and the source points may have full covariance matrices. Therefore, there is no limitation such as isotropic or homogeneous accuracy for the reference point coordinates. The developed algorithm takes advantage of the quaternion definition, which uniquely represents a 3D rotation matrix. The transformation parameters - scale, translations, and the quaternion (and hence the rotation matrix) - along with their covariances, are iteratively estimated with rapid convergence. Moreover, a prior least squares (LS) estimate of the unknown transformation parameters is not required to start the iterations. We also show that the developed method can be used to estimate 2D similarity transformation parameters by simply treating the problem as a 3D transformation with zero (0) values assigned to the z-components of both target and source points. The efficiency of the new algorithm is demonstrated with numerical examples and comparisons with the results of previous studies that use the same data set. Simulation experiments for the evaluation and comparison of the proposed method and the conventional weighted LS (WLS) method are also presented.
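
    For reference, the quaternion-parameterised similarity model the abstract refers to has the standard textbook form below, written in LaTeX notation; this is the generic formulation, not necessarily the authors' exact notation.

      % Unit quaternion q = (q_0, q_1, q_2, q_3), ||q|| = 1, scale s, translation t:
      \mathbf{x}_t = s\,\mathbf{R}(q)\,\mathbf{x}_s + \mathbf{t}, \qquad
      \mathbf{R}(q) =
      \begin{pmatrix}
        q_0^2+q_1^2-q_2^2-q_3^2 & 2(q_1 q_2 - q_0 q_3)    & 2(q_1 q_3 + q_0 q_2) \\
        2(q_1 q_2 + q_0 q_3)    & q_0^2-q_1^2+q_2^2-q_3^2 & 2(q_2 q_3 - q_0 q_1) \\
        2(q_1 q_3 - q_0 q_2)    & 2(q_2 q_3 + q_0 q_1)    & q_0^2-q_1^2-q_2^2+q_3^2
      \end{pmatrix}

    Setting the z-components of all target and source points to zero, as the abstract notes, reduces this model to the 2D similarity transformation.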

  4. Myocardial protection against global ischemia with Krebs-Henseleit buffer-based cardioplegic solution.

    Science.gov (United States)

    Minasian, Sarkis M; Galagudza, Michael M; Dmitriev, Yuri V; Kurapeev, Dmitry I; Vlasov, Timur D

    2013-04-02

    The Krebs-Henseleit buffer is the best perfusion solution for isolated mammalian hearts. We hypothesized that a Krebs-Henseleit buffer-based cardioplegic solution might provide better myocardial protection than well-known crystalloid cardioplegic solutions because of its optimal electrolyte and glucose levels, presence of buffer systems, and mild hyperosmolarity. Isolated Langendorff-perfused rat hearts were subjected to either global ischemia without cardioplegia (controls) or cardioplegic arrest for either 60 or 180 min, followed by 120 min of reperfusion. The modified Krebs-Henseleit buffer-based cardioplegic solution (mKHB) and St. Thomas' Hospital solution No. 2 (STH2) were studied. During global ischemia, the temperatures of the heart and the cardioplegic solutions were maintained at either 37°C (60 min of ischemia) or 22°C (moderate hypothermia, 180 min of ischemia). Hemodynamic parameters were registered throughout the experiments. The infarct size was determined through histochemical examination. Cardioplegia with the mKHB solution at moderate hypothermia resulted in a minimal infarct size (5 ± 3%) compared to that in the controls and with the STH2 solution (35 ± 7% and 19 ± 9%, respectively). The Krebs-Henseleit buffer-based cardioplegic solution might therefore be superior to the standard crystalloid solution (STH2).

  5. Location-based solutions in the Experience centre

    DEFF Research Database (Denmark)

    Hansen, Dan Witzner; Alapetite, Alexandre; Holdgaard, Nanna

    2009-01-01

    In this paper we present a prototype system for location-based guiding. A user survey has been conducted and the observations are used to support design choices. The prototype allows for both indoor and outdoor navigation at and in the vicinity of the NaturBornholm' experience centre in Denmark...... using a combination of Bluetooth, Near Field Communication (NFC), GPS and QR codes. Bluetooth, NFC and GPS are used for location-based information and QR codes are used to convey user preferences....

  6. Location-based solutions in the experience center

    DEFF Research Database (Denmark)

    Witzner Hansen, Dan; Alapetite, Alexandre; Holdgaard, Nanna

    2009-01-01

    In this paper we present a prototype system for location-based guiding. A user survey has been conducted and the observations are used to support design choices. The prototype allows for both indoor and outdoor navigation at and in the vicinity of the NaturBornholm [1] experience centre in Denmark...... using a combination of Bluetooth, Near field communication (NFC), GPS and QR-codes. Bluetooth, NFC, and GPS are used for location-based information and QR-codes are used to convey user preferences. [1] http://naturbornholm.dk...

  7. Location-based solutions in the experience center

    DEFF Research Database (Denmark)

    Witzner Hansen, Dan; Alapetite, Alexandre; Holdgaard, Nanna

    2009-01-01

    In this paper we present a prototype system for location-based guiding. A user survey has been conducted and the observations are used to support design choices. The prototype allows for both indoor and outdoor navigation at and in the vicinity of the NaturBornholm [1] experience centre in Denmar...... using a combination of Bluetooth, Near field communication (NFC), GPS and QR-codes. Bluetooth, NFC, and GPS are used for location-based information and QR-codes are used to convey user preferences. [1] http://naturbornholm.dk...

  8. Capturing Sensor Metadata for Cross-Domain Interoperability

    Science.gov (United States)

    Fredericks, J.

    2015-12-01

    Envision a world where a field operator turns on an instrument, and is queried for information needed to create standardized encoded descriptions that, together with the sensor manufacturer knowledge, fully describe the capabilities, limitations and provenance of observational data. The Cross-Domain Observational Metadata Environmental Sensing Network (X-DOMES) pilot project (with support from the NSF/EarthCube IA) is taking the first steps needed in realizing this vision. The knowledge of how an observable physical property becomes a measured observation must be captured at each stage of its creation. Each sensor-based observation is made through the use of applied technologies, each with specific limitations and capabilities. Environmental sensors typically provide a variety of options that can be configured differently for each unique deployment, affecting the observational results. By capturing the information (metadata) at each stage of its generation, a more complete and accurate description of data provenance can be communicated. By documenting the information in machine-harvestable, standards-based encodings, metadata can be shared across disciplinary and geopolitical boundaries. Using standards-based frameworks enables automated harvesting and translation to other community-adopted standards, which facilitates the use of shared tools and workflows. The establishment of a cross-domain network of stakeholders (sensor manufacturers, data providers, domain experts, data centers), called the X-DOMES Network, provides a unifying voice for the specification of content and implementation of standards, as well as a central repository for sensor profiles, vocabularies, guidance and product vetting. The ability to easily share fully described observational data provides a better understanding of data provenance and enables the use of common data processing and assessment workflows, fostering a greater trust in our shared global resources. The X-DOMES Network

  9. Studying boat-based bear viewing: Methodological challenges and solutions

    Science.gov (United States)

    Sarah Elmeligi

    2007-01-01

    Wildlife viewing, a growing industry throughout North America, holds much potential for increased revenue and public awareness regarding species conservation. In Alaska and British Columbia, grizzly bear (Ursus arctos) viewing is becoming more popular, attracting tourists from around the world. Viewing is typically done from a land-based observation...

  10. People counting with stereo cameras : two template-based solutions

    NARCIS (Netherlands)

    Englebienne, Gwenn; van Oosterhout, Tim; Kröse, B.J.A.

    2012-01-01

    People counting is a challenging task with many applications. We propose a method with a fixed stereo camera that is based on projecting a template onto the depth image. The method was tested on a challenging outdoor dataset with good results and runs in real time.

  11. Earth Based Views of Solute Profiles on Mars (Invited)

    Science.gov (United States)

    Amundson, R.

    2013-12-01

    'Historical accounts of planetary evolution are mostly written in stone' (1), but the last chapter of that history is embedded in its soil. Soil properties reflect the effects of prevailing environmental boundary conditions. Solute profiles are powerful indicators of the direction and magnitude of water flow. I briefly review the chemistry of salt profiles from deserts formed by upward vs. downward migrating water, and use this as a basis for interpreting aspects of Mars' hydrological history. The Noachian-aged Meridiani Planum land surface is exposed in the Endurance and Victoria Craters. These craters have been estimated to be ~ craters and the pre-excavation alteration of the landscape by aqueous processes. Crater profiles include APXS 'as is' (fresh surface), brushed, and RAT'd samples. Using RAT'd samples as a baseline, the gains and losses of elements in the surficial samples can be assessed (Fig. 1). The calculations reveal similar trends of surface alteration within a crater (Victoria) and between two craters (Fig. 1). The as-is samples are enriched in Na2O, Al2O3, CaO, and Br (and depleted in MgO, SO3, Cl, K2O, MnO, FeO) relative to the RAT'd material. Brushing drastically reduces these differences. These data show that the alteration is very surficial. The RAT'd samples appear to represent pre-impact chemical profiles of the sediment (Fig. 2). It has previously been reported that the upper ~1 m at Victoria has been visibly altered by diagenesis (3). Both Endurance (4) and Victoria craters have remarkably similar depth profiles (relative to the lowest sampling point) of SO3, Cl, and Br. The salt profiles, combined with observations of physical alteration, suggest modest pedogenic alteration of the land surface sometime prior to impact. The sequence of the SO3 and Cl is consistent only with downward aqueous transport, as clearly illustrated by comparison to Earth soils that form by groundwater evaporation vs. downward moving meteoric water. While the total water

  12. Solution immersed silicon (SIS)-based biosensors: a new approach in biosensing.

    Science.gov (United States)

    Diware, M S; Cho, H M; Chegal, W; Cho, Y J; Jo, J H; O, S W; Paek, S H; Yoon, Y H; Kim, D

    2015-02-07

    A novel, solution immersed silicon (SIS)-based sensor has been developed which employs the non-reflecting condition (NRC) for a p-polarized wave. The SIS sensor's response is almost independent of change in the refractive index (RI) of a buffer solution (BS) which makes it capable of measuring low-concentration and/or low-molecular-weight compounds.

  13. An integrable, web-based solution for easy assessment of video-recorded performances

    DEFF Research Database (Denmark)

    Subhi, Yousif; Todsen, Tobias; Konge, Lars

    2014-01-01

    , and access to this information should be restricted to select personnel. A local software solution may also ease the need for customization to local needs and integration into existing user databases or project management software. We developed an integrable web-based solution for easy assessment of video...

  14. Solution-Focused Therapy: Strength-Based Counseling for Children with Social Phobia

    Science.gov (United States)

    George, Cindy M.

    2008-01-01

    Solution-focused therapy is proposed as an effective strength-based model for children with social phobia. Social phobia is described along with the etiology and prevailing treatment approaches. A case illustration demonstrates the application of solution-focused therapy with a child who experienced social phobia. Implications for counseling and…

  15. 78 FR 40474 - Sustaining Power Solutions LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Science.gov (United States)

    2013-07-05

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Sustaining Power Solutions LLC; Supplemental Notice That Initial Market... in the above-referenced proceeding, of Sustaining Power Solutions LLC's application for market-based...

  16. Model-based fuzzy control solutions for a laboratory Antilock Braking System

    DEFF Research Database (Denmark)

    Precup, Radu-Emil; Spataru, Sergiu; Rǎdac, Mircea-Bogdan

    2010-01-01

    This paper gives two original model-based fuzzy control solutions dedicated to the longitudinal slip control of Antilock Braking System laboratory equipment. The parallel distributed compensation leads to linear matrix inequalities which guarantee the global stability of the fuzzy control systems....... Real-time experimental results validate the new fuzzy control solutions....

  17. 75 FR 63462 - Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid...

    Science.gov (United States)

    2010-10-15

    ... Energy Regulatory Commission Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid Interoperability Standards October 7, 2010. 1. The Energy Independence and Security Act of... interoperability of smart grid devices and systems, including protocols and model standards for information...

  18. Pemanfaatan Google API Untuk Model Interoperability Web Berbasis PHP Dengan Google Drive

    OpenAIRE

    Sumiari, Ni Kadek

    2015-01-01

    In a website-based system, achieving interoperability is very important. The use of databases based on MySQL, SQL Server or Oracle is already very common in website-based systems. However, using such databases cannot guarantee that the interoperability of the system will be achieved. Apart from data security, the system is also quite difficult to implement. One solution for achieving the interoperability of a website-based system is...

  19. Interoperability of Services in an Open Broadband Market : Cases from the Netherlands

    NARCIS (Netherlands)

    Burgmeijer, J.

    2006-01-01

    End-to-end interoperability of broadband services and networks is a condition for an open broadband market. A business model for broadband service interoperability is given. Two cases from the Netherlands, of initiatives from the market to reach interoperability, are presented: E-norm and FIST VoIP.

  20. Image Based Solution to Occlusion Problem for Multiple Robots Navigation

    Directory of Open Access Journals (Sweden)

    Taj Mohammad Khan

    2012-04-01

    Full Text Available In machine vision, the occlusion problem remains a challenging issue in image-based mapping and navigation tasks. This paper presents a multiple-view, vision-based algorithm for the development of an occlusion-free map of an indoor environment. The map is assumed to be utilized by the mobile robots within the workspace. It has a wide range of applications, including mobile robot path planning and navigation, access control in restricted areas, and surveillance systems. We used a wall-mounted fixed camera system. After intensity adjustment and background subtraction of the synchronously captured images, image registration was performed. We applied our algorithm to the registered images to resolve the occlusion problem. This technique works well even in the presence of total occlusion for a longer period.
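
    As a generic illustration of the background-subtraction and image-registration steps mentioned above, the OpenCV sketch below combines foreground masks from two registered views; the feature-matching details and the file names are placeholders, not the authors' pipeline.

      # Illustrative sketch: background subtraction plus homography-based registration
      # of two camera views (placeholder file names; not the authors' exact algorithm).
      import cv2
      import numpy as np

      def foreground_mask(frame, background, thresh=30):
          """Classic background subtraction: absolute difference, threshold, clean-up."""
          diff = cv2.absdiff(frame, background)
          _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
          kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
          return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

      def homography(view_a, view_b):
          """Estimate a homography mapping view_b onto view_a from ORB feature matches."""
          orb = cv2.ORB_create(2000)
          kp_a, des_a = orb.detectAndCompute(view_a, None)
          kp_b, des_b = orb.detectAndCompute(view_b, None)
          matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_b, des_a)
          src = np.float32([kp_b[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
          dst = np.float32([kp_a[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
          H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
          return H

      if __name__ == "__main__":
          a = cv2.imread("camera_a.png", cv2.IMREAD_GRAYSCALE)          # placeholder files
          a_bg = cv2.imread("camera_a_empty.png", cv2.IMREAD_GRAYSCALE)
          b = cv2.imread("camera_b.png", cv2.IMREAD_GRAYSCALE)
          b_bg = cv2.imread("camera_b_empty.png", cv2.IMREAD_GRAYSCALE)
          H = homography(a, b)
          size = (a.shape[1], a.shape[0])
          b_in_a = cv2.warpPerspective(b, H, size)
          b_bg_in_a = cv2.warpPerspective(b_bg, H, size)
          # merging masks from both registered views fills regions occluded in one view
          combined = cv2.bitwise_or(foreground_mask(a, a_bg),
                                    foreground_mask(b_in_a, b_bg_in_a))
          cv2.imwrite("occlusion_free_mask.png", combined)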

  1. A Study on Pig Slaughter Traceability Solution Based on RFID

    OpenAIRE

    Luo, Qingyao; Xiong, Benhai; Geng, Zhi; Yang, Liang; Pan, Jiayi

    2010-01-01

    International audience; Due to assembly-line production and poor environmental conditions in pig slaughterhouses, the collection of slaughter tracing information is not a simple task. Based on UHF radio frequency identification (RFID) technologies, this study designed an RFID tag for carcasses, an RS232-PS2 data conversion line and data norms such as the RFID carcass tag and partition meat label norm, and developed an online reading and writing system for RFID tags, accomplished RFID identifi...

  2. Point based graphics rendering with unified scalability solutions.

    OpenAIRE

    Bull, L.

    2006-01-01

    Standard real-time 3D graphics rendering algorithms use brute force polygon rendering, with complexity linear in the number of polygons and little regard for limiting processing to data that contributes to the image. Modern hardware can now render smaller scenes to pixel levels of detail, relaxing surface connectivity requirements. Sub-linear scalability optimizations are typically self-contained, requiring specific data structures, without shared functions and data. A new point based renderi...

  3. Whispering Gallery Mode Based Optical Fiber Sensor for Measuring Concentration of Salt Solution

    Directory of Open Access Journals (Sweden)

    Chia-Chin Chiang

    2013-01-01

    Full Text Available An optical fiber solution-concentration sensor based on the whispering gallery mode (WGM) is proposed in this paper. The WGM solution-concentration sensors were used to measure salt solutions in which the concentrations ranged from 1% to 25%, with the resonance wavelength drifting from left to right. The experimental results showed an average sensitivity of approximately 0.372 nm/% and an R2 linearity of 0.8835. The proposed WGM sensors are low cost, feasible for mass production, and durable for solution-concentration sensing.
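
    The reported figures imply a simple linear calibration, written here in LaTeX notation for reference; the intercept lambda_0 (the resonance wavelength at zero concentration) is introduced for the sketch and is not given in the record.

      \lambda(C) \approx \lambda_0 + S\,C, \qquad
      S \approx 0.372~\mathrm{nm}/\%, \qquad
      R^2 \approx 0.8835 \quad (1\% \le C \le 25\%)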

  4. Preparation and evaluation of HPMC-based pirfenidone solution in vivo.

    Science.gov (United States)

    Yang, Mei; Yang, Yang-Fan; Lei, Ming; Ye, Cheng-Tian; Zhao, Chun-Shun; Xu, Jian-Gang; Wu, Kai-Li; Yu, Min-Bin

    2017-01-01

    Pirfenidone (PFD) has exhibited therapeutic potential in the treatment of cell proliferative disorders. The 0.5% water-based PFD eye drops previously developed by our team exhibited antiscarring effectiveness and ocular safety, but with the limitations of a short half-life and poor bioavailability. To increase the bioavailability of the water-based PFD eye drops, we prepared a viscous solution by adding hydroxypropyl methylcellulose (HPMC, F4M), which acted as a viscosity enhancer. Subsequently, we compared the HPMC-based PFD solution with the water-based PFD eye drops. A PFD solution with 1% HPMC (w/v) was prepared, and its viscosities at different shear rates were measured to investigate its rheology. PFD concentrations in the tears, aqueous humor, conjunctiva, cornea, and sclerae of New Zealand rabbits were detected at different time points with high-performance liquid chromatography (HPLC) following a single instillation of the 0.5% (w/v) water-based PFD eye drops or the HPMC-based solution. Compared with the 0.5% water-based PFD eye drops, the HPMC-based solution increased the PFD levels in tears and prolonged the residence time from 10 to more than 20 min; the HPMC-based solution thus exhibited the higher bioavailability.

  5. Datacube Interoperability, Encoding Independence, and Analytics

    Science.gov (United States)

    Baumann, Peter; Hirschorn, Eric; Maso, Joan

    2017-04-01

    Datacubes are commonly accepted as an enabling paradigm which provides a handy abstraction for accessing and analyzing the zillions of image files delivered by the manifold satellite instruments and climate simulations, among others. Additionally, datacubes are the classic model for statistical and OLAP datacubes, so a further information category can be integrated. From a standards perspective, spatio-temporal datacubes naturally are included in the concept of coverages which encompass regular and irregular grids, point clouds, and general meshes - or, more abstractly, digital representations of spatio-temporally varying phenomena. ISO 19123, which is identical to OGC Abstract Topic 6, gives a high-level abstract definition which is complemented by the OGC Coverage Implementation Schema (CIS) which is an interoperable, yet format independent concretization of the abstract model. Currently, ISO is working on adopting OGC CIS as ISO 19123-2; the existing ISO 19123 standard is under revision by one of the abstract authors and will become ISO 19123-1. The roadmap agreed by ISO further foresees adoption of the OGC Web Coverage Service (WCS) as an ISO standard so that a complete data and service model will exist. In 2016, INSPIRE has adopted WCS as Coverage Download Service, including the datacube analytics language Web Coverage Processing Service (WCPS). The rasdaman technology (www.rasdaman.org) is both OGC and INSPIRE Reference Implementation. In the global EarthServer initiative rasdaman database sizes are exceeding 250 TB today, heading for the Petabyte frontier well in 2017. Technically, CIS defines a compact, efficient model for representing multi-dimensional datacubes in several ways. The classical coverage cube defines a domain set (where are values?), a range set (what are these values?), and range type (what do the values mean?), as well as a "bag" for arbitrary metadata. With CIS 1.1, coordinate/value pair sequences have been added, as well as tiled
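
    As a generic illustration of the datacube analytics language mentioned above, the Python sketch below posts a WCPS query to a WCS endpoint following the OGC WCS processing-extension request pattern that rasdaman supports; the server URL, coverage name and time-axis label are placeholders, and the query text is only indicative of WCPS style.

      # Hedged sketch: sending a WCPS query to a WCS endpoint (placeholder URL and coverage).
      import requests

      ENDPOINT = "https://example.org/rasdaman/ows"     # hypothetical service
      QUERY = """
      for $c in (ExampleCoverage)
      return encode($c[ansi("2017-01-01")], "image/png")
      """

      resp = requests.get(ENDPOINT, params={
          "service": "WCS",
          "version": "2.0.1",
          "request": "ProcessCoverages",
          "query": QUERY,
      })
      resp.raise_for_status()
      with open("slice.png", "wb") as f:
          f.write(resp.content)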

  6. Interoperable Access to NCAR Research Data Archive Collections

    Science.gov (United States)

    Schuster, D.; Ji, Z.; Worley, S. J.; Manross, K.

    2014-12-01

    The National Center for Atmospheric Research (NCAR) Research Data Archive (RDA) provides free access to 600+ observational and gridded dataset collections. The RDA is designed to support atmospheric and related sciences research, updated frequently where datasets have ongoing production, and serves data to 10,000 unique users annually. The traditional data access options include web-based direct archive file downloads, user selected data subsets and format conversions produced by server-side computations, and client and cURL-based APIs for routine scripted data retrieval. To enhance user experience and utility, the RDA now also offers THREDDS Data Server (TDS) access for many highly valued dataset collections. TDS offered datasets are presented as aggregations, enabling users to access an entire dataset collection, that can be comprised of 1000's of files, through a single virtual file. The OPeNDAP protocol, supported by the TDS, allows compatible tools to open and access these virtual files remotely, and make the native data file format transparent to the end user. The combined functionality (TDS/OPeNDAP) gives users the ability to browse, select, visualize, and download data from a complete dataset collection without having to transfer archive files to a local host. This presentation will review the TDS basics and describe the specific TDS implementation on the RDA's diverse archive of GRIB-1, GRIB-2, and gridded NetCDF formatted dataset collections. Potential future TDS implementation on in-situ observational dataset collections will be discussed. Illustrative sample cases will be used to highlight the end users benefits from this interoperable data access to the RDA.
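
    As a generic illustration of the OPeNDAP access pattern described above, the following Python sketch opens a remote TDS aggregation with xarray and extracts a subset; the dataset URL and variable name are placeholders, not actual RDA paths.

      # Hedged sketch: opening a remote TDS/OPeNDAP aggregation with xarray
      # (placeholder URL and variable name; requires the netCDF4 backend).
      import xarray as xr

      URL = "https://example.edu/thredds/dodsC/some/aggregation"   # hypothetical
      ds = xr.open_dataset(URL)          # no file download; data are read lazily

      print(ds)                          # browse variables and coordinates
      subset = ds["air_temperature"].sel(   # placeholder variable name
          time=slice("2000-01-01", "2000-12-31"),
          lat=slice(60, 30),             # slice order depends on coordinate direction
          lon=slice(230, 300),
      )
      subset.to_netcdf("subset.nc")      # only the selected region is transferred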

  7. Fast Wideband Solutions Obtained Using Model Based Parameter Estimation with Method of Moments

    Directory of Open Access Journals (Sweden)

    F. Kaburcuk

    2017-10-01

    Full Text Available Integration of the Model Based Parameter Estimation (MBPE) technique into the Method of Moments (MOM) provides fast solutions over a wide frequency band for radiation and scattering problems. The MBPE technique uses a Padé rational function to approximate the solution over a wide frequency band from a solution at a fixed frequency. In this paper, the MBPE technique with MOM is applied to a thin-wire antenna. The solutions obtained by repeated MOM simulations agree very well with the solutions obtained by the MBPE technique in a single simulation. Therefore, the MBPE technique, compared to MOM alone, provides a remarkable saving in computation time. Computed results show that solutions over a wide frequency band of interest are achieved in a single simulation.
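
    As a generic illustration of the rational-function idea behind MBPE, the NumPy sketch below fits a Padé model to a few frequency samples and evaluates it across the band; the sample data and model orders are arbitrary, and a full MBPE implementation derives the coefficients from the MoM system itself rather than from sampled responses.

      # Illustrative Padé (rational-function) fit over a frequency band, in the spirit of MBPE.
      # Model: f(s) ~ (a0 + a1*s + ... + aL*s^L) / (1 + b1*s + ... + bM*s^M)
      import numpy as np

      def fit_pade(freqs, values, L=3, M=3):
          """Linear least-squares fit of Padé coefficients to sampled data."""
          A_num = np.vander(freqs, L + 1, increasing=True)          # 1, s, ..., s^L
          A_den = np.vander(freqs, M + 1, increasing=True)[:, 1:]   # s, ..., s^M
          # f * (1 + sum b_k s^k) = sum a_k s^k  ->  linear in (a, b)
          A = np.hstack([A_num, -values[:, None] * A_den])
          coeffs, *_ = np.linalg.lstsq(A, values, rcond=None)
          return coeffs[: L + 1], np.concatenate([[1.0], coeffs[L + 1:]])

      def eval_pade(a, b, freqs):
          # coefficients are stored lowest-order first, so reverse them for polyval
          return np.polyval(a[::-1], freqs) / np.polyval(b[::-1], freqs)

      if __name__ == "__main__":
          f = np.linspace(0.1, 1.0, 9)                 # arbitrary sample frequencies
          resp = 1.0 / (1.0 + (f - 0.55) ** 2 / 0.01)  # synthetic resonance-like response
          a, b = fit_pade(f, resp)
          band = np.linspace(0.1, 1.0, 200)
          exact = 1.0 / (1.0 + (band - 0.55) ** 2 / 0.01)
          print("max fit error:", np.max(np.abs(eval_pade(a, b, band) - exact)))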

  8. Empirical agent-based modelling challenges and solutions

    CERN Document Server

    Barreteau, Olivier

    2014-01-01

    This instructional book showcases techniques to parameterise human agents in empirical agent-based models (ABM). In doing so, it provides a timely overview of key ABM methodologies and the most innovative approaches through a variety of empirical applications.  It features cutting-edge research from leading academics and practitioners, and will provide a guide for characterising and parameterising human agents in empirical ABM.  In order to facilitate learning, this text shares the valuable experiences of other modellers in particular modelling situations. Very little has been published in the area of empirical ABM, and this contributed volume will appeal to graduate-level students and researchers studying simulation modeling in economics, sociology, ecology, and trans-disciplinary studies, such as topics related to sustainability. In a similar vein to the instruction found in a cookbook, this text provides the empirical modeller with a set of 'recipes'  ready to be implemented. Agent-based modeling (AB...
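
    As a generic illustration of the kind of empirically parameterised agent the volume discusses, the toy, dependency-free Python sketch below draws agent attributes from an invented distribution standing in for survey-derived estimates; it is not taken from the book.

      # Toy agent-based model with empirically motivated parameters (all numbers invented).
      import random

      class Household:
          def __init__(self, rng, adoption_propensity):
              self.rng = rng
              self.adoption_propensity = adoption_propensity   # e.g. estimated from survey data
              self.adopted = False

          def step(self, neighbour_share):
              # adopt with a probability that grows with peers' adoption (simple imitation rule)
              if not self.adopted:
                  p = self.adoption_propensity * (0.2 + 0.8 * neighbour_share)
                  self.adopted = self.rng.random() < p

      def run(n_agents=100, steps=20, seed=1):
          rng = random.Random(seed)
          # propensities drawn from a skewed distribution standing in for survey estimates
          agents = [Household(rng, rng.betavariate(2, 5)) for _ in range(n_agents)]
          history = []
          for _ in range(steps):
              share = sum(a.adopted for a in agents) / n_agents
              for a in agents:
                  a.step(share)
              history.append(sum(a.adopted for a in agents) / n_agents)
          return history

      if __name__ == "__main__":
          print(run())   # adoption share per step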

  9. Managing and delivering of 3D geo data across institutions has a web based solution - intermediate results of the project GeoMol.

    Science.gov (United States)

    Gietzel, Jan; Schaeben, Helmut; Gabriel, Paul

    2014-05-01

    The increasing relevance of geological information for policy and economy at the transnational level has recently been recognized by the European Commission, which has called for harmonized information related to reserves and resources in the EU Member States. GeoMol's transnational approach responds to that, providing consistent and seamless 3D geological information on the Alpine Foreland Basins based on harmonized data and agreed methodologies. However, until recently no adequate tool existed to ensure full interoperability among the involved GSOs and to distribute the multi-dimensional information of a transnational project facing diverse data policies, database systems and software solutions. In recent years, (open) standards describing 2D spatial data have been developed and implemented in different software systems, including production environments for 2D spatial data (such as regular 2D GI systems). Easy yet secure access to the data is of utmost importance and thus a priority for any spatial data infrastructure. To overcome the limitations imposed by highly sophisticated and platform-dependent geo-modelling software packages, the functionalities of a web portal can be utilized. Combining a web portal with a "check-in-check-out" system allows distributed, organized editing of data and models, but requires standards for the exchange of 3D geological information to ensure interoperability. Another major concern is the management of large models and the ability to tile 3D models into spatially restricted models with refined resolution, especially when creating countrywide models. Using GST ("Geosciences in Space and Time"), initially developed at TU Bergakademie Freiberg and continuously extended by the company GiGa infosystems, which incorporates these key issues and is based on an object-relational data model, it is possible to check out parts of models or whole models for editing and check them in again after modification. GST is the core of GeoMol's web-based collaborative environment designed to

  10. Metadata behind the interoperability of wireless sensor networks

    NARCIS (Netherlands)

    Ballari, D.E.; Wachowicz, M.; Manso-Callejo, M.A.

    2009-01-01

    Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to

  11. Interoperable transactions in business models: A structured approach

    NARCIS (Netherlands)

    Weigand, H.; Verharen, E.; Dignum, F.P.M.

    1996-01-01

    Recent database research has given much attention to the specification of "flexible" transactions that can be used in interoperable systems. Starting from a quite different angle, Business Process Modelling has approached the area of communication modelling as well (the Language/Action

  12. Proposed Specifications for International Interoperability on Repaired Bomb Damaged Runways

    Science.gov (United States)

    1981-01-01

    ESL-TR-81-03, Proposed Specifications for International Interoperability on Repaired Bomb Damaged Runways. Caldwell, Lapsley R., Lt Col, USAF; Gerardi, Anthony G. In-house report.

  13. Pragmatic Interoperability: A Systematic Review of Published Definitions

    NARCIS (Netherlands)

    Asuncion, C.H.; van Sinderen, Marten J.; Bernus, Peter; Doumeingts, Guy; Fox, Mark

    2010-01-01

    Enabling the interoperability between applications requires agreement in the format and meaning (syntax and semantics) of exchanged data including the ordering of message exchanges. However, today’s researchers argue that these are not enough to achieve a complete, effective and meaningful

  14. Managing Uncertainty: The Road Towards Better Data Interoperability

    NARCIS (Netherlands)

    Herschel, M.; van Keulen, Maurice

    Data interoperability encompasses the many data management activities needed for effective information management in anyone's or any organization's everyday work such as data cleaning, coupling, fusion, mapping, and information extraction. It is our conviction that a significant amount of money and

  15. Waveform Diversity and Design for Interoperating Radar Systems

    Science.gov (United States)

    2013-01-01

    University of Pisa, Dipartimento di Ingegneria dell'Informazione: Elettronica, Informatica, Telecomunicazioni, Via Girolamo Caruso 16, 56122 Pisa, Italy. Waveform Diversity and Design for Interoperating Radar Systems.

  16. Towards Cross-Organizational Innovative Business Process Interoperability Services

    Science.gov (United States)

    Karacan, Ömer; Del Grosso, Enrico; Carrez, Cyril; Taglino, Francesco

    This paper presents the vision and initial results of the COIN (FP7-IST-216256) European project for the development of open source Collaborative Business Process Interoperability (CBPip) in cross-organisational business collaboration environments following the Software-as-a-Service Utility (SaaS-U) paradigm.

  17. Design of large-scale enterprise interoperable value webs

    NARCIS (Netherlands)

    Hofman, W.J.

    2011-01-01

    Still a lot of enterprises are faced with the issue of interoperability. Whereas large enterprises are able to implement the required technology, SMEs (Small and Medium sized Enterprises) face challenges as they lack knowledge and budget. Enterprises have defined their specific semantics and

  18. The MADE reference information model for interoperable pervasive telemedicine systems

    NARCIS (Netherlands)

    Fung, L.S.N.; Jones, Valerie M.; Hermens, Hermanus J.

    2017-01-01

    Objectives: The main objective is to develop and validate a reference information model (RIM) to support semantic interoperability of pervasive telemedicine systems. The RIM is one component within a larger, computer-interpretable "MADE language" developed by the authors in the context of the

  19. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION..., industry representatives, and service providers. [75 FR 28207, May 20, 2010] ...

  20. Information and documentation - Thesauri and interoperability with other vocabularies

    DEFF Research Database (Denmark)

    Lykke, Marianne; Dalbin, Sylvie; Smedt, Johan De

    ISO 25964-2:2013 is applicable to thesauri and other types of vocabulary that are commonly used for information retrieval. It describes, compares and contrasts the elements and features of these vocabularies that are implicated when interoperability is needed. It gives recommendations...... for the establishment and maintenance of mappings between multiple thesauri, or between thesauri and other types of vocabularies....

  1. Evaluation of a Web-based Online Grant Application Review Solution

    Directory of Open Access Journals (Sweden)

    Marius Daniel PETRISOR

    2013-12-01

    Full Text Available This paper focuses on the evaluation of a web-based application used in grant application reviews, software developed at our university, and underlines the need for simple solutions, based on recent technology, specifically tailored to one's needs. We asked the reviewers to answer a short questionnaire in order to assess their satisfaction with such a web-based grant application evaluation solution. All 20 reviewers agreed to answer the questionnaire, which contained 8 closed items (YES/NO answers) related to the reviewer's previous experience in evaluating grant applications, previous use of such software solutions, and familiarity with using computer systems. The presented web-based application, evaluated by the users, showed a high level of acceptance, and the respondents stated that they are willing to use such a solution in the future.

  2. Methods of noninvasive electrophysiological heart examination basing on solution of inverse problem of electrocardiography

    Science.gov (United States)

    Grigoriev, M.; Babich, L.

    2015-09-01

    The article reviews the main noninvasive methods for examining the electrical activity of the heart, the theoretical basis for the solution of the inverse problem of electrocardiography, the application of the different examination methods in clinical practice, and summarizes worldwide achievements in this area.
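
    For reference, the inverse problem mentioned above is commonly written as the linear forward model below with a Tikhonov-regularised solution; this is a textbook formulation in LaTeX notation, not the specific method of the article. Here A is the torso transfer matrix, \Phi_B the measured body-surface potentials, \Phi_H the sought cardiac (epicardial) potentials, L a regularisation operator (the identity in zero-order Tikhonov) and \lambda the regularisation parameter.

      \Phi_B = A\,\Phi_H + n, \qquad
      \hat{\Phi}_H = \arg\min_{\Phi_H}\; \lVert A\Phi_H - \Phi_B \rVert_2^2
                     + \lambda\,\lVert L\Phi_H \rVert_2^2
                   = \left(A^{\mathsf T}A + \lambda L^{\mathsf T}L\right)^{-1} A^{\mathsf T}\Phi_B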

  3. A web-based solution for 3D medical image visualization

    Science.gov (United States)

    Hou, Xiaoshuai; Sun, Jianyong; Zhang, Jianguo

    2015-03-01

    In this presentation, we present a web-based 3D medical image visualization solution which enables interactive large medical image data processing and visualization over the web platform. To improve the efficiency of our solution, we adopt GPU accelerated techniques to process images on the server side while rapidly transferring images to the HTML5 supported web browser on the client side. Compared to traditional local visualization solutions, our solution does not require users to install extra software or to download the whole volume dataset from the PACS server. By designing this web-based solution, it is feasible for users to access the 3D medical image visualization service wherever the internet is available.

  4. Thermodynamics of hydrogen bonding and van der Waals interactions of organic solutes in solutions of imidazolium based ionic liquids: “Structure-property” relationships

    International Nuclear Information System (INIS)

    Varfolomeev, Mikhail A.; Khachatrian, Artashes A.; Akhmadeev, Bulat S.; Solomonov, Boris N.

    2016-01-01

    Highlights: • Solution enthalpies of organic solutes in imidazolium based ionic liquids were measured. • van der Waals interactions scale of imidazolium based ionic liquids was proposed. • Enthalpies of solvation of organic solutes in ionic liquids were determined. • Hydrogen bond enthalpies of organic solutes with ionic liquids were calculated. • Relationships between structure of ionic liquids and thermochemical data were obtained. - Abstract: In the present work thermochemistry of intermolecular interactions of organic compounds in solutions of imidazolium based ionic liquids (ILs) has been studied using the solution calorimetry method. Enthalpies of solution at infinite dilution of non-polar (alkanes, aromatic hydrocarbons) and polar (alcohols, amides, etc.) organic solutes in two ionic liquids 1-butyl-3-methylimidazolium tetrafluoroborate and 1-butyl-3-methylimidazolium trifluoromethanesulfonate were measured at 298.15 K. The scale of van der Waals interactions of imidazolium based ILs has been proposed on the basis of solution enthalpies of n-alkanes in their media. The effect of the cation and anion structure of ILs on the enthalpies of solvation was analyzed. Enthalpies of hydrogen bonding of organic solutes with imidazolium based ILs were determined. It has been shown that these values are close to zero for proton acceptor solutes. At the same time, enthalpies of hydrogen bonding of proton donor solutes with ionic liquids increase depending on the anion: tetrafluoroborate ≈ bis(trifluoromethylsulfonyl)imide < 2-(2-methoxyethoxy)ethyl sulfate < trifluoromethanesulfonate. Enthalpies of van der Waals interactions and hydrogen bonding in the solutions of imidazolium based ionic liquids were compared with the same data for molecular solvents.
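
    For orientation, analyses of this kind typically derive solvation enthalpies from the measured solution enthalpies and then split off a non-specific (van der Waals) baseline estimated from the n-alkane scale; a hedged sketch of the standard relations in generic notation (not formulas quoted from the paper):

    ```latex
    % Solvation enthalpy of solute A in solvent S from the measured solution
    % enthalpy at infinite dilution and the solute's vaporization enthalpy:
    \Delta_{\mathrm{solv}}H^{A/S} = \Delta_{\mathrm{soln}}H^{A/S} - \Delta_{\mathrm{vap}}H^{A}
    % Hydrogen-bonding contribution estimated as the difference between the
    % total solvation enthalpy and a van der Waals baseline (e.g. from n-alkanes):
    \Delta_{\mathrm{HB}}H^{A/S} \approx \Delta_{\mathrm{solv}}H^{A/S} - \Delta_{\mathrm{vdW}}H^{A/S}
    ```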

  5. Non-Invasive Acoustic-Based Monitoring of Heavy Water and Uranium Process Solutions

    Energy Technology Data Exchange (ETDEWEB)

    Pantea, Cristian [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sinha, Dipen N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lakis, Rollin Evan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Beedle, Christopher Craig [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Davis, Eric Sean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-20

    This presentation includes slides on Project Goals; Heavy Water Production Monitoring: A New Challenge for the IAEA; Noninvasive Measurements in SFAI Cell; Large Scatter in Literature Values; Highest Precision Sound Speed Data Available: New Standard in H/D; ~400 pts of data; Noninvasive Measurements in SFAI Cell; New funding from NA241 SGTech; Uranium Solution Monitoring: Inspired by IAEA Challenge in Kazakhstan; Non-Invasive Acoustic-Based Monitoring of Uranium in Solutions; and finally a summary.

  6. Market based solutions for increased flexibility in electricity consumption

    International Nuclear Information System (INIS)

    Grande, Ove S.; Saele, Hanne

    2005-06-01

    The main focus of this paper is on manual and automatic demand response to prices in the day-ahead market. The content is mainly based on the results and experiences from the large-scale Norwegian test and research project End User flexibility by efficient use of ICT (2001-2004), involving 10,894 customers with automatic meter reading (AMR) and remote load control (RLC) options. The response to hourly spot price products and intraday time of use (ToU) tariffs was tested. The registered response ranges from 0.18 to 1 kWh/h on average per household customer for the different combinations of these price signals. The largest response was achieved for the customers with both the ToU network tariff and the hourly spot price. Some of the customers were offered remote-controlled automatic disconnection of water heaters in the high-price periods during weekdays. The test shows that the potential load reduction from water heaters can be estimated at 0.6 kWh/h in the peak hours on average. For Norway this indicates that a total of 600 MWh/h of automatic price elasticity could be achieved, provided that half of the 2 million Norwegian households accept RLC of their water heater referred to the spot price. The benefit of load shifting is limited for each customer, but of great value for the power system as a whole. The combination of an hourly spot price contract with an intraday ToU network tariff should therefore be considered, in order to provide stable economic incentives for load reduction. One potential drawback for customers with spot price energy contracts is the risk of high electricity prices in periods of lasting scarcity. Combination with financial power contracts as an insurance for the customer is an option that will be examined in a follow-up project.

  7. Promoting Savings at Tax Time through a Video-Based Solution-Focused Brief Coaching Intervention

    Directory of Open Access Journals (Sweden)

    Lance Palmer

    2016-09-01

    Full Text Available Solution-focused brief coaching, based on solution-focused brief therapy, is a well-established practice model and is used widely to help individuals progress toward desired outcomes in a variety of settings. This paper presents the findings of a pilot study that examined the impact of a video-based solution-focused brief coaching intervention delivered in conjunction with income tax preparation services at a Volunteer Income Tax Assistance location (n = 212). Individuals receiving tax preparation assistance were randomly assigned to one of four treatment groups: (1) a control group; (2) video-based solution-focused brief coaching; (3) a discount card incentive; (4) both the video-based solution-focused brief coaching and the discount card incentive. Results of the study indicate that the video-based solution-focused brief coaching intervention increased both the frequency and amount of self-reported savings at tax time. Results also indicate that financial therapy-based interventions may be scalable through the use of technology.

  8. Solution stability of Captisol-stabilized melphalan (Evomela) versus Propylene glycol-based melphalan hydrochloride injection.

    Science.gov (United States)

    Singh, Ramsharan; Chen, Jin; Miller, Teresa; Bergren, Michael; Mallik, Rangan

    2016-12-14

    The objective of this study was to compare the stability of recently approved Captisol-stabilized propylene glycol-free melphalan injection (Evomela™) against currently marketed propylene glycol-based melphalan injection. The products were compared as reconstituted solutions in vials as well as admixture solutions prepared from normal saline in infusion bags. Evomela and propylene glycol-based melphalan injection were reconstituted in normal saline and organic custom diluent, respectively, according to their package insert instructions. The reconstituted solutions were diluted in normal saline to obtain drug admixture solutions at specific drug concentrations. Stability of the solutions was studied at room temperature by assay of melphalan and determination of melphalan-related impurities. Results show that based on the increase in total impurities in propylene glycol-based melphalan injection at 0.45 mg/mL, Evomela admixture solutions are about 5, 9, 15 and 29 times more stable at concentrations of 0.45, 1.0, 2.0 and 5.0 mg/mL, respectively. Results confirmed that reconstituted Evomela solution can be stored in the vial for up to 1 h at RT or for up to 24 h at refrigerated temperature (2-8 °C) with no significant degradation. After storage in the vial, it remains stable for an additional 3-29 h after preparation of admixture solution in infusion bags at concentrations of 0.25-5.0 mg/mL, respectively. In addition, Evomela solution in saline, at a melphalan concentration of 5.0 mg/mL, was bacteriostatic through 72 h of storage at 2-8 °C. Formulation of melphalan with Captisol technology significantly improved stability compared to melphalan hydrochloride reconstituted with propylene-glycol based diluents.

  9. The Use of Alkaliphilic Bacteria-based Repair Solution for Porous Network Concrete Healing Mechanism

    NARCIS (Netherlands)

    Sangadji, S.; Wiktor, V.A.C.; Jonkers, H.M.; Schlangen, H.E.J.G.

    2017-01-01

    Bacteria-induced calcium carbonate precipitation based on the metabolic conversion of nutrients has been acknowledged as having potential in self-healing cement-based materials. Recent studies have shown the development of a bacteria-based repair solution (liquid) for concrete surface repair. This

  10. Analysis of cloud-based solutions on EHRs systems in different scenarios.

    Science.gov (United States)

    Fernández-Cardeñosa, Gonzalo; de la Torre-Díez, Isabel; López-Coronado, Miguel; Rodrigues, Joel J P C

    2012-12-01

    Nowadays, with the growth of wireless connectivity, people can access the resources hosted in the Cloud almost everywhere. In this context, organizations can take advantage of this fact, in terms of e-Health, by deploying Cloud-based solutions for e-Health services. In this paper two Cloud-based solutions for different scenarios of Electronic Health Records (EHRs) management systems are proposed. We have researched articles published between the years 2005 and 2011 in Medline about the implementation of e-Health services based on the Cloud. In order to analyze the best scenario for the deployment of Cloud Computing, two solutions, for a large Hospital and for a network of Primary Care Health centers, have been studied. An economic estimation of the implementation cost for both scenarios has been made with the Amazon calculator tool. As a result of this analysis two solutions are suggested depending on the scenario: for a large Hospital, a typical Cloud solution in which only the needed services are hired is assumed; on the other hand, for several Primary Care Centers, the implementation of a network which interconnects these centers with just one Cloud environment is suggested. Finally, the deployment of a hybrid solution is considered, in which EHRs with images are hosted in the Hospital or Primary Care Centers and the rest are migrated to the Cloud.

  11. Two innovative solutions based on fibre concrete blocks designed for building substructure

    Science.gov (United States)

    Pazderka, J.; Hájek, P.

    2017-09-01

    The use of fibres in high-strength concrete allows a reduction in the dimensions of small precast concrete elements, which opens up new solutions for traditional construction details in buildings. The paper presents two innovative technical solutions for the building substructure: a specially shaped plinth block made from fibre concrete, and fibre concrete elements for a new technical solution for a ventilated floor. The main advantages of the fibre concrete plinth block (compared with standard plinth solutions) are easier and faster assembly, higher durability, a reduced moisture level in the structures under the waterproofing layer thanks to the air cavity between the vertical part of the block and the building substructure, and a comprehensive solution for the final surface of the building plinth as well as the surface of the adjacent terrain. The ventilated floor based on fibre concrete precast blocks is an attractive structural alternative for tackling the problem of increased moisture in masonry in older buildings lacking a functional waterproof layer in the substructure.

  12. A web-based rapid assessment tool for production publishing solutions

    Science.gov (United States)

    Sun, Tong

    2010-02-01

    Solution assessment is a critical first step in understanding and measuring the business process efficiency enabled by an integrated solution package. However, assessing the effectiveness of any solution is usually a very expensive and time-consuming task which involves a lot of domain knowledge, collecting and understanding the specific customer operational context, defining validation scenarios and estimating the expected performance and operational cost. This paper presents an intelligent web-based tool that can rapidly assess any given solution package for production publishing workflows via a simulation engine and create a report for various estimated performance metrics (e.g. throughput, turnaround time, resource utilization) and operational cost. By integrating the digital publishing workflow ontology and an activity-based costing model with a Petri-net based workflow simulation engine, this web-based tool allows users to quickly evaluate potential digital publishing solutions side-by-side within their desired operational contexts, and provides a low-cost and rapid assessment for organizations before committing to any purchase. This tool also helps solution providers to shorten sales cycles, establish trustworthy customer relationships and supplement professional assessment services with a proven quantitative simulation and estimation technology.
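
    As a toy illustration of the activity-based costing idea mentioned in the abstract (not the tool's actual engine; all activity names, durations and cost rates below are hypothetical):

    ```python
    # Minimal activity-based costing sketch for a publishing workflow.
    # All activities, durations and cost rates are hypothetical illustration values.
    from dataclasses import dataclass

    @dataclass
    class Activity:
        name: str
        hours_per_job: float   # e.g. estimated by a workflow simulation
        cost_rate: float       # cost per hour of the resource performing it

    def job_cost(activities):
        """Total cost of one job: sum of activity time multiplied by resource rate."""
        return sum(a.hours_per_job * a.cost_rate for a in activities)

    workflow = [
        Activity("prepress", 0.5, 40.0),
        Activity("digital_print", 1.2, 55.0),
        Activity("finishing", 0.8, 30.0),
    ]
    print(f"estimated cost per job: {job_cost(workflow):.2f}")
    ```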

  13. Solid solution strengthening and diffusion in nickel- and cobalt-based superalloys

    Energy Technology Data Exchange (ETDEWEB)

    Rehman, Hamad ur

    2016-07-01

    Nickel and cobalt-based superalloys with a γ-γ{sup '} microstructure are known for their excellent creep resistance at high temperatures. Their microstructure is engineered using different alloying elements that partition either to the fcc γ matrix or to the ordered γ{sup '} phase. In the present work the effect of alloying elements on their segregation behaviour in nickel-based superalloys, diffusion in cobalt-based superalloys and the temperature-dependent solid solution strengthening in nickel-based alloys is investigated. The effect of dendritic segregation on the local mechanical properties of individual phases in the as-cast, heat treated and creep deformed state of a nickel-based superalloy is investigated. The local chemical composition is characterized using Electron Probe Micro Analysis and then correlated with the mechanical properties of individual phases using nanoindentation. Furthermore, the temperature-dependent solid solution hardening contribution of Ta, W and Re towards fcc nickel is studied. The room temperature hardening is determined by a diffusion couple approach using nanoindentation and energy dispersive X-ray analysis for relating hardness to the chemical composition. The high temperature properties are determined using compression strain rate jump tests. The results show that at lower temperatures the solute size effect is prevalent, and the elements with the largest size difference with nickel induce the greatest hardening, consistent with classical solid solution strengthening theory. At higher temperatures, the solutes interact with the dislocations such that the slowest diffusing solute poses maximal resistance to dislocation glide and climb. Lastly, the diffusion of different technically relevant solutes in fcc cobalt is investigated using diffusion couples. The results show that the large atoms diffuse faster in cobalt-based superalloys, similarly to their nickel-based counterparts.

  14. Solid solution strengthening and diffusion in nickel- and cobalt-based superalloys

    International Nuclear Information System (INIS)

    Rehman, Hamad ur

    2016-01-01

    Nickel and cobalt-based superalloys with a γ-γ ' microstructure are known for their excellent creep resistance at high temperatures. Their microstructure is engineered using different alloying elements that partition either to the fcc γ matrix or to the ordered γ ' phase. In the present work the effect of alloying elements on their segregation behaviour in nickel-based superalloys, diffusion in cobalt-based superalloys and the temperature-dependent solid solution strengthening in nickel-based alloys is investigated. The effect of dendritic segregation on the local mechanical properties of individual phases in the as-cast, heat treated and creep deformed state of a nickel-based superalloy is investigated. The local chemical composition is characterized using Electron Probe Micro Analysis and then correlated with the mechanical properties of individual phases using nanoindentation. Furthermore, the temperature-dependent solid solution hardening contribution of Ta, W and Re towards fcc nickel is studied. The room temperature hardening is determined by a diffusion couple approach using nanoindentation and energy dispersive X-ray analysis for relating hardness to the chemical composition. The high temperature properties are determined using compression strain rate jump tests. The results show that at lower temperatures the solute size effect is prevalent, and the elements with the largest size difference with nickel induce the greatest hardening, consistent with classical solid solution strengthening theory. At higher temperatures, the solutes interact with the dislocations such that the slowest diffusing solute poses maximal resistance to dislocation glide and climb. Lastly, the diffusion of different technically relevant solutes in fcc cobalt is investigated using diffusion couples. The results show that the large atoms diffuse faster in cobalt-based superalloys, similarly to their nickel-based counterparts.

  15. The superior effect of nature based solutions in land management for enhancing ecosystem services.

    Science.gov (United States)

    Keesstra, Saskia; Nunes, Joao; Novara, Agata; Finger, David; Avelar, David; Kalantari, Zahra; Cerdà, Artemi

    2018-01-01

    The rehabilitation and restoration of land is a key strategy to recover services -goods and resources- ecosystems offer to humankind. This paper reviews key examples to understand the superior effect of nature based solutions to enhance the sustainability of catchment systems by promoting desirable soil and landscape functions. The use of concepts such as connectivity and the theory of system thinking framework allowed us to review coastal and river management as a guide to evaluate other strategies to achieve sustainability. In land management, NBSs are not mainstream practice. Through a set of case studies: organic farming in Spain; rewilding in Slovenia; land restoration in Iceland, sediment trapping in Ethiopia and wetland construction in Sweden, we show the potential of Nature based solutions (NBSs) as a cost-effective long-term solution for hydrological risks and land degradation. NBSs can be divided into two main groups of strategies: soil solutions and landscape solutions. Soil solutions aim to enhance the soil health and soil functions through which local eco-system services will be maintained or restored. Landscape solutions mainly focus on the concept of connectivity. By making the landscape less connected, less rainfall is transformed into runoff, thereby reducing flood risk, increasing soil moisture, and reducing droughts and soil erosion; in this way sustainability can be achieved. The enhanced eco-system services directly feed into the realization of the Sustainable Development Goals of the United Nations. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. A solution approach to the ROADEF/EURO 2010 challenge based on Benders' Decomposition

    DEFF Research Database (Denmark)

    Lusby, Richard Martin; Muller, Laurent Flindt; Petersen, Bjørn

    is therefore divided into two stages: in the first stage Benders feasibility and optimality cuts are added based on the linear programming relaxation of the Benders Master problem, and in the second stage feasible integer solutions are enumerated and a procedure is applied to each solution in an attempt to make...... into the problem and additionally makes it possible to find lower bounds on the problem, which is typically not the case for the competing heuristics....

  17. An approach for the semantic interoperability of ISO EN 13606 and OpenEHR archetypes.

    Science.gov (United States)

    Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2010-10-01

    The communication between health information systems of hospitals and primary care organizations is currently an important challenge to improve the quality of clinical practice and patient safety. However, clinical information is usually distributed among several independent systems that may be syntactically or semantically incompatible. This fact prevents healthcare professionals from accessing clinical information of patients in an understandable and normalized way. In this work, we address the semantic interoperability of two EHR standards: OpenEHR and ISO EN 13606. Both standards follow the dual model approach which distinguishes information and knowledge, this being represented through archetypes. The solution presented here is capable of transforming OpenEHR archetypes into ISO EN 13606 and vice versa by combining Semantic Web and Model-driven Engineering technologies. The resulting software implementation has been tested using publicly available collections of archetypes for both standards.

  18. The caCORE Software Development Kit: Streamlining construction of interoperable biomedical information services

    Directory of Open Access Journals (Sweden)

    Warzel Denise

    2006-01-01

    Full Text Available Abstract Background Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs). The National Cancer Institute (NCI) developed the cancer common ontologic representation environment (caCORE) to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK) was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. Results The caCORE SDK requires a Unified Modeling Language (UML) tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR) using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG) program, to create compatible data services. caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has

  19. Facilitating Semantic Interoperability Among Ocean Data Systems: ODIP-R2R Student Outcomes

    Science.gov (United States)

    Stocks, K. I.; Chen, Y.; Shepherd, A.; Chandler, C. L.; Dockery, N.; Elya, J. L.; Smith, S. R.; Ferreira, R.; Fu, L.; Arko, R. A.

    2014-12-01

    With informatics providing an increasingly important set of tools for geoscientists, it is critical to train the next generation of scientists in information and data techniques. The NSF-supported Rolling Deck to Repository (R2R) Program works with the academic fleet community to routinely document, assess, and preserve the underway sensor data from U.S. research vessels. The Ocean Data Interoperability Platform (ODIP) is an EU-US-Australian collaboration fostering interoperability among regional e-infrastructures through workshops and joint prototype development. The need to align terminology between systems is a common challenge across all of the ODIP prototypes. Five R2R students were supported to address aspects of semantic interoperability within ODIP: (1) developing a vocabulary matching service that links terms from different vocabularies with similar concepts; the service implements the Google Refine reconciliation service interface so that users can leverage the Google Refine application as a friendly user interface while linking different vocabulary terms; (2) developing Resource Description Framework (RDF) resources that map Shipboard Automated Meteorological Oceanographic System (SAMOS) vocabularies to internationally served vocabularies; each SAMOS vocabulary term (data parameter and quality control flag) will be described as an RDF resource page, and these RDF resources allow for enhanced discoverability and retrieval of SAMOS data by enabling data searches based on parameter; (3) improving data retrieval and interoperability by exposing data and mapped vocabularies using Semantic Web technologies; we have collaborated with ODIP participating organizations in order to build a generalized data model that will be used to populate a SPARQL endpoint in order to provide expressive querying over our data files; (4) mapping local and regional vocabularies used by R2R to those used by ODIP partners; this work is described more fully in a companion poster; and (5) making published Linked Data
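
    As a rough illustration of the vocabulary matching idea in item (1) above (not the actual R2R/ODIP service, whose reconciliation API is not reproduced here; all terms and URIs below are invented):

    ```python
    # Toy vocabulary matcher: rank candidate terms from a target vocabulary by
    # string similarity to a local term.  Illustrative only -- the actual service
    # reconciles against externally published vocabularies.
    from difflib import SequenceMatcher

    TARGET_VOCABULARY = {                      # hypothetical target terms/URIs
        "sea_water_temperature": "http://example.org/vocab/TEMP",
        "sea_water_salinity": "http://example.org/vocab/PSAL",
        "wind_speed": "http://example.org/vocab/WSPD",
    }

    def match(local_term, vocabulary, threshold=0.5):
        """Return (term, uri, score) candidates whose similarity exceeds threshold."""
        scored = [
            (term, uri, SequenceMatcher(None, local_term.lower(), term).ratio())
            for term, uri in vocabulary.items()
        ]
        return sorted((s for s in scored if s[2] >= threshold), key=lambda s: -s[2])

    print(match("water temperature", TARGET_VOCABULARY))
    ```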

  20. Smart home interoperability: the DomoEsi project approach

    OpenAIRE

    Maestre Torreblanca, José María; Camacho, Eduardo F.

    2009-01-01

    The home automation market is characterized by the great number of systems available to the end user. The recent bubble in the building industry made the situation even worse due to the birth of new proprietary systems. The success of the digital home concept depends on the ease of integration between home automation systems and other consumer electronic equipment pre-existing in the home. In this paper the interoperability issue is addressed and the approach followed in the pr...

  1. Improving interoperability by encouraging the sharing of interface specifications

    OpenAIRE

    Weston, Sally

    2017-01-01

    3D CAD software is vital to record design information. The industry is oligopolistic and despite standards has all the elements associated with a lack of interoperability, namely proprietary software, network effects and lock-in. Interfaces are similar to standards and their indirect effect amplifies their impact and value and distorts the intended intellectual property protection. The distributed machine code is not readable and the restrictions on reverse engineering are tantamount to makin...

  2. Medical Device Plug-and-Play Interoperability Standards & Technology Leadership

    Science.gov (United States)

    2011-10-01

    records and will introduce error resistance into networked medical device systems. We are producing a standardization framework consisting of a... We have also begun collecting data on the issue related to device clock time errors and erroneous data time-stamps in preparation for a White House... advances in mind. We also recognize that, as in all technological advances, interoperability poses safety and medico-legal challenges as well. The

  3. Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101

    Science.gov (United States)

    2012-07-01

    Management • Suspend/Resume Waypoint Follow • Leader/Follower Mode & Attributes • Execution Leader/Follower Operation • Following Status • Suspend... V0: Leader Management • Leader/Follower Driver • Communicator (i.e. radio messages) • Platform Mode • Health Monitor • Health Reporter... Partial Interoperable Robot • Attributes selected: use the JAUS messages for 2 payload ports; use the “B” style connectors at 12 VDC; allow the use

  4. Data and Semantic Interoperability for the Oceans Sensor Web

    Science.gov (United States)

    Bermudez, L. E.; Bogden, P.; Bridger, E.; Conover, H.; Creager, G.; Forrest, D.; Gale, T.; Graybeal, J.; Howe, B.; Maskey, M.

    2007-12-01

    Ocean observing systems incorporate a spectrum of sensors and data. Making the data available to any interested scientist is important: data sharing and experimental reproducibility are hallmarks of the scientific process. However, different groups may represent, transport, store and distribute their data in different ways, leading to difficulties in sharing these data. OOSTethys, an open-source community effort with involvement from six regional associations and two major research institutes, is exploring the best mechanisms to make ocean data and metadata interoperable by advancing and influencing standards from the Open Geospatial Consortium (OGC), World Wide Web Consortium (W3C) and OASIS. Our strategy to address these challenges has been to envision a service oriented architecture (SOA) which comprises data providers, registries, semantic mediators, aggregators and visualizers. For each component of the system we select the most appropriate standard(s) and create cookbooks and tools to support its implementation. This improves accessibility for data providers with limited time and limited budgets for information technology projects. For example, we have created cookbooks and toolkits in Perl, Java and Python to facilitate implementation of OGC Sensor Observation Services (SOS). The implementation includes publishing metadata in SensorML, and making data available via Geographic Markup Language (GML) records conforming to the Observation and Measurements specifications. A semantic mediator implemented as a web service uses Semantic Web technologies to solve semantic incompatibilities, and enables proper categorization of the different services. Our initial results are positive: in addition to several national demonstrations of data interoperability, the cookbooks have been used to bring more than 60 oceanographic platforms online, and we have at least 7 data consumers relying on web services for their own oceanographic applications. OOSTethys work is an
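
    For orientation, an OGC SOS endpoint of the kind described above is typically queried with simple key-value-pair HTTP requests; a minimal sketch follows (the endpoint URL is a placeholder, not one of the OOSTethys services):

    ```python
    # Minimal OGC SOS GetCapabilities request using the standard KVP binding.
    # The endpoint URL below is a placeholder, not an actual OOSTethys service.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    ENDPOINT = "http://example.org/sos"   # hypothetical SOS endpoint

    params = {
        "service": "SOS",
        "request": "GetCapabilities",
    }
    with urlopen(f"{ENDPOINT}?{urlencode(params)}") as response:
        capabilities_xml = response.read()   # XML listing offerings, sensors, operations

    print(capabilities_xml[:200])
    ```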

  5. Enabling Medical Device Interoperability for the Integrated Clinical Environment

    Science.gov (United States)

    2016-12-01

    “...Integration” presentation at the Society of Critical Care Medicine Annual Congress, San Francisco, CA, January 21-22, 2014; chaired meetings for US TAG ISO TC 121 on... Award Number: W81XWH-12-C-0154. Title: “Enabling Medical Device Interoperability for the Integrated Clinical Environment”. Principal Investigator: Julian M. Goldman, MD. Contracting Organization: Massachusetts General Hospital, Boston, MA 02114. Report Date: December 2016. Type of Report: Final.

  6. Service interoperability through advanced media gateways

    CERN Document Server

    van der Meer, S

    2000-01-01

    The convergence of telecommunications systems and the Internet gives rise to a variety of concepts for service integration. The focus of the recent research studies and the work of several standardization bodies lies mostly on the interworking of services and the universal service access from end-user systems including both fixed and wireless terminals. All approaches are driven by the concept of providing several technologies to users by keeping the peculiarity of each service alive. However, developments should not only concentrate on media adaptation between VoIP and PSTN, but also consider the adaptation among completely different types of applications such as, for example, E-mail, facsimile, or voice. Unified messaging, which is an already accepted service on the market, provides solutions for conversions of different application protocols into each other. The functionality of converting one medium into another is implemented here in so-called media gateways. This paper provides an overview of the current developments in...

  7. Arc-An OAI Service Provider for Digital Library Federation; Kepler-An OAI Data/Service Provider for the Individual; Information Objects and Rights Management: A Mediation-Based Approach to DRM Interoperability; Automated Name Authority Control and Enhanced Searching in the Levy Collection; Renardus Project Developments and the Wider Digital Library Context.

    Science.gov (United States)

    Liu, Xiaoming; Maly, Kurt; Zubair, Mohammad; Nelson, Michael L.; Erickson, John S.; DiLauro, Tim; Choudhury, G. Sayeed; Patton, Mark; Warner, James W.; Brown, Elizabeth W.; Heery, Rachel; Carpenter, Leona; Day, Michael

    2001-01-01

    Includes five articles that discuss the OAI (Open Archive Initiative), an interface between data providers and service providers; information objects and digital rights management interoperability; digitizing library collections, including automated name authority control, metadata, and text searching engines; and building digital library services…

  8. The role of preservation solution on acid-base regulation during machine perfusion of kidneys.

    Science.gov (United States)

    Baicu, Simona C; Taylor, Michael J; Brockbank, Kelvin G M

    2006-01-01

    To meet the current clinical organ demand, efficient preservation methods and solutions are needed to increase the number of viable kidneys for transplantation. In the present study, the influence of perfusion solution buffering strength on renal pH dynamics and regulation mechanisms during kidney ex vivo preservation was determined. Porcine kidneys were hypothermically machine perfused for 72 h with either Unisol-UHK solution or Belzer Machine Perfusion (Belzer-MP) solution. Renal perfusate samples were periodically collected and biochemically analyzed. The UHK solution, a Hepes-based solution (35 mM), provided a more efficient control of renal pH that, in turn, resulted in minor changes in the perfusate pH relative to baseline, in response to tissue CO2 and HCO3- production. In the perfusate of the Belzer-MP kidney group, a wider range of pH values was recorded and a pronounced pH reduction was seen in response to significant rises in pCO2 and HCO3- concentrations. The Belzer-MP solution, containing phosphate (25 mM) as its main buffer, and only 10 mM Hepes, had a greater buffering requirement to attenuate larger pH changes.
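
    For reference, the buffering behaviour contrasted in this study is governed by the standard Henderson-Hasselbalch relation (a textbook expression, not a formula from the paper); a buffer controls pH most effectively when the solution pH lies near its pKa:

    ```latex
    \mathrm{pH} = \mathrm{p}K_a + \log_{10}\frac{[\mathrm{A}^-]}{[\mathrm{HA}]}
    ```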

  9. PREPARATION AND PROPERTIES OF THE COLLOIDAL SOLUTION BASED ON BIOGENIC METAL NANOPARTICLES

    Directory of Open Access Journals (Sweden)

    K. V. Liapina

    2014-12-01

    Full Text Available The aim of the work was to obtain a stable suspension based on biocompatible substances, using biogenic metal nanoparticles encapsulated in a NaCl salt matrix as a precursor. A water-soluble complex based on different amine derivatives with antiseptic properties was selected as the liquid for salt dissolution. The solution was subjected to dispersion using ultrasonication at elevated temperature. Dispersion is accompanied by removal of the salt shell with simultaneous formation of an organic shell on the surfaces of the metal nanoparticles that ensures their stabilization. Study of the suspension after soaking at room temperature for 100 days showed that its characteristics remain stable. A method for producing a stable colloidal solution based on nanoparticles of biogenic metals (Cu, Co, Fe, etc.) was developed. Metal nanopowder encapsulated in a salt shell was used as a precursor. It is shown that such colloidal solutions are characterized by narrow size dispersion, as well as stability to temperature impact and the time factor.

  10. Finite element limit analysis based plastic limit pressure solutions for cracked pipes

    International Nuclear Information System (INIS)

    Shim, Do Jun; Huh, Nam Su; Kim, Yun Jae; Kim, Young Jin

    2002-01-01

    Based on detailed FE limit analyses, the present paper provides tractable approximations for plastic limit pressure solutions for axial through-wall cracked pipe; axial (inner) surface cracked pipe; circumferential through-wall cracked pipe; and circumferential (inner) surface cracked pipe. Comparisons with existing analytical and empirical solutions show large discrepancies for circumferential short through-wall cracks and for surface cracks (both axial and circumferential). Being based on detailed 3-D FE limit analysis, the present solutions are believed to be the most accurate, and thus to be valuable information not only for plastic collapse analysis of pressurised piping but also for estimating non-linear fracture mechanics parameters based on the reference stress approach.
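
    For orientation, cracked-pipe limit pressures of this kind are commonly normalised by the limit pressure of the corresponding defect-free pipe; a frequently quoted closed-form reference (von Mises material, closed-end cylinder) is shown below as a hedged example, not one of the paper's solutions:

    ```latex
    P_{L,\mathrm{uncracked}} = \frac{2}{\sqrt{3}}\,\sigma_y \ln\frac{R_o}{R_i}
    \;\approx\; \frac{2}{\sqrt{3}}\,\frac{\sigma_y\,t}{R_m} \qquad (t \ll R_m)
    ```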

  11. Two-Level Evaluation on Sensor Interoperability of Features in Fingerprint Image Segmentation

    Directory of Open Access Journals (Sweden)

    Ya-Shuo Li

    2012-03-01

    Full Text Available Features used in fingerprint segmentation significantly affect the segmentation performance. Various features exhibit different discriminating abilities on fingerprint images derived from different sensors. One feature which has better discriminating ability on images derived from a certain sensor may not adapt to segment images derived from other sensors. This degrades the segmentation performance. This paper empirically analyzes the sensor interoperability problem of segmentation feature, which refers to the feature’s ability to adapt to the raw fingerprints captured by different sensors. To address this issue, this paper presents a two-level feature evaluation method, including the first level feature evaluation based on segmentation error rate and the second level feature evaluation based on decision tree. The proposed method is performed on a number of fingerprint databases which are obtained from various sensors. Experimental results show that the proposed method can effectively evaluate the sensor interoperability of features, and the features with good evaluation results acquire better segmentation accuracies of images originating from different sensors.
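
    A minimal sketch of the two evaluation levels described above, using synthetic data (illustrative only; the features, threshold and tree depth below are assumptions, not the paper's settings):

    ```python
    # (1) error rate of a single-feature segmentation, (2) a decision tree over
    # block-wise features (coherence, mean, variance).  Data are synthetic.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X = rng.random((1000, 3))                               # block-wise features
    y = (X[:, 0] + 0.1 * rng.standard_normal(1000) > 0.5).astype(int)  # 1 = foreground

    def error_rate(predicted, truth):
        """Fraction of blocks assigned to the wrong class (segmentation error rate)."""
        return float(np.mean(predicted != truth))

    # Level 1: evaluate a single feature with a fixed threshold.
    pred_level1 = (X[:, 0] > 0.5).astype(int)
    print("level-1 error rate:", error_rate(pred_level1, y))

    # Level 2: evaluate the feature set with a shallow decision tree.
    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    print("level-2 error rate:", error_rate(tree.predict(X), y))
    ```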

  12. Usability and Interoperability in Wireless Sensor Networks for Patient Telemonitoring in Chronic Disease Management.

    Science.gov (United States)

    Jiménez-Fernández, Silvia; de Toledo, Paula; del Pozo, Francisco

    2013-12-01

    This paper addresses two key technological barriers to the wider adoption of patient telemonitoring systems for chronic disease management, namely, usability and sensor device interoperability. As a great percentage of chronic patients are also elderly, the usability of the system has to be adapted to their needs. This paper identifies (from previous research) a set of design criteria to address these challenges, and describes the resulting system based on a wireless sensor network, including a node acting as a custom-made interface that follows the stated usability design criteria. This system has been tested with 22 users (mean age 65) and evaluated with a validated usability questionnaire. Results are good and improve on those of other systems based on TV or smartphone. Our results suggest that user interfaces alternative to TVs and smartphones could play an important role in the usability of sensor networks for patient monitoring. Regarding interoperability, only very recently has a standard been published (2010, the ISO/IEEE 11073 Personal Health Devices standard) that can support the needs of the limited computational power environments typical of patient monitoring sensor networks.

  13. Intelligent semantic interoperability: Integrating knowledge, terminology and information models to support stroke care.

    Science.gov (United States)

    Goossen, William T F

    2006-01-01

    Electronic patient record (EPR) systems for the continuity of care for stroke patients are under development. These systems are based on standards for clinical practice, vocabularies, and the HL7 information model. In order to achieve intelligent semantic interoperability, knowledge about evidence-based patient care, vocabulary and information models needs to be integrated. A format was developed in which the clinical knowledge, clinical terminology, and standard information models are integrated as a specification for the technical implementation of electronic health systems and electronic messages. This format is verified by clinicians and technicians. The document structure consists of meta-information such as version control and changes, the purpose of the clinical content, evidence from the literature, variables and values, terminology used, guidelines for application and interpretation, HL7 message models, coding, and technical data specifications. Further, XML message excerpts, archetypes and screen designs are developed from these documents to facilitate implementation. The combination of these aspects in one document creates valuable content for intelligent semantic interoperability by means of the development of messages and systems.

  14. Interoperable and accessible census and survey data from IPUMS.

    Science.gov (United States)

    Kugler, Tracy A; Fitch, Catherine A

    2018-02-27

    The first version of the Integrated Public Use Microdata Series (IPUMS) was released to users in 1993, and since that time IPUMS has come to stand for interoperable and accessible census and survey data. Initially created to harmonize U.S. census microdata over time, IPUMS now includes microdata from the U.S. and international censuses and from surveys on health, employment, and other topics. IPUMS also provides geo-spatial data, aggregate population data, and environmental data. IPUMS supports ten data products, each disseminating an integrated data collection with a set of tools that make complex data easy to find, access, and use. Key features are record-level integration to create interoperable datasets, user-friendly interfaces, and comprehensive metadata and documentation. The IPUMS philosophy aligns closely with the FAIR principles of findability, accessibility, interoperability, and re-usability. IPUMS data have catalyzed knowledge generation across a wide range of social science and other disciplines, as evidenced by the large volume of publications and other products created by the vast IPUMS user community.

  15. Improving interoperability through gateways and cots technologies

    CSIR Research Space (South Africa)

    Smith, C

    2013-01-01

    Full Text Available the differences between a traditional web approach and SDR-based networks. The typical command and control headquarters has evolved during the last few years, where the hardware-based terminals have been replaced by SDR terminals working together seamlessly...

  16. The Breather-Like and Rational Solutions for the Integrable Kadomtsev-Petviashvili-Based System

    Directory of Open Access Journals (Sweden)

    Chuanjian Wang

    2015-01-01

    Full Text Available The integrable Kadomtsev-Petviashvili-based system is studied. The breather-like (a pulsating mode) and rational solutions are presented by applying the Hirota bilinear method and Taylor series. The intricate structures of the rational solitary wave solution are discussed mathematically and graphically. The existence conditions of three different solitary wave solution structures for the short-wave field are given by the theory of extreme value analysis. By controlling the wave number of the background plane wave we may control the behavior of the rational solitary wave. However, the shape of the rational solitary wave solution for the real long-wave field is not affected as the wave number is varied.
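
    For readers unfamiliar with the Hirota bilinear method mentioned above, the bilinear derivative operator it relies on is conventionally defined as follows (a standard textbook definition, not a formula specific to this paper):

    ```latex
    D_x^{m} D_t^{n}\,(f\cdot g) =
    \left(\frac{\partial}{\partial x}-\frac{\partial}{\partial x'}\right)^{m}
    \left(\frac{\partial}{\partial t}-\frac{\partial}{\partial t'}\right)^{n}
    f(x,t)\,g(x',t')\Big|_{x'=x,\; t'=t}
    ```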

  17. Synthesis method based on solution regions for planar four bar straight line linkages

    International Nuclear Information System (INIS)

    Lai Rong, Yin; Cong, Mao; Jian you, Han; Tong, Yang; Juan, Huang

    2012-01-01

    An analytical method for synthesizing and selecting desired four-bar straight-line mechanisms based on solution regions is presented. Given two fixed pivots, the point position and the direction of the target straight line, an infinite number of mechanism solutions can be produced by employing this method, both in the general case and in all three special cases. By unifying the straight-line direction and the displacement from the given point to the instant center into the same form, with different angles as parameters, the infinite mechanism solutions can be expressed with different solution region charts. Mechanism property graphs have been computed to enable designers to obtain the relevant mechanism information more intuitively and to avoid aimlessness in selecting optimal mechanisms.

  18. The Train Driver Recovery Problem - a Set Partitioning Based Model and Solution Method

    DEFF Research Database (Denmark)

    Rezanova, Natalia Jurjevna; Ryan, David

    2010-01-01

    The need to recover a train driver schedule occurs during major disruptions in the daily railway operations. Based on data from the Danish passenger railway operator DSB S-tog A/S, a solution method to the train driver recovery problem (TDRP) is developed. The TDRP is formulated as a set...... partitioning problem. We define a disruption neighbourhood by identifying a small set of drivers and train tasks directly affected by the disruption. Based on the disruption neighbourhood, the TDRP model is formed and solved. If the TDRP solution provides a feasible recovery for the drivers within...
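
    For context, a generic set partitioning formulation of the kind referred to above can be written as follows (generic notation, not the paper's exact model; here I would index the train tasks and drivers in the disruption neighbourhood that must be covered exactly once, and J the candidate recovery duties):

    ```latex
    \min \sum_{j \in J} c_j x_j
    \quad \text{s.t.} \quad
    \sum_{j \in J} a_{ij}\, x_j = 1 \;\; \forall i \in I,
    \qquad x_j \in \{0,1\} \;\; \forall j \in J
    ```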

  19. A potential model for sodium chloride solutions based on the TIP4P/2005 water model

    Science.gov (United States)

    Benavides, A. L.; Portillo, M. A.; Chamorro, V. C.; Espinosa, J. R.; Abascal, J. L. F.; Vega, C.

    2017-09-01

    Despite considerable efforts over more than two decades, our knowledge of the interactions in electrolyte solutions is not yet satisfactory. Not even one of the simplest and most important aqueous solutions, NaCl(aq), escapes this assertion. A requisite for the development of a force field for any water solution is the availability of a good model for water. Despite the fact that TIP4P/2005 seems to fulfill the requirement, little work has been devoted to building a force field based on TIP4P/2005. In this work, we try to fill this gap for NaCl(aq). After unsuccessful attempts to produce accurate predictions for a wide range of properties using unity ionic charges, we decided to follow recent suggestions indicating that the charges should be scaled in the ionic solution. In this way, we have been able to develop a satisfactory non-polarizable force field for NaCl(aq). We evaluate a number of thermodynamic properties of the solution (equation of state, maximum in density, enthalpies of solution, activity coefficients, radial distribution functions, solubility, surface tension, diffusion coefficients, and viscosity). Overall the results for the solution are very good. An important achievement of our model is that it also accounts for the dynamical properties of the solution, a test for which the force fields so far proposed failed. The same is true for the solubility and for the maximum in density where the model describes the experimental results almost quantitatively. The price to pay is that the model is not so good at describing NaCl in the solid phase, although the results for several properties (density and melting temperature) are still acceptable. We conclude that the scaling of the charges improves the overall description of NaCl aqueous solutions when the polarization is not included.
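
    Non-polarizable force fields of this type typically combine Lennard-Jones and Coulomb terms, with the ionic charges multiplied by a scaling factor slightly below one; schematically (generic form, parameter values not reproduced from the paper):

    ```latex
    u_{ij}(r) = 4\varepsilon_{ij}\left[\left(\frac{\sigma_{ij}}{r}\right)^{12}
    - \left(\frac{\sigma_{ij}}{r}\right)^{6}\right]
    + \frac{(s\,q_i)(s\,q_j)}{4\pi\varepsilon_0\, r},
    \qquad 0 < s < 1
    ```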

  20. CityGML - Interoperable semantic 3D city models

    Science.gov (United States)

    Gröger, Gerhard; Plümer, Lutz

    2012-07-01

    CityGML is the international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. It defines the three-dimensional geometry, topology, semantics and appearance of the most relevant topographic objects in urban or regional contexts. These definitions are provided in different, well-defined Levels-of-Detail (multiresolution model). The focus of CityGML is on the semantical aspects of 3D city models, its structures, taxonomies and aggregations, allowing users to employ virtual 3D city models for advanced analysis and visualization tasks in a variety of application domains such as urban planning, indoor/outdoor pedestrian navigation, environmental simulations, cultural heritage, or facility management. This is in contrast to purely geometrical/graphical models such as KML, VRML, or X3D, which do not provide sufficient semantics. CityGML is based on the Geography Markup Language (GML), which provides a standardized geometry model. Due to this model and its well-defined semantics and structures, CityGML facilitates interoperable data exchange in the context of geo web services and spatial data infrastructures. Since its standardization in 2008, CityGML has become used on a worldwide scale: tools from notable companies in the geospatial field provide CityGML interfaces. Many applications and projects use this standard. CityGML is also having a strong impact on science: numerous approaches use CityGML, particularly its semantics, for disaster management, emergency responses, or energy-related applications as well as for visualizations, or they contribute to CityGML, improving its consistency and validity, or use CityGML, particularly its different Levels-of-Detail, as a source or target for generalizations. This paper gives an overview of CityGML, its underlying concepts, its Levels-of-Detail, how to extend it, its applications, its likely future development, and the role it plays in scientific research. Furthermore, its

  1. Category Theory Approach to Solution Searching Based on Photoexcitation Transfer Dynamics

    Directory of Open Access Journals (Sweden)

    Makoto Naruse

    2017-07-01

    Full Text Available Solution searching that accompanies combinatorial explosion is one of the most important issues in the age of artificial intelligence. Natural intelligence, which exploits natural processes for intelligent functions, is expected to help resolve or alleviate the difficulties of conventional computing paradigms and technologies. In fact, we have shown that a single-celled organism such as an amoeba can solve constraint satisfaction problems and related optimization problems as well as demonstrate experimental systems based on non-organic systems such as optical energy transfer involving near-field interactions. However, the fundamental mechanisms and limitations behind solution searching based on natural processes have not yet been understood. Herein, we present a theoretical background of solution searching based on optical excitation transfer from a category-theoretic standpoint. One important indication inspired by the category theory is that the satisfaction of short exact sequences is critical for an adequate computational operation that determines the flow of time for the system and is termed as “short-exact-sequence-based time.” In addition, the octahedral and braid structures known in triangulated categories provide a clear understanding of the underlying mechanisms, including a quantitative indication of the difficulties of obtaining solutions based on homology dimension. This study contributes to providing a fundamental background of natural intelligence.
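
    For reference, a short exact sequence of the kind invoked above is the standard homological-algebra notion (not a construction specific to this paper):

    ```latex
    0 \longrightarrow A \xrightarrow{\;f\;} B \xrightarrow{\;g\;} C \longrightarrow 0,
    \qquad f \text{ injective},\quad g \text{ surjective},\quad \operatorname{im} f = \ker g
    ```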

  2. Multi-Agent Decision Support Tool to Enable Interoperability among Heterogeneous Energy Systems

    Directory of Open Access Journals (Sweden)

    Brígida Teixeira

    2018-02-01

    Full Text Available Worldwide electricity markets are undergoing a major restructuring process. One of the main reasons for the ongoing changes is to enable the adaptation of current market models to the new paradigm that arises from the large-scale integration of distributed generation sources. In order to deal with the unpredictability caused by the intermittent nature of the distributed generation and the large number of variables that contribute to the energy sector balance, it is extremely important to use simulation systems that are capable of dealing with the required complexity. This paper presents the Tools Control Center (TOOCC), a framework that allows interoperability between heterogeneous energy and power simulation systems through the use of ontologies, allowing the simulation of scenarios with a high degree of complexity, through the cooperation of the individual capacities of each system. A case study based on real data is presented in order to demonstrate the interoperability capabilities of TOOCC. The simulation considers the energy management of a microgrid of a real university campus, from the perspective of the network manager and also of its consumers/producers, in a projection for a typical day of the winter of 2050.

  3. Development of high performance scientific components for interoperability of computing packages

    Energy Technology Data Exchange (ETDEWEB)

    Gulabani, Teena Pratap [Iowa State Univ., Ames, IA (United States)

    2008-01-01

    Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. A chemistry algorithm is hard and time-consuming to develop; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinventing the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  4. An Interoperability Platform Enabling Reuse of Electronic Health Records for Signal Verification Studies

    Directory of Open Access Journals (Sweden)

    Mustafa Yuksel

    2016-01-01

    Full Text Available Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by the low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to the original EHRs. We have developed an ontological framework where EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning on formal representations of the different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region, containing about 1 billion records from 16 million patients, and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods lacking the background information.

  5. System and methods of resource usage using an interoperable management framework

    Science.gov (United States)

    Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.

    2017-10-31

    A generic rights expression language allows interoperability across different computing environments, including resource usage by different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework, such as the operational semantics, are standardized, while other areas are left free of standards to allow choice and innovation, achieving a balance of flexibility and usability for interoperability in usage management systems.

  6. Ambipolar organic field-effect transistors based on a solution-processed methanofullerene

    NARCIS (Netherlands)

    Anthopoulos, Thomas D.; Tanase, Cristina; Setayesh, Sepas; Meijer, Eduard J.; Hummelen, Jan C.; Blom, Paul W.M.; de Leeuw, Dagobert

    2004-01-01

    Organic field-effect transistors (OFETs, see Figure), based on the solution-processible methanofullerene [6,6]-phenyl-C-61-butyric acid methyl ester (PCBM), have been fabricated in a bottom-contact device configuration using gold electrodes. The OFET functions either as a p- or n-channel device,

  7. An All-Solution-Based Hybrid CMOS-Like Quantum Dot/Carbon Nanotube Inverter

    NARCIS (Netherlands)

    Shulga, Artem G.; Derenskyi, Vladimir; Salazar-Rios, Jorge Mario; Dirin, Dmitry N.; Fritsch, Martin; Kovalenko, Maksym V.; Scherf, Ullrich; Loi, Maria A.

    2017-01-01

    The development of low-cost, flexible electronic devices is subordinated to the advancement in solution-based and low-temperature-processable semiconducting materials, such as colloidal quantum dots (QDs) and single-walled carbon nanotubes (SWCNTs). Here, excellent compatibility of QDs and SWCNTs as

  8. The superior effect of nature based solutions in land management for enhancing ecosystem services

    NARCIS (Netherlands)

    Keesstra, Saskia; Keesstra, Saskia; Nunes, Joao P.; Novara, Agata; Finger, David; Avelar, David; Kalantari, Zahra; Cerdà, Artemi

    2018-01-01

    The rehabilitation and restoration of land is a key strategy to recover the services - goods and resources - that ecosystems offer to humankind. This paper reviews key examples to understand the superior effect of nature based solutions in enhancing the sustainability of catchment systems by promoting

  9. Flexible, low-temperature, solution processed ZnO-based perovskite solid state solar cells.

    Science.gov (United States)

    Kumar, Mulmudi Hemant; Yantara, Natalia; Dharani, Sabba; Graetzel, Michael; Mhaisalkar, Subodh; Boix, Pablo P; Mathews, Nripan

    2013-12-07

    A ZnO compact layer formed by electrodeposition and ZnO nanorods grown by chemical bath deposition (CBD) allow the processing of low-temperature, solution based and flexible solid state perovskite CH3NH3PbI3 solar cells. Conversion efficiencies of 8.90% were achieved on rigid substrates while the flexible ones yielded 2.62%.

  10. Prediction of Pure Component Adsorption Equilibria Using an Adsorption Isotherm Equation Based on Vacancy Solution Theory

    DEFF Research Database (Denmark)

    Marcussen, Lis; Aasberg-Petersen, K.; Krøll, Annette Elisabeth

    2000-01-01

    An adsorption isotherm equation for nonideal pure component adsorption based on vacancy solution theory and the Non-Random-Two-Liquid (NRTL) equation is found to be useful for predicting pure component adsorption equilibria at a variety of conditions. The isotherm equation is evaluated successfully...... adsorption systems, spreading pressure and isosteric heat of adsorption are also calculated....

  11. Identification of phase structure of plated zinc alloys based on a linear voltammetry in alkaline solutions

    Directory of Open Access Journals (Sweden)

    Lina V. Petrenko

    2016-12-01

    Full Text Available The purpose of this research was to develop a new and effective technique for analysing the phase composition of electroplated coatings by inversion voltammetric methods. The results show that the phase composition of plated zinc-based alloys can be identified using anodic linear voltammetry in alkaline solutions. The phase composition of a Zn–(0.27–9.4)% Fe alloy electroplated from alkaline zincate solutions was determined from the voltammetry data. Within the Zn–Fe alloys, a phase with hexagonal structure was found that is absent from the equilibrium phase diagram. The ratio of the hexagonal crystal lattice axes (c/a) and the electron concentration (e/a) for this phase differ significantly from the corresponding values for the primary solid solution η. From the analysis of the c/a and e/a values of the investigated Zn–Fe alloy, this phase was identified as a solid solution of type ε. It was also shown that anodic linear voltammetry in alkaline solutions is more sensitive for identifying the phase composition of zinc alloys than the traditional X-ray method and stripping voltammetry.

  12. Interoperability, Data Control and Battlespace Visualization using XML, XSLT and X3D

    National Research Council Canada - National Science Library

    Neushul, James

    2003-01-01

    This work represents the realization of Network-Centric goals of interoperability, information management, systems integration and cohesive battlespace visualization using networked computer technology...

  13. A Solution Based on Bluetooth Low Energy for Smart Home Energy Management

    OpenAIRE

    Collotta, Mario; Pau, Giovanni

    2015-01-01

    The research and implementation of home automation are becoming more popular because the Internet of Things holds promise for making homes smarter through wireless technologies. The installation of systems based on wireless networks can also play a key role in the extension of the smart grid towards smart homes, which can be deemed one of the most important components of smart grids. This paper proposes a fuzzy-based solution for smart energy management in a home automation wireless netw...

  14. Efficacy of handrubbing with alcohol based solution versus standard handwashing with antiseptic soap: randomised clinical trial

    Science.gov (United States)

    Girou, Emmanuelle; Loyeau, Sabrina; Legrand, Patrick; Oppein, Françoise; Brun-Buisson, Christian

    2002-01-01

    Objective: To compare the efficacy of handrubbing with an alcohol based solution versus conventional handwashing with antiseptic soap in reducing hand contamination during routine patient care. Design: Randomised controlled trial during daily nursing sessions of 2 to 3 hours. Setting: Three intensive care units in a French university hospital. Participants: 23 healthcare workers. Interventions: Handrubbing with alcohol based solution (n=12) or handwashing with antiseptic soap (n=11) when hand hygiene was indicated before and after patient care. Imprints taken of fingertips and palm of dominant hand before and after hand hygiene procedure. Bacterial counts quantified blindly. Main outcome measures: Bacterial reduction of hand contamination. Results: With handrubbing the median percentage reduction in bacterial contamination was significantly higher than with handwashing (83% v 58%, P=0.012), with a median difference in the percentage reduction of 26% (95% confidence interval 8% to 44%). The median duration of hand hygiene was 30 seconds in each group. Conclusions: During routine patient care handrubbing with an alcohol based solution is significantly more efficient in reducing hand contamination than handwashing with antiseptic soap. What is already known on this topic: To improve compliance with hand hygiene during patient care, handrubbing with an alcohol based solution has been proposed as a substitute for handwashing because of its rapid action and accessibility. Experimental studies show that handrubbing is at least as effective as medicated soap in reducing artificial contamination of hands. Many healthcare workers still have reservations regarding its efficacy and are reluctant to use this technique. What this study adds: When used in routine practice, handrubbing with an alcohol based solution after contact with patients achieved a greater reduction in bacterial contamination of hands than conventional handwashing with medicated soap. PMID:12183307

  15. Military Interoperable Digital Hospital Testbed (MIDHT)

    Science.gov (United States)

    2015-12-01

    C. "Using barcode medication administration to improve quality and safety." 2008. 110 Hurley A, Lancaster D, Hayes J, Wilson-Chase C, Bane A...Java install that is required by CONNECT v3.3.1.3. • Updated the MIDHT code base to work with the CONNECT v.3.3.1.3 Core Libraries . • Provided...v2.2 Lab Result messages using the open-source HL7 Application Programming Interface (HAPI) library to HLV v3. • Assisted Conemaugh with identifying

  16. SOA approach to battle command: simulation interoperability

    Science.gov (United States)

    Mayott, Gregory; Self, Mid; Miller, Gordon J.; McDonnell, Joseph S.

    2010-04-01

    NVESD is developing a Sensor Data and Management Services (SDMS) Service Oriented Architecture (SOA) that provides an innovative approach to achieve seamless application functionality across simulation and battle command systems. In 2010, CERDEC will conduct an SDMS Battle Command demonstration that will highlight the SDMS SOA capability to couple simulation applications to existing Battle Command systems. The demonstration will leverage RDECOM MATREX simulation tools and TRADOC Maneuver Support Battle Laboratory Virtual Base Defense Operations Center facilities. The battle command systems are those specific to the operation of a base defense operations center in support of force protection missions. The SDMS SOA consists of four components that will be discussed. An Asset Management Service (AMS) will automatically discover the existence, state, and interface definition required to interact with a named asset (a sensor or sensor platform, a process such as level-1 fusion, or an interface to a sensor or other network endpoint). A Streaming Video Service (SVS) will automatically discover the existence, state, and interfaces required to interact with a named video stream, and abstract the consumers of the video stream from the originating device. A Task Manager Service (TMS) will be used to automatically discover the existence of a named mission task, and will interpret, translate and transmit a mission command for the blue force unit(s) described in a mission order. JC3IEDM data objects and a software development kit (SDK) will be utilized as the basic data object definition for the implemented web services.
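
    The discovery behaviour attributed to the AMS above boils down to a registry in which assets publish their existence, state and interface description, and consumers look them up by name. The following sketch is a hypothetical illustration of that pattern, not the SDMS implementation; all names and endpoints are invented.

```python
# Hypothetical sketch of an asset-discovery registry in the spirit of the AMS
# described above; class, asset and endpoint names are invented, not SDMS code.
class AssetRegistry:
    def __init__(self):
        self._assets = {}

    def register(self, name, state, interface):
        """An asset (sensor, fusion process, network endpoint) publishes itself."""
        self._assets[name] = {"state": state, "interface": interface}

    def discover(self, name):
        """A consumer discovers the existence, state and interface of a named asset."""
        return self._assets.get(name)

registry = AssetRegistry()
registry.register("perimeter_camera_3", "online",
                  {"protocol": "rtsp", "endpoint": "rtsp://example.invalid/stream3"})
print(registry.discover("perimeter_camera_3"))
```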

  17. Heavy emulsifying solution on a petroleum base stabilized by calcium soaps of organic acids

    Energy Technology Data Exchange (ETDEWEB)

    Anikeenko, G.I.; Kasperskii, B.V.; Penkov, A.I.; Vakhrushev, L.P.

    1980-01-01

    These studies make it possible to solve the problem at hand through a special selection of the organic acids contained in the emulsifier. It was found that the addition of an organic acid mixture obtained from sebacic acid production to the oil-based emulsifying solution (OBES) results in clearly defined thixotropic properties in the emulsifying system. Additional studies have shown that the electrostability of the OBES increases 1.5- to 1.8-fold following the addition of 225 g of bentonite to 1 l of emulsion (tested on bentonite), with a 20% increase in the calcium chloride content.

  18. A Solution-Based Temperature Sensor Using the Organic Compound CuTsPc

    Directory of Open Access Journals (Sweden)

    Shahino Mah Abdullah

    2014-06-01

    Full Text Available An electrochemical cell using an organic compound, copper(II) phthalocyanine-tetrasulfonic acid tetrasodium salt (CuTsPc), has been fabricated and investigated as a solution-based temperature sensor. The capacitance and resistance of the ITO/CuTsPc solution/ITO chemical cell have been characterized as a function of temperature in the temperature range of 25–80 °C. A linear response with minimal hysteresis is observed. The fabricated temperature sensor has shown high consistency and a sensitive response over a specific range of temperature values.

  19. Fire hazard analysis of alcohol aqueous solution and Chinese liquor based on flash point

    Science.gov (United States)

    Chen, Qinpei; Kang, Guoting; Zhou, Tiannian; Wang, Jian

    2017-10-01

    In this paper, a series of experiments was conducted to study the flash point of alcohol aqueous solutions and Chinese liquor. The fire hazard indicated by the experimental results was analysed based on the Chinese standard GB50160-2008. The results show that the open-cup method is not suitable for alcohol aqueous solutions. On the other hand, the closed-cup method shows good applicability. There is a non-linear relationship between the closed-cup flash point and the alcohol volume concentration. The prediction equation established in this paper fits the flash point data well and supports the fire hazard classification of Chinese liquor.

  20. Analyze the optimal solutions of optimization problems by means of fractional gradient based system using VIM

    Directory of Open Access Journals (Sweden)

    Firat Evirgen

    2016-04-01

    Full Text Available In this paper, a class of Nonlinear Programming (NLP) problems is modeled with a gradient-based system of fractional order differential equations in Caputo's sense. To see the overlap between the equilibrium point of the fractional order dynamic system and the optimal solution of the NLP problem over a longer time span, the Multistage Variational Iteration Method is applied. The comparisons among the multistage variational iteration method, the variational iteration method and the fourth order Runge-Kutta method in fractional and integer order show that the fractional order model and techniques can be seen as an effective and reliable tool for finding optimal solutions of Nonlinear Programming problems.
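
    The abstract does not spell out the system, but the usual construction for such gradient-based fractional flows (with f standing for the objective, or for a penalty function that folds in the constraints of the NLP problem) is the Caputo analogue of steepest descent; this is a sketch of the general form, not the paper's exact model:

        {}^{C}D_{t}^{\alpha} x(t) = -\nabla f(x(t)), \qquad 0 < \alpha \le 1,

    whose equilibrium points satisfy \nabla f(x^{*}) = 0, i.e. the stationarity condition of the optimization problem; for \alpha = 1 the system reduces to the classical steepest-descent flow \dot{x} = -\nabla f(x).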

  1. A Solution-Based Intelligent Tutoring System Integrated with an Online Game-Based Formative Assessment: Development and Evaluation

    Science.gov (United States)

    Hooshyar, Danial; Ahmad, Rodina Binti; Yousefi, Moslem; Fathi, Moein; Abdollahi, Abbas; Horng, Shi-Jinn; Lim, Heuiseok

    2016-01-01

    Nowadays, intelligent tutoring systems are considered an effective research tool for learning systems and problem-solving skill improvement. Nonetheless, such individualized systems may cause students to lose learning motivation when interaction and timely guidance are lacking. In order to address this problem, a solution-based intelligent…

  2. An integral equation based numerical solution for nanoparticles illuminated with collimated and focused light.

    Science.gov (United States)

    Sendur, Kürşat

    2009-04-27

    To address the large number of parameters involved in nano-optical problems, a more efficient computational method is necessary. An integral equation based numerical solution is developed when the particles are illuminated with collimated and focused incident beams. The solution procedure uses the method of weighted residuals, in which the integral equation is reduced to a matrix equation and then solved for the unknown electric field distribution. In the solution procedure, the effects of the surrounding medium and boundaries are taken into account using a Green's function formulation. Therefore, there is no additional error due to artificial boundary conditions unlike differential equation based techniques, such as finite difference time domain and finite element method. In this formulation, only the scattering nano-particle is discretized. Such an approach results in a lesser number of unknowns in the resulting matrix equation. The results are compared to the analytical Mie series solution for spherical particles, as well as to the finite element method for rectangular metallic particles. The Richards-Wolf vector field equations are combined with the integral equation based formulation to model the interaction of nanoparticles with linearly and radially polarized incident focused beams.
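
    A generic statement of the weighted-residuals step described above, with notation of my choosing rather than the paper's: the unknown field is expanded in N basis functions, the integral operator \mathcal{L} (which embeds the Green's function of the surrounding medium) is tested against weighting functions, and the integral equation collapses to a dense linear system,

        E(\mathbf{r}) \approx \sum_{n=1}^{N} a_{n} f_{n}(\mathbf{r}), \qquad Z_{mn} = \langle w_{m}, \mathcal{L} f_{n} \rangle, \quad v_{m} = \langle w_{m}, E^{\mathrm{inc}} \rangle, \qquad \mathbf{Z}\,\mathbf{a} = \mathbf{v},

    which is then solved for the expansion coefficients a_{n}, i.e. the unknown electric field distribution on the discretized particle.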

  3. Agile Implementation: A Blueprint for Implementing Evidence-Based Healthcare Solutions.

    Science.gov (United States)

    Boustani, Malaz; Alder, Catherine A; Solid, Craig A

    2018-03-07

    Objectives: To describe the essential components of an Agile Implementation (AI) process, which rapidly and effectively implements evidence-based healthcare solutions, and to present a case study demonstrating its utility. Design: Case demonstration study. Setting: Integrated, safety net healthcare delivery system in Indianapolis. Participants: Interdisciplinary team of clinicians and administrators. Measurements: Reduction in dementia symptoms and caregiver burden; inpatient and outpatient care expenditures. Results: Implementation scientists were able to implement a collaborative care model for dementia care and sustain it for more than 9 years. The model was implemented and sustained by using the elements of the AI process: proactive surveillance and confirmation of clinical opportunities, selection of the right evidence-based healthcare solution, localization (i.e., tailoring to the local environment) of the selected solution, development of an evaluation plan and performance feedback loop, development of a minimally standardized operation manual, and updating such manual annually. Conclusion: The AI process provides an effective model to implement and sustain evidence-based healthcare solutions. © 2018, Copyright the Authors Journal compilation © 2018, The American Geriatrics Society.

  4. COMP Superscalar, an interoperable programming framework

    Directory of Open Access Journals (Sweden)

    Rosa M. Badia

    2015-12-01

    Full Text Available COMPSs is a programming framework that aims to facilitate the parallelization of existing applications written in Java, C/C++ and Python scripts. For that purpose, it offers a simple programming model based on sequential development in which the user is mainly responsible for (i) identifying the functions to be executed as asynchronous parallel tasks and (ii) marking them with annotations or standard Python decorators. A runtime system is in charge of exploiting the inherent concurrency of the code, automatically detecting and enforcing the data dependencies between tasks and spawning these tasks to the available resources, which can be nodes in a cluster, clouds or grids. In cloud environments, COMPSs provides scalability and elasticity features allowing the dynamic provision of resources.
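
    The programming model can be illustrated with a plain-Python analogue of the decorator approach: the user marks functions as tasks, calls them as if the code were sequential, and a small "runtime" turns each call into an asynchronous execution. This is a conceptual sketch only, not the actual PyCOMPSs API.

```python
# Plain-Python illustration of the decorator-based task model described above
# (conceptual only; this is not the PyCOMPSs API).
from concurrent.futures import ThreadPoolExecutor

_pool = ThreadPoolExecutor()

def task(func):
    """Mark a function as an asynchronous task; calls return futures scheduled
    by the 'runtime' instead of executing inline."""
    def submit(*args, **kwargs):
        return _pool.submit(func, *args, **kwargs)
    return submit

@task
def increment(x):
    return x + 1

# The user writes sequential-looking code; the decorated calls run as parallel
# tasks and their results are collected (synchronized) at the end.
futures = [increment(i) for i in range(4)]
print([f.result() for f in futures])          # -> [1, 2, 3, 4]
```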

  5. Color stability of siloranes versus methacrylate-based composites after immersion in staining solutions.

    Science.gov (United States)

    Arocha, Mariana A; Mayoral, Juan R; Lefever, Dorien; Mercade, Montserrat; Basilio, Juan; Roig, Miguel

    2013-07-01

    The purpose of this study was to determine, by using a spectrophotometer device, the color stability of silorane in comparison with four methacrylate-based composites after being immersed in different staining solutions such as coffee, black tea, red wine, orange juice, and coke, with distilled water as the control group. Four restorative methacrylate-based composites (Filtek Z250, TetricEvoCeram, Venus Diamond, and Grandio) and one silorane (FiltekSilorane) of shade A2 were selected to measure their color stability (180 disk samples) after 4 weeks of immersion in six staining solutions: black tea, coffee, red wine, orange juice, coke, and distilled water. The specimens' color was measured each week by means of a spectrophotometer (CIE L*a*b* system). Statistical analysis was carried out performing an ANOVA and an LSD test in order to statistically analyze differences in L*a*b* and ∆E values. All materials showed significant discoloration (p < 0.05) when compared to the control group (immersed in distilled water). The highest ∆E observed was with red wine, whereas coke led to the lowest. Silorane showed the highest color stability compared with the methacrylate-based composites. Methacrylate-based materials immersed in staining solutions showed lower color stability when compared with silorane. Great differences in ∆E were found among the methacrylate-based materials tested. Although the color stability of methacrylate-based composites immersed in staining solutions has been widely investigated, this has not been done for long immersion periods with silorane-based composites.
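
    The ∆E values referred to here are, by convention, the Euclidean distance between two coordinates in the CIE L*a*b* space (the CIE76 form); the abstract does not state which ∆E variant was used, so the standard definition is assumed:

        \Delta E^{*}_{ab} = \sqrt{(\Delta L^{*})^{2} + (\Delta a^{*})^{2} + (\Delta b^{*})^{2}}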

  6. Raman Spectroscopy for Understanding of Lithium Intercalation into Graphite in Propylene Carbonate-Based Solutions

    Directory of Open Access Journals (Sweden)

    Yang-Soo Kim

    2015-01-01

    Full Text Available Electrochemical lithium intercalation within graphite was investigated in propylene carbonate (PC) containing different concentrations (0.4, 0.9, 1.2, 2.2, 2.8, 3.8, and 4.7 mol dm−3) of lithium perchlorate, LiClO4. Lithium ions were reversibly intercalated into and deintercalated from graphite in the 3.8 and 4.7 mol dm−3 solutions despite the use of pure PC as the solvent. However, ceaseless solvent decomposition and intense exfoliation of the graphene layers occurred in the other solutions. The results of the Raman spectroscopic analysis indicated that contact ion pairs are present in the 3.8 and 4.7 mol dm−3 solutions, which suggests that the presence of contact ion pairs is an important factor determining the solid electrolyte interphase (SEI) forming ability in PC-based electrolytes.

  7. An Activity-Based Dissolution Model for Solute-Containing Microdroplets

    DEFF Research Database (Denmark)

    Bitterfield, Deborah L; Madsen, Anders Utoft; Needham, D.

    2016-01-01

    activity and chemical potential, at the droplet interface. This work addresses the importance of understanding how water activity changes during solution droplet dissolution. A model for dissolution rate is presented that accounts for the kinetic effects of changing water activity at the droplet interface...... is sufficient to account for the kinetics of dissolution. The dissolution model is based on the Epstein-Plesset equation, which has previously been applied to pure gas (bubble) and liquid (droplet) dissolution into liquid phases, but not to salt solutions. The model is tested by using the micropipet technique...... to form and observe the dehydration of single NaCl solution microdroplets in octanol or butyl acetate. The model successfully predicts the droplet diameter as a function of time in both organic solvents. The NaCl concentration in water is measured well into the supersaturated area >5.4 M...
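
    For reference, the Epstein-Plesset equation mentioned above describes the quasi-static dissolution of a sphere of radius R; in its commonly quoted form (symbols are my shorthand, not the paper's: D the solute diffusivity in the outer phase, c_s the saturation concentration at the interface, c_∞ the far-field concentration, ρ the density of the dissolving phase),

        \frac{dR}{dt} = -\frac{D\,(c_{s} - c_{\infty})}{\rho}\left(\frac{1}{R} + \frac{1}{\sqrt{\pi D t}}\right),

    and the activity-based model summarized in the abstract lets the interfacial driving term follow the changing water activity at the droplet surface instead of a fixed c_s.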

  8. Comparison of Model-Based Control Solutions for Severe Riser-Induced Slugs

    DEFF Research Database (Denmark)

    Pedersen, Simon; Jahanshahi, Esmaiel; Yang, Zhenyu

    2017-01-01

    Control solutions for eliminating severe riser-induced slugs in offshore oil & gas pipeline installations are key topics in offshore Exploration and Production (E&P) processes. This study describes the identification, analysis and control of a low-dimensional control-oriented model of a lab......-scaled slug testing facility. The model is analyzed and used for anti-slug control development for both lowpoint and topside transmitter solutions. For the controlled variables’ comparison it is concluded that the topside pressure transmitter (Pt) is the most difficult output to apply directly for anti-slug...... control due to the inverse response. However, as Pt often is the only accessible measurement on offshore platforms this study focuses on the controller development for both Pt and the lowpoint pressure transmitter (Pb). All the control solutions are based on linear control schemes and the performance...

  9. Variation in interoperability across clinical laboratories nationwide.

    Science.gov (United States)

    Patel, Vaishali; McNamara, Lauren; Dullabh, Prashila; Sawchuk, Megan E; Swain, Matthew

    2017-12-01

    To characterize nationwide variation and factors associated with clinical laboratories' (1) capabilities to send structured test results electronically to ordering practitioners' EHR systems; and (2) levels of exchange activity, as measured by whether they sent more than three-quarters of their test results as structured data to ordering practitioners' EHR systems. A national survey of all independent and hospital laboratories was conducted in 2013. Using an analytic weighted sample of 9382 clinical laboratories, a series of logistic regression analyses was conducted to identify organizational and area characteristics associated with clinical laboratories' exchange capability and activity. Hospital-based clinical laboratories (71%) and larger clinical laboratories (80%) had significantly higher levels of capability compared to independent (58%) and smaller laboratories (48%), respectively, though all had similar levels of exchange activity, with 30% of clinical laboratories sending 75% or more of their test results electronically. In multivariate analyses, hospital and the largest laboratories had 1.87 and 4.40 higher odds, respectively, of possessing the capability to send results electronically compared to independent laboratories. Laboratories located in areas with a higher share of potential exchange partners had a small but significantly greater capability to send results electronically and higher levels of exchange activity. Clinical laboratories' capability to exchange varied by size and type; however, all clinical laboratories had relatively low levels of exchange activity. The role of exchange partners potentially played a small but significant role in driving exchange capability and activity. Published by Elsevier B.V.

  10. Influence of aging solutions on wear resistance and hardness of selected resin-based dental composites.

    Science.gov (United States)

    Chladek, Grzegorz; Basa, Katarzyna; Żmudzki, Jarosław; Malara, Piotr; Nowak, Agnieszka J; Kasperski, Jacek

    2016-01-01

    The purpose of this study was to investigate the effect of different plasticizing aging solutions on the wear resistance and hardness of selected universal resin-based dental composites. Three light cured (one nanofilled, two microhybrid) and one hybrid chemically cured composites were aged at 37 °C for 48 h in distilled water, ethyl alcohol solution or Listerine mouthwash. After aging, microhardness tests were carried out and then tribological tests were performed in the presence of the aging solution at 37 °C. During wear testing, coefficients of friction were determined. The maximal vertical loss in micrometers was determined with a profilometer. Aging in all liquids resulted in a significant decrease in hardness of the test materials, with the largest values obtained successively in ethanol solution, mouthwash and water. The effect of the liquid was dependent on the particular material, but not the type of material (interpreted as the size of filler used). Introduction of mouthwash instead of water or ethanol solution resulted in a significant reduction in the coefficient of friction. The lowest wear resistance was registered after aging in ethanol and for the chemically cured hybrid composite, but the vertical loss was strongly material dependent. The effect of the different aging solutions, including a commercial mouthrinse, on hardness and wear was material dependent, and cannot be deduced from their category or filler loading. There is no simple correlation between the hardness of resin-based dental composites and their wear resistance, but softening of particular composite materials during aging leads to a reduction of their wear resistance.

  11. Effect of polyethylene glycol-based preservation solutions on graft injury in experimental kidney transplantation.

    Science.gov (United States)

    Thuillier, R; Renard, C; Rogel-Gaillard, C; Demars, J; Milan, D; Forestier, L; Ouldmoulene, A; Goujon, J M; Badet, L; Hauet, T

    2011-03-01

    New preservation solutions are emerging, of various ionic compositions and with hydroxyethyl starch replaced by polymers such as polyethylene glycols (PEGs), offering the potential for 'immunocamouflage'. This experimental study investigated which of three clinically available preservation protocols offered the best graft protection, based on epithelial-to-mesenchymal transition (EMT) and fibrosis. Kidneys were preserved for 24 h at 4 °C with University of Wisconsin solution (UW) as standard, compared with solutions containing either 1 g/l PEG 35 kDa (Institute Georges Lopez solution, IGL) or 30 g/l PEG 20 kDa (solution de conservation des organes et des tissus, SCOT). Animals were followed for up to 3 months and development of EMT, tubular atrophy and fibrosis was evaluated in comparison with sham-operated animals. Functional recovery was better in the SCOT group compared with the other groups. Chronic fibrosis, EMT and inflammation were observed in the UW and IGL groups, but limited in the SCOT group. Levels of profibrosis markers such as transforming growth factor β1, plasminogen activator inhibitor 1 and connective tissue growth factor were increased in IGL and UW groups compared with the SCOT group. Hypoxia-inducible factor (HIF) 1α and 2α expression was increased at 3 months in grafts preserved in UW and IGL, but detected transiently on day 14 when SCOT was used. Expression of the HIF-regulated genes vascular endothelial growth factor and erythropoietin was increased in the UW and IGL groups. The choice of colloid and ionic content is paramount in providing long-term protection against chronic graft injury after renal transplantation. Preservation solutions based on PEGs may optimize graft quality.

  12. DC Grids for Smart LED-Based Lighting: The EDISON Solution

    Directory of Open Access Journals (Sweden)

    Steffen Thielemans

    2017-09-01

    Full Text Available This paper highlights the benefits and possible drawbacks of a DC-based lighting infrastructure for powering Light Emitting Diode (LED) lamps. It also evaluates the efforts needed for integrating so-called smart lighting and other sensor/actuator based control systems, and compares existing and emerging solutions. It reviews and discusses published work in this field with special focus on the intelligent DC-based infrastructure named EDISON that is primarily dedicated to lighting, but is applicable to building automation in general. The EDISON “PowerLAN” consists of a DC-based infrastructure that offers telecommunication abilities and can be applied to lighting retrofitting scenarios for buildings. Its infrastructure allows simple and efficient powering of DC-oriented devices like LED lamps, sensors and microcontrollers, while offering a wired communication channel. This paper motivates the design choices for organizing DC lighting grids and their associated communication possibilities. It also shows how the EDISON based smart lighting solution is evolving today to include new communication technologies and to further integrate other parts of building management solutions through the OneM2M (Machine to Machine) service bus.

  13. Effects of sodium hydroxide (NaOH) solution concentration on fly ash-based lightweight geopolymer

    Science.gov (United States)

    Ibrahim, W. M. W.; Hussin, K.; Abdullah, M. M. A.; Kadir, A. A.; Deraman, L. M.

    2017-09-01

    In this study, the effects of NaOH concentration on the properties of fly ash-based lightweight geopolymer were investigated. Lightweight geopolymer was produced using fly ash as the source material and synthetic foaming agents as the air entraining agent. The alkaline solution used in this study was a combination of sodium hydroxide (NaOH) and sodium silicate (Na2SiO3) solutions. Different molarities of NaOH solution (6M, 8M, 10M, 12M, and 14M) were used to prepare 50 x 50 x 50 mm cubes of lightweight geopolymer. The ratios of fly ash/alkaline solution, Na2SiO3/NaOH solution, foaming agent/water and foam/geopolymer paste were kept constant at 2.0, 2.5, 1:10 and 1:1, respectively. The samples were cured at 80°C for 24 hours and left at room temperature before being tested at 7 days of ageing. Physical and mechanical properties such as density, water absorption and compressive strength, as well as the microstructure, were determined from the dried cube samples. The results show that the NaOH molarity affected the properties of the lightweight geopolymer, with the optimum molarity found to be 12M due to its high strength of 15.6 MPa, lower water absorption (7.3%) and low density (1440 kg/m3). Microstructure analysis shows that the lightweight geopolymer contains some porous structure and unreacted fly ash particles remain.

  14. Effect of different solutions on color stability of acrylic resin-based dentures.

    Science.gov (United States)

    Goiato, Marcelo Coelho; Nóbrega, Adhara Smith; dos Santos, Daniela Micheline; Andreotti, Agda Marobo; Moreno, Amália

    2014-01-01

    The aim of this study was to evaluate the effect of thermocycling and immersion in mouthwash or beverage solutions on the color stability of four different acrylic resin-based dentures (Onda Cryl, OC; QC20, QC; Classico, CL; and Lucitone, LU). The factors evaluated were type of acrylic resin, immersion time, and solution (mouthwash or beverage). A total of 224 denture samples were fabricated. For each type of resin, eight samples were immersed in mouthwashes (Plax-Colgate, PC; Listerine, LI; and Oral-B, OB), beverages (coffee, CP; cola, C; and wine, W), and artificial saliva (AS; control). The color change (ΔE) was evaluated before (baseline) and after thermocycling (T1), and after immersion in solution for 1 h (T2), 3 h (T3), 24 h (T4), 48 h (T5), and 96 h (T6). The CIE Lab system was used to determine the color changes. The thermocycling test was performed for 5000 cycles. Data were submitted to three-way repeated-measures analysis of variance and Tukey's test (p < 0.05). When the samples were immersed in each mouthwash, the color change values were influenced by the type of acrylic resin. Similarly, when the samples were immersed in each beverage, all studied factors influenced the color change values. In general, regardless of the solution, LU exhibited the greatest ΔE values in the period from T1 to T5, and QC presented the greatest ΔE values at T6. Thus, thermocycling and immersion in the various solutions influenced the color stability of acrylic resins, and QC showed the greatest color alteration.

  15. The effect of dimethylsulfoxide on absorption and fluorescence spectra of aqueous solutions of acridine orange base.

    Science.gov (United States)

    Markarian, Shiraz A; Shahinyan, Gohar A

    2015-12-05

    The photophysical properties of aqueous solutions of acridine orange base (AOB) over a wide concentration range of dimethylsulfoxide (DMSO) were studied using absorption and steady-state fluorescence spectroscopy at room temperature. The absorption spectrum of acridine orange in water shows two bands at 468 and 490 nm, which were attributed to the dimer ((AOBH)2(2+)) and monomer (AOBH(+)) species, respectively. In DMSO solution, for the same AOB concentration, only the basic form was detected, with a band at 428 nm. The addition of DMSO to the aqueous AOB solution leads to a decrease of the absorption band at 490 nm, while a new absorption band at 428 nm, due to the deprotonated (basic) form of AO, increases, and the first isosbestic point occurs at 450 nm. The evolution of the isosbestic point reveals that another equilibrium, due to the self-association of DMSO molecules, takes place. From the steady-state fluorescence spectra, Stokes shifts were calculated for AOB in aqueous and DMSO solutions. The addition of DMSO to the aqueous solution induced an enhancement in the fluorescence intensity of the dye compared to that in water. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. UMTS network planning, optimization, and inter-operation with GSM

    CERN Document Server

    Rahnema, Moe

    2008-01-01

    UMTS Network Planning, Optimization, and Inter-Operation with GSM is an accessible, one-stop reference to help engineers effectively reduce the time and costs involved in UMTS deployment and optimization. Rahnema includes detailed coverage from both a theoretical and practical perspective on the planning and optimization aspects of UMTS, and a number of other new techniques to help operators get the most out of their networks. Provides an end-to-end perspective, from network design to optimization. Incorporates the hands-on experiences of numerous researchers. Single

  17. Creating XML/PHP Interface for BAN Interoperability.

    Science.gov (United States)

    Fragkos, Vasileios; Katzis, Konstantinos; Despotou, Georgios

    2017-01-01

    Recent advances in medical and electronic technologies have introduced the use of Body Area Networks (BANs) as a part of e-health, for constant and accurate monitoring of patients and for the transmission and processing of the data to develop a holistic Electronic Health Record. The rising global population, different BAN manufacturers and a variety of medical systems raise the issue of interoperability between BANs and systems, as well as the question of how to propagate medical data in an organized and efficient manner. In this paper, we describe BANs and propose the use of certain web technologies to address this issue.

  18. Use of Annotations for Component and Framework Interoperability

    Science.gov (United States)

    David, O.; Lloyd, W.; Carlson, J.; Leavesley, G. H.; Geter, F.

    2009-12-01

    western United States at the USDA NRCS National Water and Climate Center. PRMS is a component based modular precipitation-runoff model developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow and general basin hydrology. The new OMS 3.0 PRMS model source code is more concise and flexible as a result of using the new framework’s annotation based approach. The fully annotated components are now providing information directly for (i) model assembly and building, (ii) dataflow analysis for implicit multithreading, (iii) automated and comprehensive model documentation of component dependencies, physical data properties, (iv) automated model and component testing, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks. As a prototype example, model code annotations were used to generate binding and mediation code to allow the use of OMS 3.0 model components within the OpenMI context.
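
    The annotation idea can be illustrated with a Python analogue (OMS 3.0 itself uses Java annotations): a decorator attaches declared inputs and outputs to a component so that a framework can assemble, document and test it without inspecting the component body. The names and the runoff relationship below are purely illustrative.

```python
# Conceptual sketch of annotation-declared component metadata (OMS 3.0 itself
# uses Java annotations; this Python analogue is illustrative only).
def component_io(inputs=None, outputs=None):
    """Attach declared inputs/outputs so a framework can wire, document and
    test the component without inspecting its body."""
    def wrap(cls):
        cls.declared_inputs = inputs or {}
        cls.declared_outputs = outputs or {}
        return cls
    return wrap

@component_io(inputs={"precip_mm": "daily precipitation"},
              outputs={"runoff_mm": "daily runoff"})
class SimpleRunoff:
    def execute(self, precip_mm):
        self.runoff_mm = 0.3 * precip_mm      # placeholder relationship, not PRMS physics

# A framework can now read SimpleRunoff.declared_inputs/outputs to build
# connection graphs, documentation, or automated tests.
print(SimpleRunoff.declared_inputs, SimpleRunoff.declared_outputs)
```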

  19. A Thermal Simulation Tool for Building and Its Interoperability through the Building Information Modeling (BIM) Platform

    Directory of Open Access Journals (Sweden)

    Christophe Nicolle

    2013-05-01

    Full Text Available This paper describes potential challenges and opportunities for using thermal simulation tools to optimize building performance. After reviewing current trends in thermal simulation, it outlines major criteria for the evaluation of building thermal simulation tools based on specifications and capabilities in interoperability. Details are discussed, including the workflow of data exchange for multiple thermal analyses, such as BIM-based applications. The present analysis focuses on selected thermal simulation tools that provide functionality for exchanging data with other tools, in order to obtain a picture of their basic working principles and to identify selection criteria for generic thermal tools in BIM. The significance of, and barriers to, integrated design with BIM and building thermal simulation tools are also discussed.

  20. Fully solution-processed transparent electrodes based on silver nanowire composites for perovskite solar cells

    Science.gov (United States)

    Kim, Areum; Lee, Hongseuk; Kwon, Hyeok-Chan; Jung, Hyun Suk; Park, Nam-Gyu; Jeong, Sunho; Moon, Jooho

    2016-03-01

    We report all-solution-processed transparent conductive electrodes based on Ag nanowire (AgNW)-embedded metal oxide composite films for application in organometal halide perovskite solar cells. To address the thermal instability of Ag nanowires, we used combustive sol-gel derived thin films to construct ZnO/ITO/AgNW/ITO composite structures. The resulting composite configuration effectively prevented the AgNWs from undergoing undesirable side-reactions with halogen ions present in the perovskite precursor solutions that significantly deteriorate the optoelectrical properties of Ag nanowires in transparent conductive films. AgNW-based composite electrodes had a transmittance of ~80% at 550 nm and sheet resistance of 18 Ω sq-1. Perovskite solar cells fabricated using a fully solution-processed transparent conductive electrode, Au/spiro-OMeTAD/CH3NH3PbI3 + m-Al2O3/ZnO/ITO/AgNW/ITO, exhibited a power conversion efficiency of 8.44% (comparable to that of the FTO/glass-based counterpart at 10.81%) and were stable for 30 days in ambient air. Our results demonstrate the feasibility of using AgNWs as a transparent bottom electrode in perovskite solar cells produced by a fully printable process.