WorldWideScience

Sample records for integration information extraction

  1. Information Extraction, Data Integration, and Uncertain Data Management: The State of The Art

    NARCIS (Netherlands)

    Habib, Mena Badieh; van Keulen, Maurice

    2011-01-01

    Information extraction, data integration, and uncertain data management are different areas of research that have received vast attention over the last two decades. Much research has tackled these areas individually. However, information extraction systems should be integrated with data integration

  2. An Accurate Integral Method for Vibration Signal Based on Feature Information Extraction

    Directory of Open Access Journals (Sweden)

    Yong Zhu

    2015-01-01

    After summarizing the advantages and disadvantages of current integral methods, a novel vibration signal integral method based on feature information extraction is proposed. The method takes full advantage of the self-adaptive filtering and waveform-correction properties of ensemble empirical mode decomposition (EEMD) when dealing with nonlinear and nonstationary signals, and merges the strengths of kurtosis, mean square error, energy, and singular value decomposition for signal feature extraction. The values of these four indexes are combined into a feature vector. The characteristic components hidden in the vibration signal are then accurately extracted by a Euclidean distance search, and the desired integral signals are precisely reconstructed. With this method, the interference from invalid signal content such as trend items and noise, which plagues traditional methods, is resolved; the large cumulative error of traditional time-domain integration is overcome; and the large low-frequency error of traditional frequency-domain integration is avoided. Compared with traditional integral methods, this method excels at removing noise while retaining useful feature information, and shows higher accuracy.
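
    A minimal Python sketch of the feature-vector screening step described above (illustrative only: the EEMD decomposition is assumed to come from a library such as PyEMD, and the 64-sample embedding window, the value of k, and the use of a clean reference signal are assumptions, not details taken from the record):

      import numpy as np
      from scipy.stats import kurtosis

      def feature_vector(x):
          # kurtosis, mean square value, energy, and the largest singular
          # value of a sliding-window embedding of the component
          m = np.lib.stride_tricks.sliding_window_view(x, 64)
          sv = np.linalg.svd(m, compute_uv=False)[0]
          return np.array([kurtosis(x), np.mean(x**2), np.sum(x**2), sv])

      def nearest_components(imfs, reference, k=3):
          # Euclidean distance search: keep the k decomposed components
          # whose feature vectors lie closest to the reference vector
          ref = feature_vector(reference)
          d = [np.linalg.norm(feature_vector(c) - ref) for c in imfs]
          return [imfs[i] for i in np.argsort(d)[:k]]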

  3. Integrating Information Extraction Agents into a Tourism Recommender System

    Science.gov (United States)

    Esparcia, Sergio; Sánchez-Anguix, Víctor; Argente, Estefanía; García-Fornes, Ana; Julián, Vicente

    Recommender systems face some problems. On the one hand, their information needs to be kept up to date, which can be a costly task if it is not performed automatically. On the other hand, it may be worthwhile to include third-party services in the recommendation, since they improve its quality. In this paper, we present an add-on for the Social-Net Tourism Recommender System that uses information extraction and natural language processing techniques to automatically extract and classify information from the Web. Its goal is to keep the system up to date and to obtain information about third-party services that are not offered by service providers inside the system.

  4. Multi-Filter String Matching and Human-Centric Entity Matching for Information Extraction

    Science.gov (United States)

    Sun, Chong

    2012-01-01

    More and more information is being generated in text documents, such as Web pages, emails and blogs. To effectively manage this unstructured information, one broadly used approach includes locating relevant content in documents, extracting structured information and integrating the extracted information for querying, mining or further analysis. In…

  5. Terrain Extraction by Integrating Terrestrial Laser Scanner Data and Spectral Information

    Science.gov (United States)

    Lau, C. L.; Halim, S.; Zulkepli, M.; Azwan, A. M.; Tang, W. L.; Chong, A. K.

    2015-10-01

    The extraction of true terrain points from unstructured laser point cloud data is an important step in producing an accurate digital terrain model (DTM). However, most spatial filtering methods use only geometrical data to discriminate terrain points from non-terrain points. Point cloud filtering can also be improved by using the spectral information available from some scanners. The objective of this study is therefore to investigate the effectiveness of using the three channels (red, green and blue) of the colour imagery captured by the built-in digital camera available in some terrestrial laser scanners (TLS) for terrain extraction. The data were acquired at a mini replica landscape at Universiti Teknologi Malaysia (UTM), Skudai campus, using a Leica ScanStation C10. The spectral information of the coloured point clouds was extracted from selected sample classes for spectral analysis; coloured points falling within the corresponding preset spectral threshold were identified as belonging to that feature class. The terrain extraction process was implemented in Matlab. The results demonstrate that a passive image of higher spectral resolution would be required to improve the output, because the low quality of the colour images captured by the sensor leads to low separability in spectral reflectance. In conclusion, this study shows that spectral information can be used as a parameter for terrain extraction.
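
    The thresholding idea lends itself to a short sketch. The record describes a Matlab implementation; the following Python equivalent (with purely hypothetical RGB windows) shows the per-class spectral test:

      import numpy as np

      # cloud: N x 6 array of x, y, z, r, g, b from the coloured TLS scan
      cloud = np.random.default_rng(0).uniform(0, 255, (1000, 6))  # stand-in

      def extract_class(points, rgb_min, rgb_max):
          # keep points whose colour lies inside the preset spectral window
          rgb = points[:, 3:6]
          mask = np.all((rgb >= rgb_min) & (rgb <= rgb_max), axis=1)
          return points[mask]

      # hypothetical window sampled from bare-ground training patches
      terrain = extract_class(cloud, np.array([90, 70, 50]),
                              np.array([160, 130, 100]))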

  6. Information extraction system

    Science.gov (United States)

    Lemmond, Tracy D; Hanley, William G; Guensche, Joseph Wendell; Perry, Nathan C; Nitao, John J; Kidwell, Paul Brandon; Boakye, Kofi Agyeman; Glaser, Ron E; Prenger, Ryan James

    2014-05-13

    An information extraction system and methods of operating the system are provided. In particular, an information extraction system for performing meta-extraction of named entities of people, organizations, and locations, as well as relationships and events, from text documents is described herein.

  7. Social network extraction based on Web: 3. the integrated superficial method

    Science.gov (United States)

    Nasution, M. K. M.; Sitompul, O. S.; Noah, S. A.

    2018-03-01

    The Web as a source of information has become part of social behaviour information. Even though it involves only the limited information disclosed by search engines, in the form of hit counts, snippets, and URL addresses of web pages, an integrated extraction method produces a social network that is not only trusted but also enriched. Unintegrated extraction methods may produce social networks without explanation, resulting in poor supplemental information, or social networks laden with surmise and consequently unrepresentative social structures. The integrated superficial method generates not only the core social network but also an expanded network, so as to reach the scope of relation clues, with a number of edges computationally close to n(n - 1)/2 for n social actors.
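
    A sketch of the pairwise construction (the Jaccard-style strength measure and the hit_count wrapper are assumptions; the record does not specify its co-occurrence measure):

      from itertools import combinations

      def strength(hits_a, hits_b, hits_ab):
          # Jaccard-style co-occurrence strength from hit counts
          denom = hits_a + hits_b - hits_ab
          return hits_ab / denom if denom > 0 else 0.0

      def superficial_network(actors, hit_count):
          # hit_count(query) is a hypothetical wrapper around a search
          # engine API returning the hit count for the query string
          edges = {}
          for a, b in combinations(actors, 2):  # up to n(n-1)/2 pairs
              s = strength(hit_count(a), hit_count(b),
                           hit_count(f'"{a}" "{b}"'))
              if s > 0:
                  edges[(a, b)] = s
          return edges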

  8. CLASSIFICATION OF INFORMAL SETTLEMENTS THROUGH THE INTEGRATION OF 2D AND 3D FEATURES EXTRACTED FROM UAV DATA

    Directory of Open Access Journals (Sweden)

    C. M. Gevaert

    2016-06-01

    Unmanned Aerial Vehicles (UAVs) are capable of providing very high resolution and up-to-date information to support informal settlement upgrading projects. In order to provide accurate basemaps, urban scene understanding through the identification and classification of buildings and terrain is imperative. However, common characteristics of informal settlements, such as small, irregular buildings with heterogeneous roof material and a large presence of clutter, challenge state-of-the-art algorithms. The dense buildings and steeply sloped terrain especially cause difficulties in identifying elevated objects. This work investigates how 2D radiometric and textural features, 2.5D topographic features, and 3D geometric features obtained from UAV imagery can be integrated to obtain high classification accuracy in challenging classification problems for the analysis of informal settlements. It compares the utility of pixel-based and segment-based features obtained from an orthomosaic and DSM with point-based and segment-based features extracted from the point cloud to classify an unplanned settlement in Kigali, Rwanda. Findings show that the integration of 2D and 3D features leads to higher classification accuracies.
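
    Feature-level fusion of the 2D, 2.5D and 3D descriptors can be sketched as follows (synthetic data; the random forest is a stand-in classifier, not necessarily the one used in the study):

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(0)
      n = 500                              # synthetic segments
      radiometric = rng.random((n, 4))     # 2D: colour/texture statistics
      topographic = rng.random((n, 2))     # 2.5D: e.g. height above terrain
      geometric = rng.random((n, 3))       # 3D: e.g. planarity, normals
      labels = rng.integers(0, 2, n)       # building vs. terrain

      X = np.hstack([radiometric, topographic, geometric])  # fused features
      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      clf.fit(X, labels)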

  9. A New Multi-Sensor Track Fusion Architecture for Multi-Sensor Information Integration

    National Research Council Canada - National Science Library

    Jean, Buddy H; Younker, John; Hung, Chih-Cheng

    2004-01-01

    .... This new technology will integrate multi-sensor information and extract integrated multi-sensor information to detect, track and identify multiple targets at any time, in any place under all weather conditions...

  10. Theoretical information reuse and integration

    CERN Document Server

    Rubin, Stuart

    2016-01-01

    Information Reuse and Integration addresses the efficient extension and creation of knowledge through the exploitation of Kolmogorov complexity in the extraction and application of domain symmetry. Knowledge that seems to be novel can more often than not be recast as the image of a sequence of transformations which yield symmetric knowledge. When the size of those transformations and/or the length of that sequence of transforms exceeds the size of the image, then that image is said to be novel or random. It may also be that the new knowledge is random in that no sequence of transforms that produces it exists, or at least none is known. The nine chapters comprising this volume incorporate symmetry, reuse, and integration as overt operational procedures or as operations built into the formal representations of data and operators employed. Either way, the aforementioned theoretical underpinnings of information reuse and integration are supported.

  11. Multimedia Information Extraction

    CERN Document Server

    Maybury, Mark T

    2012-01-01

    The advent of increasingly large consumer collections of audio (e.g., iTunes), imagery (e.g., Flickr), and video (e.g., YouTube) is driving a need not only for multimedia retrieval but also for information extraction from and across media. Furthermore, industrial and government collections fuel requirements for stock media access, media preservation, broadcast news retrieval, identity management, and video surveillance. While significant advances have been made in language processing for information extraction from unstructured multilingual text and extraction of objects from imagery and vid

  12. Integrated Phoneme Subspace Method for Speech Feature Extraction

    Directory of Open Access Journals (Sweden)

    Park Hyunsin

    2009-01-01

    Speech feature extraction has been a key focus in robust speech recognition research. In this work, we discuss data-driven linear feature transformations applied to feature vectors in the logarithmic mel-frequency filter bank domain. The transformations are based on principal component analysis (PCA), independent component analysis (ICA), and linear discriminant analysis (LDA). Furthermore, this paper introduces a new feature extraction technique that collects the correlation information among phoneme subspaces and reconstructs the feature space to represent phonemic information efficiently. The proposed speech feature vector is generated by projecting an observed vector onto an integrated phoneme subspace (IPS) based on PCA or ICA. The performance of the new feature was evaluated for isolated word speech recognition. The proposed method provided higher recognition accuracy than conventional methods in clean and reverberant environments.
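
    The projection can be sketched in a few lines of Python (the QR orthonormalisation of the stacked per-phoneme bases is an assumption about how the subspaces are combined):

      import numpy as np

      def phoneme_basis(frames, k):
          # leading k principal directions (PCA) of one phoneme's frames;
          # rows are log mel filter-bank feature vectors
          centred = frames - frames.mean(axis=0)
          _, _, vt = np.linalg.svd(centred, full_matrices=False)
          return vt[:k]

      def integrated_phoneme_subspace(per_phoneme_frames, k=4):
          # stack the per-phoneme bases into one IPS and orthonormalise;
          # an observation x is then represented as q.T @ x
          stacked = np.vstack([phoneme_basis(f, k)
                               for f in per_phoneme_frames])
          q, _ = np.linalg.qr(stacked.T)
          return q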

  13. Challenges in Managing Information Extraction

    Science.gov (United States)

    Shen, Warren H.

    2009-01-01

    This dissertation studies information extraction (IE), the problem of extracting structured information from unstructured data. Example IE tasks include extracting person names from news articles, product information from e-commerce Web pages, street addresses from emails, and names of emerging music bands from blogs. IE is an increasingly…

  14. MedTime: a temporal information extraction system for clinical narratives.

    Science.gov (United States)

    Lin, Yu-Kai; Chen, Hsinchun; Brown, Randall A

    2013-12-01

    Temporal information extraction from clinical narratives is of critical importance to many clinical applications. We participated in the EVENT/TIMEX3 track of the 2012 i2b2 clinical temporal relations challenge and presented our temporal information extraction system, MedTime. MedTime comprises a cascade of rule-based and machine-learning pattern recognition procedures. It achieved a micro-averaged f-measure of 0.88 in both the recognition of clinical events and the recognition of temporal expressions. We proposed and evaluated three time normalization strategies to normalize relative time expressions in clinical texts. The accuracy was 0.68 in normalizing temporal expressions of dates, times, durations, and frequencies. This study demonstrates and evaluates the integration of rule-based and machine-learning-based approaches for high-performance temporal information extraction from clinical narratives.
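
    For reference, micro-averaging pools the raw counts across tasks before computing the f-measure, as in this sketch (the counts are hypothetical, not the challenge results):

      def micro_f1(counts):
          # counts: (true positives, false positives, false negatives)
          # per task, e.g. clinical events and TIMEX3 expressions
          tp = sum(c[0] for c in counts)
          fp = sum(c[1] for c in counts)
          fn = sum(c[2] for c in counts)
          precision = tp / (tp + fp)
          recall = tp / (tp + fn)
          return 2 * precision * recall / (precision + recall)

      print(round(micro_f1([(880, 70, 95), (240, 35, 30)]), 2))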

  15. Advanced integrated solvent extraction systems

    Energy Technology Data Exchange (ETDEWEB)

    Horwitz, E.P.; Dietz, M.L.; Leonard, R.A. [Argonne National Lab., IL (United States)]

    1997-10-01

    Advanced integrated solvent extraction systems are a series of novel solvent extraction (SX) processes that will remove and recover all of the major radioisotopes from acidic-dissolved sludge or other acidic high-level wastes. The major focus of this effort during the last 2 years has been the development of a combined cesium-strontium extraction/recovery process, the Combined CSEX-SREX Process. The Combined CSEX-SREX Process relies on a mixture of a strontium-selective macrocyclic polyether and a novel cesium-selective extractant based on dibenzo 18-crown-6. The process offers several potential advantages over possible alternatives in a chemical processing scheme for high-level waste treatment. First, if the process is applied as the first step in chemical pretreatment, the radiation level for all subsequent processing steps (e.g., transuranic extraction/recovery, or TRUEX) will be significantly reduced. Thus, less costly shielding would be required. The second advantage of the Combined CSEX-SREX Process is that the recovered Cs-Sr fraction is non-transuranic, and therefore will decay to low-level waste after only a few hundred years. Finally, combining individual processes into a single process will reduce the amount of equipment required to pretreat the waste and therefore reduce the size and cost of the waste processing facility. In an ongoing collaboration with Lockheed Martin Idaho Technology Company (LMITCO), the authors have successfully tested various segments of the Advanced Integrated Solvent Extraction Systems. Eichrom Industries, Inc. (Darien, IL) synthesizes and markets the Sr extractant and can supply the Cs extractant on a limited basis. Plans are under way to perform a test of the Combined CSEX-SREX Process with real waste at LMITCO in the near future.

  16. Extraction of Urban Trees from Integrated Airborne Based Digital Image and LIDAR Point Cloud Datasets - Initial Results

    Science.gov (United States)

    Dogon-yaro, M. A.; Kumar, P.; Rahman, A. Abdul; Buyuksalih, G.

    2016-10-01

    Timely and accurate acquisition of information on the condition and structural changes of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building strategies for sustainable development. The conventional techniques used for extracting tree features include ground surveying and interpretation of aerial photography. However, these techniques are associated with constraints such as labour-intensive field work, high cost, and the influence of weather conditions and topographic cover, which can be overcome by means of integrated airborne-based LiDAR and very high resolution digital image datasets. This study presents a semi-automated approach for extracting urban trees from integrated airborne-based LiDAR and multispectral digital image datasets over the city of Istanbul, Turkey. The scheme includes the detection and extraction of shadow-free vegetation features based on the spectral properties of the digital images, using shadow-index and NDVI techniques, and the automated extraction of 3D information about vegetation features from the integrated processing of the shadow-free vegetation image and the LiDAR point cloud. The performance of the developed algorithms shows promise as an automated and cost-effective approach to estimating and delineating 3D information on urban trees. The research also showed that integrated datasets are a suitable technology and a viable source of information for city managers to use in urban tree management.
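
    The shadow-free vegetation step can be approximated as below (the brightness test is an assumed stand-in for the study's shadow index, and both thresholds are illustrative):

      import numpy as np

      def shadow_free_vegetation(red, green, nir,
                                 ndvi_thresh=0.3, bright_thresh=0.08):
          # NDVI selects vegetation; the brightness test drops shadowed
          # pixels (bands assumed scaled to [0, 1])
          ndvi = (nir - red) / (nir + red + 1e-9)
          brightness = (red + green + nir) / 3.0
          return (ndvi > ndvi_thresh) & (brightness > bright_thresh)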

  17. The extraction and integration framework: a two-process account of statistical learning.

    Science.gov (United States)

    Thiessen, Erik D; Kronstein, Alexandra T; Hufnagle, Daniel G

    2013-07-01

    The term statistical learning in infancy research originally referred to sensitivity to transitional probabilities. Subsequent research has demonstrated that statistical learning contributes to infant development in a wide array of domains. The range of statistical learning phenomena necessitates a broader view of the processes underlying statistical learning. Learners are sensitive to a much wider range of statistical information than the conditional relations indexed by transitional probabilities, including distributional and cue-based statistics. We propose a novel framework that unifies learning about all of these kinds of statistical structure. From our perspective, learning about conditional relations outputs discrete representations (such as words). Integration across these discrete representations yields sensitivity to cues and distributional information. To achieve sensitivity to all of these kinds of statistical structure, our framework combines processes that extract segments of the input with processes that compare across these extracted items. In this framework, the items extracted from the input serve as exemplars in long-term memory. The similarity structure of those exemplars in long-term memory leads to the discovery of cues and categorical structure, which guides subsequent extraction. The extraction and integration framework provides a way to explain sensitivity to both conditional statistical structure (such as transitional probabilities) and distributional statistical structure (such as item frequency and variability), and also a framework for thinking about how these different aspects of statistical learning influence each other.
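
    The conditional half of the framework, sensitivity to transitional probabilities, reduces to bigram statistics; a minimal sketch (toy syllable stream, not stimuli from the paper):

      from collections import Counter

      def transitional_probabilities(syllables):
          # TP(x -> y) = freq(xy) / freq(x); dips in TP are candidate
          # word boundaries
          pairs = Counter(zip(syllables, syllables[1:]))
          firsts = Counter(syllables[:-1])
          return {(a, b): n / firsts[a] for (a, b), n in pairs.items()}

      stream = "pa bi ku go la tu pa bi ku da ro pi go la tu".split()
      for pair, tp in sorted(transitional_probabilities(stream).items()):
          print(pair, round(tp, 2))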

  18. Information extraction from multi-institutional radiology reports.

    Science.gov (United States)

    Hassanpour, Saeed; Langlotz, Curtis P

    2016-01-01

    The radiology report is the most important source of clinical imaging information. It documents critical information about the patient's health and the radiologist's interpretation of medical findings. It also communicates information to the referring physicians and records that information for future clinical and research use. Although efforts to structure some radiology report information through predefined templates are beginning to bear fruit, a large portion of radiology report information is entered in free text. The free text format is a major obstacle for rapid extraction and subsequent use of information by clinicians, researchers, and healthcare information systems. This difficulty is due to the ambiguity and subtlety of natural language, complexity of described images, and variations among different radiologists and healthcare organizations. As a result, radiology reports are used only once by the clinician who ordered the study and rarely are used again for research and data mining. In this work, machine learning techniques and a large multi-institutional radiology report repository are used to extract the semantics of the radiology report and overcome the barriers to the re-use of radiology report information in clinical research and other healthcare applications. We describe a machine learning system to annotate radiology reports and extract report contents according to an information model. This information model covers the majority of clinically significant contents in radiology reports and is applicable to a wide variety of radiology study types. Our automated approach uses discriminative sequence classifiers for named-entity recognition to extract and organize clinically significant terms and phrases consistent with the information model. We evaluated our information extraction system on 150 radiology reports from three major healthcare organizations and compared its results to a commonly used non-machine learning information extraction method. We

  19. INTEGRATED INFORMATION SYSTEM ARCHITECTURE PROVIDING BEHAVIORAL FEATURE

    Directory of Open Access Journals (Sweden)

    Vladimir N. Shvedenko

    2016-11-01

    The paper deals with the creation of an integrated information system architecture capable of supporting management decisions using behavioral features, and considers the architecture of an information decision support system for production system management. The information system is given a behavioral feature that ensures information extraction and processing and management decision-making, with both automated and automatic modes of the decision-making subsystem permitted. Practical implementation of an information system with behavior is based on a service-oriented architecture: the information system contains a set of independent services that provide data from its subsystems, or data processing by a separate application, under the chosen resolution of a problematic situation. For the creation of an integrated information system with behavior, we propose an architecture comprising the following subsystems: a data bus, a subsystem for interaction with the integrated applications based on metadata, a business process management subsystem, a subsystem for analysing the current state of the enterprise and making management decisions, and a behavior training subsystem. For each problematic situation, a separate logical-layer service is created in the Unified Service Bus that handles problematic situations. This architecture reduces the information complexity of the system: with a constant number of system elements, the number of links decreases, since each layer provides a communication centre of responsibility for the resource with the services of the corresponding applications. If a similar problematic situation occurs, its resolution is automatically retrieved from the problematic situation metamodel repository together with the business process metamodel of its settlement. During business process execution, commands are generated to the corresponding centres of responsibility to settle the problematic situation.

  1. What constitutes information integrity?

    Directory of Open Access Journals (Sweden)

    S. Flowerday

    2007-12-01

    This research focused on what constitutes information integrity, as this is a problem facing companies today. Information integrity is a pillar of information security and is required in order to have a sound security management programme. However, it is acknowledged that 100% information integrity is not currently achievable due to various limitations, and therefore the auditing concept of reasonable assurance is adopted. This is in line with the concept that 100% information security is not achievable and the notion that adequate security, using appropriate countermeasures, is the goal. The main contribution of this article is to illustrate the importance of, and provide a macro view of, what constitutes information integrity. The findings are in harmony with Samuel Johnson's words (1751): 'Integrity without knowledge is weak and useless, and knowledge without integrity is dangerous and dreadful.'

  2. Extracting useful information from images

    DEFF Research Database (Denmark)

    Kucheryavskiy, Sergey

    2011-01-01

    The paper presents an overview of methods for extracting useful information from digital images. It covers various approaches that utilize different properties of images, like intensity distribution, spatial frequency content and several others. A few case studies including isotropic and heter…

  3. Using text mining techniques to extract phenotypic information from the PhenoCHF corpus.

    Science.gov (United States)

    Alnazzawi, Noha; Thompson, Paul; Batista-Navarro, Riza; Ananiadou, Sophia

    2015-01-01

    Phenotypic information locked away in unstructured narrative text presents significant barriers to information accessibility, both for clinical practitioners and for computerised applications used for clinical research purposes. Text mining (TM) techniques have previously been applied successfully to extract different types of information from text in the biomedical domain. They have the potential to be extended to allow the extraction of information relating to phenotypes from free text. To stimulate the development of TM systems that are able to extract phenotypic information from text, we have created a new corpus (PhenoCHF) that is annotated by domain experts with several types of phenotypic information relating to congestive heart failure. To ensure that systems developed using the corpus are robust to multiple text types, it integrates text from heterogeneous sources, i.e., electronic health records (EHRs) and scientific articles from the literature. We have developed several different phenotype extraction methods to demonstrate the utility of the corpus, and tested these methods on a further corpus, i.e., ShARe/CLEF 2013. Evaluation of our automated methods showed that PhenoCHF can facilitate the training of reliable phenotype extraction systems, which are robust to variations in text type. These results have been reinforced by evaluating our trained systems on the ShARe/CLEF corpus, which contains clinical records of various types. Like other studies within the biomedical domain, we found that solutions based on conditional random fields produced the best results, when coupled with a rich feature set. PhenoCHF is the first annotated corpus aimed at encoding detailed phenotypic information. The unique heterogeneous composition of the corpus has been shown to be advantageous in the training of systems that can accurately extract phenotypic information from a range of different text types. Although the scope of our annotation is currently limited to a single
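
    A conditional-random-field tagger of the kind the study reports can be sketched with the sklearn-crfsuite package (an assumed toolkit; the toy sentence, BIO tags, and tiny feature set are illustrative, not PhenoCHF data):

      import sklearn_crfsuite

      def token_features(tokens, i):
          # a deliberately small slice of a "rich feature set"
          t = tokens[i]
          return {
              'lower': t.lower(),
              'is_title': t.istitle(),
              'is_digit': t.isdigit(),
              'prev': tokens[i - 1].lower() if i > 0 else '<s>',
              'next': tokens[i + 1].lower() if i + 1 < len(tokens) else '</s>',
          }

      sents = [["Patient", "has", "congestive", "heart", "failure", "."]]
      tags = [["O", "O", "B-PHEN", "I-PHEN", "I-PHEN", "O"]]

      X = [[token_features(s, i) for i in range(len(s))] for s in sents]
      crf = sklearn_crfsuite.CRF(algorithm='lbfgs', c1=0.1, c2=0.1,
                                 max_iterations=50)
      crf.fit(X, tags)
      print(crf.predict(X))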

  4. Feature extraction for dynamic integration of classifiers

    NARCIS (Netherlands)

    Pechenizkiy, M.; Tsymbal, A.; Puuronen, S.; Patterson, D.W.

    2007-01-01

    Recent research has shown the integration of multiple classifiers to be one of the most important directions in machine learning and data mining. In this paper, we present an algorithm for the dynamic integration of classifiers in the space of extracted features (FEDIC). It is based on the technique

  5. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories.

    Science.gov (United States)

    Yang, Wei; Ai, Tinghua; Lu, Wei

    2018-04-19

    Crowdsourced trajectory data are an important source for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourced vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively so that there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors from the areas of Voronoi cells and the lengths of triangle edges. A road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) via the DT. Third, the detection model is used to detect road boundaries from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting road boundaries from low-frequency GPS traces, multiple types of road structure, and different time intervals. Compared with two existing methods, the automatically extracted boundary information proved to be of higher quality.
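
    The two geometric descriptors are easy to reproduce with scipy.spatial (random stand-in points; the percentile flag at the end is an illustrative threshold, not the paper's detection model):

      import numpy as np
      from scipy.spatial import Delaunay, Voronoi

      pts = np.random.default_rng(1).random((200, 2))  # stand-in GPS points
      tri = Delaunay(pts)
      vor = Voronoi(pts)

      def voronoi_cell_area(i):
          # shoelace area of point i's Voronoi cell (inf for open cells)
          region = vor.regions[vor.point_region[i]]
          if not region or -1 in region:
              return np.inf
          v = vor.vertices[region]
          x, y = v[:, 0], v[:, 1]
          return 0.5 * abs(np.dot(x, np.roll(y, 1)) -
                           np.dot(y, np.roll(x, 1)))

      def edge_length(a, b):
          # triangle edge length, the second boundary descriptor
          return float(np.linalg.norm(pts[a] - pts[b]))

      areas = np.array([voronoi_cell_area(i) for i in range(len(pts))])
      finite = areas[np.isfinite(areas)]
      boundary_like = areas > np.percentile(finite, 90)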

  6. Extraction of Information of Audio-Visual Contents

    Directory of Open Access Journals (Sweden)

    Carlos Aguilar

    2011-10-01

    In this article we show how Channel Theory (Barwise and Seligman, 1997) can be used to model the process of information extraction performed by audiences of audio-visual content. To do this, we rely on the concepts proposed by Channel Theory and, especially, its treatment of representational systems. We then show how the information an agent is capable of extracting from the content depends on the number of channels he is able to establish between the content and the set of classifications he is able to discriminate. The agent can attempt to extract information through these channels from the totality of the content; however, we discuss the advantages of extracting from its constituents in order to obtain a greater number of informational items that represent it. After showing how the extraction process proceeds for each channel, we propose a method for representing all the informative values an agent can obtain from a content, using a matrix constituted by the channels the agent is able to establish on the content (source classifications) and the ones he can understand as individual (destination classifications). We finally show how this representation reflects the evolution of the informative items through the evolution of the audio-visual content.

  7. Scenario Customization for Information Extraction

    National Research Council Canada - National Science Library

    Yangarber, Roman

    2001-01-01

    Information Extraction (IE) is an emerging NLP technology, whose function is to process unstructured, natural language text, to locate specific pieces of information, or facts, in the text, and to use these facts to fill a database...

  8. Advanced integrated solvent extraction and ion exchange systems

    International Nuclear Information System (INIS)

    Horwitz, P.

    1996-01-01

    Advanced integrated solvent extraction (SX) and ion exchange (IX) systems are a series of novel SX and IX processes that extract and recover uranium and transuranics (TRUs) (neptunium, plutonium, americium) and the fission products 90Sr, 99Tc, and 137Cs from acidic high-level liquid waste, and that sorb and recover 90Sr, 99Tc, and 137Cs from alkaline supernatant high-level waste. Each system is based on the use of new selective liquid extractants or chromatographic materials. The purpose of the integrated SX and IX processes is to minimize the quantity of waste that must be vitrified and buried in a deep geologic repository by producing raffinates (from SX) and effluent streams (from IX) that will meet the specifications of Class A low-level waste

  9. Can we replace curation with information extraction software?

    Science.gov (United States)

    Karp, Peter D

    2016-01-01

    Can we use programs for automated or semi-automated information extraction from scientific texts as practical alternatives to professional curation? I show that error rates of current information extraction programs are too high to replace professional curation today. Furthermore, current IEP programs extract single narrow slivers of information, such as individual protein interactions; they cannot extract the large breadth of information extracted by professional curators for databases such as EcoCyc. They also cannot arbitrate among conflicting statements in the literature as curators can. Therefore, funding agencies should not hobble the curation efforts of existing databases on the assumption that a problem that has stymied Artificial Intelligence researchers for more than 60 years will be solved tomorrow. Semi-automated extraction techniques appear to have significantly more potential based on a review of recent tools that enhance curator productivity. But a full cost-benefit analysis for these tools is lacking. Without such analysis it is possible to expend significant effort developing information-extraction tools that automate small parts of the overall curation workflow without achieving a significant decrease in curation costs.

  10. Transductive Pattern Learning for Information Extraction

    National Research Council Canada - National Science Library

    McLernon, Brian; Kushmerick, Nicholas

    2006-01-01

    .... We present TPLEX, a semi-supervised learning algorithm for information extraction that can acquire extraction patterns from a small amount of labelled text in conjunction with a large amount of unlabelled text...

  11. Pixel extraction based integral imaging with controllable viewing direction

    International Nuclear Information System (INIS)

    Ji, Chao-Chao; Deng, Huan; Wang, Qiong-Hua

    2012-01-01

    We propose pixel extraction based integral imaging with a controllable viewing direction. The proposed integral imaging can provide viewers three-dimensional (3D) images in a very small viewing angle. The viewing angle and the viewing direction of the reconstructed 3D images are controlled by the pixels extracted from an elemental image array. Theoretical analysis and a 3D display experiment of the viewing direction controllable integral imaging are carried out. The experimental results verify the correctness of the theory. A 3D display based on the integral imaging can protect the viewer’s privacy and has huge potential for a television to show multiple 3D programs at the same time.
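
    A simplified version of the pixel-extraction rule (the one-pixel-per-lens mapping at a fixed offset is a simplification of the paper's scheme, not its exact formula):

      import numpy as np

      def extract_view(eia, lens_pitch, offset):
          # the (row, col) offset selects the viewing direction; the range
          # of valid offsets (0 .. lens_pitch - 1) bounds the small
          # viewing angle
          rows = np.arange(offset[0], eia.shape[0], lens_pitch)
          cols = np.arange(offset[1], eia.shape[1], lens_pitch)
          return eia[np.ix_(rows, cols)]

      eia = np.random.random((300, 300))  # synthetic elemental image array
      view = extract_view(eia, lens_pitch=10, offset=(4, 7))
      print(view.shape)                   # (30, 30) sub-image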

  12. Entropy-based analysis and bioinformatics-inspired integration of global economic information transfer.

    Directory of Open Access Journals (Sweden)

    Jinkyu Kim

    The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches to global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significance, and then use a weighted sum approach to aggregate the information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration than existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with real-world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.
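
    A histogram-based TE estimate and an equal-weight aggregation can be sketched as follows (the estimator, the bin count, and the weights are assumptions; the paper additionally tests each TE for significance):

      import numpy as np

      def transfer_entropy(x, y, bins=8):
          # histogram estimate of TE(x -> y) = I(y_t ; x_{t-1} | y_{t-1})
          yt, yp, xp = y[1:], y[:-1], x[:-1]
          def H(*cols):
              joint, _ = np.histogramdd(np.column_stack(cols), bins=bins)
              p = joint[joint > 0] / joint.sum()
              return -np.sum(p * np.log(p))
          return H(yt, yp) + H(yp, xp) - H(yp) - H(yt, yp, xp)

      rng = np.random.default_rng(0)
      x = rng.standard_normal(2000)
      y = np.roll(x, 1) + 0.5 * rng.standard_normal(2000)  # x drives y
      pairs = [(x, y), (y, x)]
      weights = [0.5, 0.5]                                 # assumed equal
      aggregate = sum(w * transfer_entropy(a, b)
                      for w, (a, b) in zip(weights, pairs))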

  13. Extracting breathing rate information from a wearable reflectance pulse oximeter sensor.

    Science.gov (United States)

    Johnston, W S; Mendelson, Y

    2004-01-01

    The integration of multiple vital physiological measurements could help combat medics and field commanders better predict a soldier's health condition and enhance their ability to perform remote triage procedures. In this paper we demonstrate the feasibility of extracting accurate breathing rate information from a photoplethysmographic signal recorded by a reflectance pulse oximeter sensor mounted on the forehead, processed by simple time-domain filtering and frequency-domain Fourier analysis.
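
    The frequency-domain step amounts to locating the respiratory spectral peak, e.g. (the 0.1-0.5 Hz band is an assumed respiratory range; the demo signal is synthetic):

      import numpy as np

      def breathing_rate_bpm(ppg, fs, band=(0.1, 0.5)):
          # dominant spectral peak of the detrended PPG inside the
          # respiratory band, in breaths per minute
          x = ppg - np.mean(ppg)
          freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
          mag = np.abs(np.fft.rfft(x))
          sel = (freqs >= band[0]) & (freqs <= band[1])
          return 60.0 * freqs[sel][np.argmax(mag[sel])]

      fs = 50.0
      t = np.arange(0, 60, 1 / fs)
      demo = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 0.25 * t)
      print(breathing_rate_bpm(demo, fs))  # ~15, from the 0.25 Hz component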

  14. Optical Aperture Synthesis Object's Information Extracting Based on Wavelet Denoising

    International Nuclear Information System (INIS)

    Fan, W J; Lu, Y

    2006-01-01

    Wavelet denoising is studied to improve the extraction of an OAS (optical aperture synthesis) object's Fourier information. Translation-invariant wavelet denoising, based on Donoho's wavelet soft-threshold denoising, is investigated to remove the pseudo-Gibbs artifacts that soft thresholding introduces into the image. Extraction of the OAS object's information based on translation-invariant wavelet denoising is then studied. The study shows that wavelet threshold denoising can improve the precision and repeatability of extracting the object's information from an interferogram, and that information extraction with translation-invariant wavelet denoising outperforms that with plain soft-threshold denoising.
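
    Translation-invariant (cycle-spinning) soft-threshold denoising can be sketched with PyWavelets (an assumed library; the wavelet choice, decomposition level, and universal threshold are common defaults, not parameters from the record):

      import numpy as np
      import pywt

      def soft_denoise(x, wavelet='db4', level=4):
          coeffs = pywt.wavedec(x, wavelet, level=level)
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745  # noise estimate
          thr = sigma * np.sqrt(2 * np.log(len(x)))       # universal threshold
          coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft')
                                  for c in coeffs[1:]]
          return pywt.waverec(coeffs, wavelet)[:len(x)]

      def ti_denoise(x, shifts=8):
          # cycle spinning: average denoised circular shifts to suppress
          # the pseudo-Gibbs artifacts of plain soft thresholding
          return np.mean([np.roll(soft_denoise(np.roll(x, s)), -s)
                          for s in range(shifts)], axis=0)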

  15. Evolving spectral transformations for multitemporal information extraction using evolutionary computation

    Science.gov (United States)

    Momm, Henrique; Easson, Greg

    2011-01-01

    Remote sensing plays an important role in assessing temporal changes in land features. The challenge often resides in the conversion of large quantities of raw data into actionable information in a timely and cost-effective fashion. To address this issue, research was undertaken to develop an innovative methodology integrating biologically-inspired algorithms with standard image classification algorithms to improve information extraction from multitemporal imagery. Genetic programming was used as the optimization engine to evolve feature-specific candidate solutions in the form of nonlinear mathematical expressions of the image spectral channels (spectral indices). The temporal generalization capability of the proposed system was evaluated by addressing the task of building rooftop identification from a set of images acquired at different dates in a cross-validation approach. The proposed system generates robust solutions (kappa values > 0.75 for stage 1 and > 0.4 for stage 2) despite the statistical differences between the scenes caused by land use and land cover changes coupled with variable environmental conditions, and the lack of radiometric calibration between images. Based on our results, the use of nonlinear spectral indices enhanced the spectral differences between features improving the clustering capability of standard classifiers and providing an alternative solution for multitemporal information extraction.

  16. YAdumper: extracting and translating large information volumes from relational databases to structured flat files.

    Science.gov (United States)

    Fernández, José M; Valencia, Alfonso

    2004-10-12

    Downloading the information stored in relational databases into XML and other flat formats is a common task in bioinformatics. This periodic dumping of information requires considerable CPU time, disk, and memory resources. YAdumper has been developed as a purpose-specific tool for the complete, structured download of information from relational databases. It is a Java application that organizes database extraction following an XML template based on an external Document Type Declaration. Compared with other non-native alternatives, YAdumper substantially reduces memory requirements and considerably improves writing performance.

  17. Information Security Maturity as an Integral Part of ISMS based Risk Management Tools

    NARCIS (Netherlands)

    Fetler, Ben; Harpes, Carlo

    2016-01-01

    Measuring the continuous improvement of Information Security Management Systems (ISMS) is often neglected as most organizations do not know how to extract key-indicators that could be used for this purpose. The underlying work presents a six-level maturity model which can be fully integrated in a

  18. Cause Information Extraction from Financial Articles Concerning Business Performance

    Science.gov (United States)

    Sakai, Hiroyuki; Masuyama, Shigeru

    We propose a method for extracting cause information from Japanese financial articles concerning business performance. Our method acquires cause information, e.g. "zidousya no uriage ga koutyou" (sales of cars were good). Cause information is useful for investors in selecting companies in which to invest. Our method automatically extracts cause information in the form of causal expressions by using statistical information and initial clue expressions. It can extract causal expressions without predetermined patterns or complex hand-written rules, and is expected to be applicable to other tasks of acquiring phrases that have a particular meaning, not limited to cause information. We compared our method with our previous one, originally proposed for extracting phrases concerning traffic accident causes, and experimental results showed that our new method outperforms the previous one.

  1. Rule-based Information Integration

    NARCIS (Netherlands)

    de Keijzer, Ander; van Keulen, Maurice

    2005-01-01

    In this report, we show the process of information integration. We specifically discuss the language used for integration. We show that integration consists of two phases, the schema mapping phase and the data integration phase. We formally define transformation rules, conversion, evolution and

  2. Sample-based XPath Ranking for Web Information Extraction

    NARCIS (Netherlands)

    Jundt, Oliver; van Keulen, Maurice

    Web information extraction typically relies on a wrapper, i.e., program code or a configuration that specifies how to extract some information from web pages at a specific website. Manually creating and maintaining wrappers is a cumbersome and error-prone task. It may even be prohibitive as some
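
    One plausible reading of sample-based ranking, sketched with lxml (the scoring rule and the toy page are assumptions, not the paper's algorithm):

      from lxml import html

      def rank_xpaths(pages, candidates, wanted):
          # score each candidate XPath by how many of the known sample
          # values it recovers across the example pages
          scores = {}
          for xp in candidates:
              hits = 0
              for page, value in zip(pages, wanted):
                  found = {str(t).strip()
                           for t in html.fromstring(page).xpath(xp)}
                  hits += value in found
              scores[xp] = hits / len(pages)
          return sorted(scores.items(), key=lambda kv: -kv[1])

      pages = ['<html><body><span class="price">9.99</span></body></html>']
      print(rank_xpaths(pages,
                        ['//span[@class="price"]/text()', '//b/text()'],
                        ['9.99']))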

  3. Ontology-Based Information Extraction for Business Intelligence

    Science.gov (United States)

    Saggion, Horacio; Funk, Adam; Maynard, Diana; Bontcheva, Kalina

    Business Intelligence (BI) requires the acquisition and aggregation of key pieces of knowledge from multiple sources in order to provide valuable information to customers or feed statistical BI models and tools. The massive amount of information available to business analysts makes information extraction and other natural language processing tools key enablers for the acquisition and use of that semantic information. We describe the application of ontology-based extraction and merging in the context of a practical e-business application for the EU MUSING Project where the goal is to gather international company intelligence and country/region information. The results of our experiments so far are very promising and we are now in the process of building a complete end-to-end solution.

  4. A Two-Step Resume Information Extraction Algorithm

    Directory of Open Access Journals (Sweden)

    Jie Chen

    2018-01-01

    With the rapid growth of Internet-based recruiting, there are a great number of personal resumes among recruiting systems. To gain more attention from recruiters, most resumes are written in diverse formats, including varying font sizes, font colours, and table cells. However, this diversity of format is harmful to data mining tasks such as resume information extraction, automatic job matching, and candidate ranking. Supervised methods and rule-based methods have been proposed to extract facts from resumes, but they rely strongly on hierarchical structure information and large amounts of labelled data, which are hard to collect in practice. In this paper, we propose a two-step resume information extraction approach. In the first step, the raw text of a resume is segmented into different resume blocks. To achieve this, we design a novel feature, Writing Style, to model sentence syntax information; besides a word index and a punctuation index, word lexical attributes and the prediction results of classifiers are included in Writing Style. In the second step, multiple classifiers are employed to identify different attributes of fact information in resumes. Experimental results on a real-world dataset show that the algorithm is feasible and effective.

  5. The Agent of extracting Internet Information with Lead Order

    Science.gov (United States)

    Mo, Zan; Huang, Chuliang; Liu, Aijun

    In order to carry out e-commerce better, advanced technologies for accessing business information are urgently needed. An agent is described for dealing with the problems of extracting Internet information caused by the non-standard, skimble-scamble structure of Chinese websites. The agent comprises three modules, each handling one stage of the extraction process. A method based on an HTTP tree and a Lead algorithm is proposed to generate a lead order, with which the required web pages can be retrieved easily. How to transform the extracted natural-language information into structured form is also discussed.

  6. Advanced field-solver techniques for RC extraction of integrated circuits

    CERN Document Server

    Yu, Wenjian

    2014-01-01

    Resistance and capacitance (RC) extraction is an essential step in modeling the interconnection wires and substrate coupling effect in nanometer-technology integrated circuits (IC). The field-solver techniques for RC extraction guarantee the accuracy of modeling, and are becoming increasingly important in meeting the demand for accurate modeling and simulation of VLSI designs. Advanced Field-Solver Techniques for RC Extraction of Integrated Circuits presents a systematic introduction to, and treatment of, the key field-solver methods for RC extraction of VLSI interconnects and substrate coupling in mixed-signal ICs. Various field-solver techniques are explained in detail, with real-world examples to illustrate the advantages and disadvantages of each algorithm. This book will benefit graduate students and researchers in the field of electrical and computer engineering, as well as engineers working in the IC design and design automation industries. Dr. Wenjian Yu is an Associate Professor at the Department of ...

  7. Fine-grained information extraction from German transthoracic echocardiography reports.

    Science.gov (United States)

    Toepfer, Martin; Corovic, Hamo; Fette, Georg; Klügl, Peter; Störk, Stefan; Puppe, Frank

    2015-11-12

    Information extraction techniques that derive structured representations from unstructured data make a large amount of clinically relevant information about patients accessible to semantic applications. These methods typically rely on standardized terminologies that guide the process. Many languages and clinical domains, however, lack appropriate resources and tools, as well as evaluations of their applications, especially if detailed conceptualizations of the domain are required. For instance, German transthoracic echocardiography reports have not been targeted sufficiently before, despite their importance for clinical trials. This work therefore aimed at the development and evaluation of an information extraction component with a fine-grained terminology that enables the recognition of almost all relevant information stated in German transthoracic echocardiography reports at the University Hospital of Würzburg. A domain expert validated and iteratively refined an automatically inferred base terminology. The terminology was used by an ontology-driven information extraction system that outputs attribute-value pairs. The final component has been mapped to the central elements of a standardized terminology and has been evaluated on documents with different layouts. The final system achieved state-of-the-art precision (micro average .996) and recall (micro average .961) on 100 test documents that represent more than 90% of all reports. In particular, principal aspects as defined in a standardized external terminology were recognized with F1 = .989 (micro average) and F1 = .963 (macro average). As a result of keyword matching and restrained concept extraction, the system obtained high precision also on unstructured or exceptionally short documents, and on documents with uncommon layouts. The developed terminology and the proposed information extraction system allow fine-grained information extraction from German semi-structured transthoracic echocardiography reports

  8. Data Entities and Information System Matrix for Integrated Agriculture Information System (IAIS)

    Science.gov (United States)

    Budi Santoso, Halim; Delima, Rosa

    2018-03-01

    The Integrated Agriculture Information System is a system developed to process data, information, and knowledge in the agriculture sector. It brings valuable information to farmers: (1) fertilizer prices; (2) agricultural techniques and practices; (3) pest management; (4) cultivation; (5) irrigation; (6) post-harvest processing; and (7) innovation in agricultural processing. The Integrated Agriculture Information System contains nine subsystems. To bring integrated information to users and stakeholders, it needs an integrated database approach. The researchers therefore describe the data entities and a matrix relating them to the subsystems of the Integrated Agriculture Information System (IAIS). As a result, there are 47 data entities in a single, integrated database.

  9. Integrated care information technology.

    Science.gov (United States)

    Rowe, Ian; Brimacombe, Phil

    2003-02-21

    Counties Manukau District Health Board (CMDHB) uses information technology (IT) to drive its Integrated Care strategy. IT enables the sharing of relevant health information between care providers. This information sharing is critical to closing the gaps between fragmented areas of the health system. The tragic case of James Whakaruru demonstrates how people have been falling through those gaps. The starting point of the Integrated Care strategic initiative was the transmission of electronic discharges and referral status messages from CMDHB's secondary provider, South Auckland Health (SAH), to GPs in the district. Successful pilots of a Well Child system and a diabetes disease management system embracing primary and secondary providers followed. The improved information flowing from hospital to GPs now enables GPs to provide better management for their patients. The Well Child system pilot helped improve reported immunization rates in a high-health-need area from 40% to 90%. The diabetes system pilot helped reduce the proportion of patients with HbA1c ≥ 9 from 47% to 16%. IT has been implemented as an integral component of an overall Integrated Care strategic initiative. Within this context, Integrated Care IT has helped to achieve significant improvements in care outcomes, broken down barriers between health system silos, and contributed to the establishment of a system of care continuum that is better for patients.

  10. Knowledge discovery: Extracting usable information from large amounts of data

    International Nuclear Information System (INIS)

    Whiteson, R.

    1998-01-01

    The threat of nuclear weapons proliferation is a problem of worldwide concern. Safeguards are the key to nuclear nonproliferation, and data is the key to safeguards. The safeguards community has access to a huge and steadily growing volume of data. The advantages of this data-rich environment are obvious: there is a great deal of information which can be utilized. The challenge is to effectively apply proven and developing technologies to find and extract usable information from that data. That information must then be assessed and evaluated to produce the knowledge needed for crucial decision making. Efficient and effective analysis of safeguards data will depend on utilizing technologies to interpret the large, heterogeneous data sets that are available from diverse sources. With an order-of-magnitude increase in the amount of data from a wide variety of technical, textual, and historical sources, there is a vital need to apply advanced computer technologies to support all-source analysis. There are techniques of data warehousing, data mining, and data analysis that can provide analysts with tools to expedite the extraction of usable information from the huge amounts of data to which they have access. Computerized tools can aid analysts by integrating heterogeneous data, evaluating diverse data streams, automating retrieval of database information, prioritizing inputs, reconciling conflicting data, doing preliminary interpretations, discovering patterns or trends in data, and automating some of the simpler prescreening tasks that are time consuming and tedious. Thus knowledge discovery technologies can provide a foundation of support for the analyst: rather than spending time sifting through often irrelevant information, analysts can use their specialized skills in a focused, productive fashion, allowing them to make their analytical judgments with more confidence and to spend more of their time doing what they do best

  11. Geographic information processing in the Integrated Measuring and Information System (IMIS). An overview; Geographische Informationsverarbeitung im integrierten Mess- und Informationssystem (IMIS). Ein Ueberblick

    Energy Technology Data Exchange (ETDEWEB)

    Burbeck, S. [Bundesamt fuer Strahlenschutz (BfS), Freiburg (Germany)]

    2014-01-20

    Like most public administrations, the Federal Office for Radiation Protection faces various tasks and requirements in which geographic information plays an important role. This is all the more true for the Department of Emergency Protection, with its Integrated Measuring and Information System (IMIS) and its task of providing information to the public. A crucial part of geographic information extraction and provision is cartographic representation. At BfS, the different requirements are to be met by a common software architecture based on web services.

  12. Probabilistic XML in Information Integration

    NARCIS (Netherlands)

    de Keijzer, Ander; Shim, J.; Casati, F.

    2006-01-01

    Information integration is a difficult research problem. In an ambient environment, where devices can connect and disconnect arbitrarily, the problem only increases, because data sources may become available at any time, but can also disappear. In such an environment, information integration needs

  13. Information Extraction and Interpretation Analysis of Mineral Potential Targets Based on ETM+ Data and GIS technology: A Case Study of Copper and Gold Mineralization in Burma

    International Nuclear Information System (INIS)

    Wenhui, Du; Yongqing, Chen; Nana, Guo; Yinglong, Hao; Pengfei, Zhao; Gongwen, Wang

    2014-01-01

    Mineralization-alteration and structure information extraction plays an important role in mineral resource prospecting and assessment using remote sensing data and Geographical Information System (GIS) technology. Taking copper and gold mines in Burma as an example, the authors adopt band ratioing, threshold segmentation and principal component analysis (PCA) to extract hydroxyl alteration information from ETM+ remote sensing images. A digital elevation model (DEM) (30 m spatial resolution) and ETM+ data were used to extract the linear and circular faults that are associated with copper and gold mineralization. Combining geological data with the above information, the weights of evidence method and the C-A fractal model were used to integrate the evidence and identify the ore-forming favourable zones in this area. Research results show that the high grade potential targets coincide with the known copper and gold deposits, and the integrated information can be used in further exploration and mineral resource decision-making.

  14. Three-dimensional multi-terminal superconductive integrated circuit inductance extraction

    International Nuclear Information System (INIS)

    Fourie, Coenrad J; Wetzstein, Olaf; Kunert, Jürgen; Ortlepp, Thomas

    2011-01-01

    Accurate inductance calculations are critical for the design of both digital and analogue superconductive integrated circuits, and three-dimensional calculations are gaining importance with the advent of inductive biasing, inductive coupling and sky plane shielding for RSFQ cells. InductEx, an extraction programme based on the three-dimensional calculation software FastHenry, was proposed earlier. InductEx uses segmentation techniques designed to accurately model the geometries of superconductive integrated circuit structures. Inductance extraction for complex multi-terminal three-dimensional structures from current distributions calculated by FastHenry is discussed. Results for both a reflection plane modelling an infinite ground plane and a finite segmented ground plane that allows inductive elements to extend over holes in the ground plane are shown. Several SQUIDs were designed for and fabricated with IPHT's 1 kA cm−2 RSFQ1D niobium process. These SQUIDs implement a number of loop structures that span different layers, include vias, inductively coupled control lines and ground plane holes. We measured the loop inductance of these SQUIDs and show how the results are used to calibrate the layer parameters in InductEx and verify the extraction accuracy. We also show that, with proper modelling, FastHenry can be fast enough to be used for the extraction of typical RSFQ cell inductances.

  15. The impulse cutoff an entropy functional measure on trajectories of Markov diffusion process integrating in information path functional

    OpenAIRE

    Lerner, Vladimir S.

    2012-01-01

    The impulses cutting the entropy functional (EF) measure on the trajectories of a Markov diffusion process integrate the information path functional (IPF), composing discrete information Bits extracted from the observed random process. Each cut brings a memory of the cutting entropy, which provides both a reduction of the process entropy and a discrete unit of the cutting entropy, a Bit. Consequently, information is memorized entropy, cut from random observations of the process interactions. The origin of information ...

  16. Integrated inventory information system

    Digital Repository Service at National Institute of Oceanography (India)

    Sarupria, J.S.; Kunte, P.D.

    The nature of oceanographic data and the management of inventory level information are described in Integrated Inventory Information System (IIIS). It is shown how a ROSCOPO (report on observations/samples collected during oceanographic programme...

  17. New microwave-integrated Soxhlet extraction. An advantageous tool for the extraction of lipids from food products.

    Science.gov (United States)

    Virot, Matthieu; Tomao, Valérie; Colnagui, Giulio; Visinoni, Franco; Chemat, Farid

    2007-12-07

    A new process of Soxhlet extraction assisted by microwave was designed and developed. The process is performed in four steps, which ensures complete, rapid and accurate extraction of the samples. A second-order central composite design (CCD) has been used to investigate the performance of the new device. The results provided by analysis of variance and a Pareto chart indicated that the extraction time was the most important factor, followed by the leaching time. The response surface methodology allowed us to determine optimal conditions for olive oil extraction: 13 min of extraction time, 17 min of leaching time, and 720 W of irradiation power. The proposed process is suitable for lipid determination in food. Microwave-integrated Soxhlet (MIS) extraction has been compared with a conventional technique, Soxhlet extraction, for the extraction of oil from olives (Aglandau, Vaucluse, France). The oils extracted by MIS for 32 min were quantitatively (yield) and qualitatively (fatty acid composition) similar to those obtained by conventional Soxhlet extraction for 8 h. MIS is a green technology and appears to be a good alternative for the extraction of fat and oils from food products.

  18. Seeds integrate biological information about conspecific and allospecific neighbours.

    Science.gov (United States)

    Yamawo, Akira; Mukai, Hiromi

    2017-06-28

    Numerous organisms integrate information from multiple sources and express adaptive behaviours, but how they do so at different developmental stages remains to be identified. Seeds, which are the embryonic stage of plants, need to make decisions about the timing of emergence in response to environmental cues related to survival. We investigated the timing of emergence of Plantago asiatica (Plantaginaceae) seed while manipulating the presence of Trifolium repens seed and the relatedness of neighbouring P. asiatica seed. The relatedness of neighbouring P. asiatica seed and the presence of seeds of T. repens did not on their own influence the timing of P. asiatica emergence. However, when encountering a T. repens seed, a P. asiatica seed emerged faster in the presence of a sibling seed than in the presence of a non-sibling seed. Water extracts of seeds gave the same result. We show that P. asiatica seeds integrate information about the relatedness of neighbouring P. asiatica seeds and the presence of seeds of a different species via water-soluble chemicals and adjust their emergence behaviour in response. These findings suggest the presence of kin-dependent interspecific interactions. © 2017 The Author(s).

  19. Extracting Information from Multimedia Meeting Collections

    OpenAIRE

    Gatica-Perez, Daniel; Zhang, Dong; Bengio, Samy

    2005-01-01

    Multimedia meeting collections, composed of unedited audio and video streams, handwritten notes, slides, and electronic documents that jointly constitute a raw record of complex human interaction processes in the workplace, have attracted interest due to the increasing feasibility of recording them in large quantities, by the opportunities for information access and retrieval applications derived from the automatic extraction of relevant meeting information, and by the challenges that the ext...

  20. NAMED ENTITY RECOGNITION FROM BIOMEDICAL TEXT -AN INFORMATION EXTRACTION TASK

    Directory of Open Access Journals (Sweden)

    N. Kanya

    2016-07-01

    Biomedical Text Mining targets the extraction of significant information from biomedical archives. Bio TM encompasses Information Retrieval (IR) and Information Extraction (IE). Information Retrieval retrieves the relevant biomedical literature documents from repositories such as PubMed and MedLine based on a search query; the IR process ends with the generation of a corpus of relevant documents retrieved from the publication databases. The IE task includes preprocessing of the documents, Named Entity Recognition (NER), and relationship extraction, drawing on natural language processing, data mining techniques, and machine learning algorithms. The preprocessing task includes tokenization, stop word removal, shallow parsing, and parts-of-speech tagging. The NER phase involves recognition of well-defined objects such as genes, proteins, or cell lines, and leads to the next phase, the extraction of relationships. The work was based on the machine learning algorithm Conditional Random Fields (CRF).

  1. Semantic Information Extraction of Lanes Based on Onboard Camera Videos

    Science.gov (United States)

    Tang, L.; Deng, T.; Ren, C.

    2018-04-01

    In the field of autonomous driving, semantic information of lanes is very important. This paper proposes a method of automatic detection of lanes and extraction of semantic information from onboard camera videos. The proposed method firstly detects the edges of lanes by the grayscale gradient direction, and improves the Probabilistic Hough transform to fit them; then, it uses the vanishing point principle to calculate the lane geometrical position, and uses lane characteristics to extract lane semantic information by the classification of decision trees. In the experiment, 216 road video images captured by a camera mounted onboard a moving vehicle were used to detect lanes and extract lane semantic information. The results show that the proposed method can accurately identify lane semantics from video images.
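
    As a sketch of how such a pipeline can be assembled, the snippet below uses OpenCV's Canny edge detector and probabilistic Hough transform for the edge-detection and line-fitting steps; the thresholds, Hough parameters and function name are illustrative assumptions, not values from the paper (which derives edges from the grayscale gradient direction rather than Canny).

    ```python
    # Hypothetical per-frame lane-segment detector: edge detection followed by
    # probabilistic Hough line fitting.
    import cv2
    import numpy as np

    def detect_lane_segments(frame_bgr):
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)  # stand-in for gradient-direction edges
        segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                                   minLineLength=30, maxLineGap=20)
        # Each segment is (x1, y1, x2, y2); vanishing-point geometry and the
        # decision-tree semantic classification would operate on these.
        return [] if segments is None else [tuple(s[0]) for s in segments]
    ```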

  2. Domain XML semantic integration based on extraction rules and ontology mapping

    Directory of Open Access Journals (Sweden)

    Huayu LI

    2016-08-01

    A wealth of XML documents exists in the petroleum engineering field, but traditional XML integration solutions cannot provide semantic queries, which leads to low data-use efficiency. In light of the semantic integration and query requirements of WeXML (oil & gas well XML data), this paper proposes a semantic integration method based on extraction rules and ontology mapping. The method first defines a series of extraction rules with which elements and properties of the WeXML Schema are mapped to classes and properties in the WeOWL ontology, respectively; second, an algorithm is used to transform WeXML documents into WeOWL instances. Because WeOWL provides limited semantics, ontology mappings between the two ontologies are then built to explain the classes and properties of the global ontology in terms of WeOWL, and semantic query based on a global domain concept model is provided. By constructing a WeXML data semantic integration prototype system, the proposed transformation rules, transfer algorithm and mapping rules are tested.
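
    A minimal sketch of the extraction-rule idea, assuming hypothetical WeXML tags and WeOWL class/property names: declarative rules map XML elements and attributes to ontology classes and properties, and documents are transformed into instance triples.

    ```python
    # Rule-based XML-to-ontology transformation sketch; tag, class and property
    # names (e.g. "well" -> "Well") are invented for illustration.
    import xml.etree.ElementTree as ET

    RULES = {
        "well": ("Well", {"name": "hasName", "depth": "hasDepth"}),
    }

    def xml_to_triples(xml_text):
        triples = []
        root = ET.fromstring(xml_text)
        for tag, (owl_class, prop_map) in RULES.items():
            for i, elem in enumerate(root.iter(tag)):
                subject = f"{owl_class}_{i}"
                triples.append((subject, "rdf:type", owl_class))
                for attr, owl_prop in prop_map.items():
                    if attr in elem.attrib:
                        triples.append((subject, owl_prop, elem.attrib[attr]))
        return triples

    print(xml_to_triples('<doc><well name="W-1" depth="2300"/></doc>'))
    ```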

  3. Integrated Compliance Information System (ICIS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The purpose of ICIS is to meet evolving Enforcement and Compliance business needs for EPA and State users by integrating information into a single integrated data...

  4. Optimal Information Extraction of Laser Scanning Dataset by Scale-Adaptive Reduction

    Science.gov (United States)

    Zang, Y.; Yang, B.

    2018-04-01

    3D laser technology is widely used to collect the surface information of objects. For various applications, we need to extract a point cloud of good perceptual quality from the scanned points. Most existing methods extract important points based on a fixed scale; however, the geometric features of a 3D object come from various geometric scales. We propose a multi-scale construction method based on radial basis functions. For each scale, important points are extracted from the point cloud based on their importance. We apply the perceptual metric Just-Noticeable-Difference to measure the degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments are undertaken to evaluate the effectiveness of the proposed method, suggesting a reliable solution for optimal information extraction of objects.

  5. OPTIMAL INFORMATION EXTRACTION OF LASER SCANNING DATASET BY SCALE-ADAPTIVE REDUCTION

    Directory of Open Access Journals (Sweden)

    Y. Zang

    2018-04-01

    3D laser technology is widely used to collect the surface information of objects. For various applications, we need to extract a point cloud of good perceptual quality from the scanned points. Most existing methods extract important points based on a fixed scale; however, the geometric features of a 3D object come from various geometric scales. We propose a multi-scale construction method based on radial basis functions. For each scale, important points are extracted from the point cloud based on their importance. We apply the perceptual metric Just-Noticeable-Difference to measure the degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments are undertaken to evaluate the effectiveness of the proposed method, suggesting a reliable solution for optimal information extraction of objects.
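
    The two records above describe importance-driven point selection. The sketch below substitutes a simple covariance-based saliency for the paper's radial-basis-function multi-scale construction and Just-Noticeable-Difference metric, so it is a simplified stand-in rather than the published algorithm.

    ```python
    # Keep the most 'salient' fraction of a point cloud, scoring each point by
    # the surface variation of its neighborhood (smallest eigenvalue share of
    # the local covariance); edges and corners score high.
    import numpy as np
    from scipy.spatial import cKDTree

    def reduce_cloud(points, radius=0.1, keep_ratio=0.3):
        points = np.asarray(points, dtype=float)
        tree = cKDTree(points)
        importance = np.zeros(len(points))
        for i, p in enumerate(points):
            nbrs = points[tree.query_ball_point(p, radius)]
            if len(nbrs) >= 3:
                eigvals = np.linalg.eigvalsh(np.cov(nbrs.T))  # ascending order
                importance[i] = eigvals[0] / max(eigvals.sum(), 1e-12)
        keep = np.argsort(importance)[-int(len(points) * keep_ratio):]
        return points[keep]
    ```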

  6. Information Integration; The process of integration, evolution and versioning

    NARCIS (Netherlands)

    de Keijzer, Ander; van Keulen, Maurice

    2005-01-01

    At present, many information sources are available wherever you are. Most of the time, the information needed is spread across several of those information sources. Gathering this information is a tedious and time-consuming job. Automating this process would assist the user in this task. Integration

  7. Mutual Information Based Dynamic Integration of Multiple Feature Streams for Robust Real-Time LVCSR

    Science.gov (United States)

    Sato, Shoei; Kobayashi, Akio; Onoe, Kazuo; Homma, Shinichi; Imai, Toru; Takagi, Tohru; Kobayashi, Tetsunori

    We present a novel method of integrating the likelihoods of multiple feature streams, representing different acoustic aspects, for robust speech recognition. The integration algorithm dynamically calculates a frame-wise stream weight so that a higher weight is given to a stream that is robust to a variety of noisy environments or speaking styles; such a robust stream is expected to show discriminative ability. A conventional method proposed for the recognition of spoken digits calculates the weights from the entropy of the whole set of HMM states. This paper extends the dynamic weighting to a real-time large-vocabulary continuous speech recognition (LVCSR) system. The proposed weight is calculated in real time from the mutual information between an input stream and the active HMM states in a search space, without an additional likelihood calculation. Furthermore, the mutual information takes the width of the search space into account by calculating the marginal entropy from the number of active states. In this paper, we integrate three features that are extracted through auditory filters, taking into account the human auditory system's ability to extract amplitude and frequency modulations; features representing energy, amplitude drift, and resonant frequency drift are thereby integrated. These features are expected to provide complementary clues for speech recognition. Speech recognition experiments on field reports and spontaneous commentary from Japanese broadcast news showed that the proposed method reduced word errors by 9.2% in field reports and 4.7% in spontaneous commentaries relative to the best result obtained from a single stream.
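
    The flavor of the frame-wise weighting can be sketched as below, approximating the mutual information between a stream and the K active states by log K minus the entropy of the stream's state posterior; this is a simplified stand-in for the paper's estimator, not its exact formulation.

    ```python
    # Combine per-stream log-likelihoods over the active HMM states with
    # entropy-derived frame weights: low-entropy (discriminative) streams
    # receive higher weight.
    import numpy as np

    def combine_streams(log_likelihoods):
        """log_likelihoods: list of (n_active_states,) arrays, one per stream."""
        weights = []
        for ll in log_likelihoods:
            post = np.exp(ll - ll.max())
            post /= post.sum()
            entropy = -np.sum(post * np.log(post + 1e-12))
            weights.append(np.log(len(post)) - entropy)  # ~ I(stream; states)
        weights = np.asarray(weights)
        weights /= max(weights.sum(), 1e-12)
        # Weighted log-linear combination of the stream likelihoods.
        return sum(w * ll for w, ll in zip(weights, log_likelihoods))
    ```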

  8. Knowledge Dictionary for Information Extraction on the Arabic Text Data

    Directory of Open Access Journals (Sweden)

    Wahyu Jauharis Saputra

    2013-04-01

    Information extraction is an early stage in the analysis of textual data; it is required to obtain information from textual data that can be used in analysis processes such as classification and categorization. Textual data are strongly influenced by language. Arabic is gaining significant attention in many studies because it is very different from other languages, yet, in contrast to other languages, tools and research on Arabic are still lacking. The information extracted using a knowledge dictionary is a concept of expression. A knowledge dictionary is usually constructed manually by an expert, which takes a long time and is specific to a single problem. This paper proposes a method for automatically building a knowledge dictionary. The dictionary is formed by classifying sentences having the same concept, assuming that they will have a high similarity value. The extracted concepts can be used as features for subsequent computational processes such as classification or categorization. The dataset used in this paper was an Arabic text dataset. The extraction results were tested using a decision tree classifier; the highest precision value obtained was 71.0% and the highest recall value was 75.0%.

  9. Integrated information theory of consciousness: an updated account.

    Science.gov (United States)

    Tononi, G

    2012-12-01

    This article presents an updated account of the integrated information theory of consciousness (IIT) and some of its implications. IIT stems from thought experiments that lead to phenomenological axioms (existence, compositionality, information, integration, exclusion) and corresponding ontological postulates. The information axiom asserts that every experience is specific - it is what it is by differing in its particular way from a large repertoire of alternatives. The integration axiom asserts that each experience is unified - it cannot be reduced to independent components. The exclusion axiom asserts that every experience is definite - it is limited to particular things and not others and flows at a particular speed and resolution. IIT formalizes these intuitions with postulates. The information postulate states that only "differences that make a difference" from the intrinsic perspective of a system matter: a mechanism generates cause-effect information if its present state has selective past causes and selective future effects within a system. The integration postulate states that only information that is irreducible matters: mechanisms generate integrated information only to the extent that the information they generate cannot be partitioned into that generated within independent components. The exclusion postulate states that only maxima of integrated information matter: a mechanism specifies only one maximally irreducible set of past causes and future effects - a concept. A complex is a set of elements specifying a maximally irreducible constellation of concepts, where the maximum is evaluated over elements and at the optimal spatiotemporal scale. Its concepts specify a maximally integrated conceptual information structure or quale, which is identical with an experience. Finally, changes in information integration upon exposure to the environment reflect a system's ability to match the causal structure of the world. After introducing an updated definition of

  10. Regional Logistics Information Resources Integration Patterns and Countermeasures

    Science.gov (United States)

    Wu, Hui; Shangguan, Xu-ming

    Effective integration of regional logistics information resources can provide collaborative services in information flow, business flow and logistics for regional logistics enterprises, and can also reduce operating costs and improve market responsiveness. First, this paper analyzes the practical significance of integrating regional logistics information. Second, it puts forward three feasible patterns for the integration of regional logistics information resources; these three models have their own strengths and scopes of application and implementation, and which model is selected will depend on the specific business and the regional distribution of enterprises. Last, the paper discusses the related countermeasures for the integration of regional logistics information resources: because this integration is a systems engineering effort, the countermeasures should pay close attention to the current needs and long-term development of regional enterprises as the integration advances.

  11. Integrative analysis of gene expression and DNA methylation using unsupervised feature extraction for detecting candidate cancer biomarkers.

    Science.gov (United States)

    Moon, Myungjin; Nakai, Kenta

    2018-04-01

    Currently, cancer biomarker discovery is one of the important research topics worldwide. In particular, detecting significant genes related to cancer is an important task for early diagnosis and treatment. Conventional studies mostly focus on genes that are differentially expressed in different states of cancer; however, noise in gene expression datasets and insufficient information in limited datasets impede the precise analysis of novel candidate biomarkers. In this study, we propose an integrative analysis of gene expression and DNA methylation using normalization and unsupervised feature extraction to identify candidate cancer biomarkers in renal cell carcinoma RNA-seq datasets. The gene expression and DNA methylation datasets are normalized by the Box-Cox transformation and integrated into a one-dimensional dataset that retains the major characteristics of the original datasets through unsupervised feature extraction, and differentially expressed genes are selected from the integrated dataset. Use of the integrated dataset demonstrated improved performance compared with conventional approaches that utilize gene expression or DNA methylation datasets alone. Validation based on the literature showed that a considerable number of top-ranked genes from the integrated dataset have known relationships with cancer, implying that novel candidate biomarkers can also be acquired from the proposed analysis method. Furthermore, we expect that the proposed method can be extended to applications involving various types of multi-omics datasets.
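
    As a sketch of the normalization-and-integration step, with PCA standing in generically for the unsupervised feature extraction used in the study:

    ```python
    # Box-Cox-normalize two omics matrices and compress each gene's combined
    # profile into one dimension with an unsupervised projection.
    import numpy as np
    from scipy.stats import boxcox
    from sklearn.decomposition import PCA

    def integrate_omics(expr, meth):
        """expr, meth: (n_genes, n_samples) arrays of positive values."""
        normalized = []
        for mat in (expr, meth):
            cols = [boxcox(mat[:, j] + 1e-9)[0] for j in range(mat.shape[1])]
            normalized.append(np.column_stack(cols))
        combined = np.hstack(normalized)          # genes x (2 * n_samples)
        # One-dimensional unsupervised projection retaining major variation.
        return PCA(n_components=1).fit_transform(combined).ravel()
    ```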

  12. Extraction of polycyclic aromatic hydrocarbons from smoked fish using pressurized liquid extraction with integrated fat removal

    DEFF Research Database (Denmark)

    Lund, Mette; Duedahl-Olesen, Lene; Christensen, Jan H.

    2009-01-01

    Quantification of polycyclic aromatic hydrocarbons (PAHs) in smoked fish products often requires multiple clean-up steps to remove fat and other compounds that may interfere with the chemical analysis. We present a novel pressurized liquid extraction (PLE) method that integrates exhaustive...

  13. Research on Crowdsourcing Emergency Information Extraction Based on Events' Frame

    Science.gov (United States)

    Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi

    2018-01-01

    At present, common information extraction methods cannot accurately extract structured emergency event information, general information retrieval tools cannot completely identify emergency geographic information, and neither approach provides an accurate assessment of the extracted results. This paper therefore proposes an emergency information collection technology based on an event framework, aimed at solving the problem of emergency information extraction. It mainly includes an emergency information extraction model (EIEM), a complete address recognition method (CARM) and an accuracy evaluation model of emergency information (AEMEI). EIEM extracts emergency information in a structured form and compensates for the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and the shortest path algorithm, allowing toponym pieces to be joined into a full address. AEMEI analyzes the results for an emergency event and summarizes the advantages and disadvantages of the event framework. Experiments show that the event framework technology can solve the problem of emergency information extraction and provides reference cases for other applications. When an emergency disaster is about to occur, the relevant departments can query data on emergencies that occurred in the past and make arrangements in advance for defense and disaster reduction, decreasing casualties and property damage nationally and worldwide. This is of great significance to the state and society.

  14. A rapid extraction of landslide disaster information research based on GF-1 image

    Science.gov (United States)

    Wang, Sai; Xu, Suning; Peng, Ling; Wang, Zhiyi; Wang, Na

    2015-08-01

    In recent years, landslide disasters have occurred frequently because of seismic activity, bringing great harm to people's lives and drawing close attention from the state and extensive concern from society. In the field of geological disasters, landslide information extraction based on remote sensing has been controversial, but high resolution remote sensing images can effectively improve the accuracy of information extraction with their rich texture and geometry information. It is therefore feasible to extract information on earthquake-triggered landslides with serious surface damage and large scale. Taking Wenchuan county as the study area, this paper uses a multi-scale segmentation method to extract landslide image objects from domestic GF-1 images and DEM data, using the estimation of scale parameter tool to determine the optimal segmentation scale. After comprehensively analyzing the characteristics of landslides in high-resolution images and selecting the spectral, textural, geometric and landform features of the image, extraction rules are established to extract landslide disaster information. The extraction results show 20 landslides with a total area of 521279.31. Compared with visual interpretation results, the extraction accuracy is 72.22%. This study indicates that it is efficient and feasible to extract earthquake landslide disaster information based on high resolution remote sensing, providing important technical support for post-disaster emergency investigation and disaster assessment.

  15. Linking attentional processes and conceptual problem solving: visual cues facilitate the automaticity of extracting relevant information from diagrams.

    Science.gov (United States)

    Rouinfar, Amy; Agra, Elise; Larson, Adam M; Rebello, N Sanjay; Loschky, Lester C

    2014-01-01

    This study investigated links between visual attention processes and conceptual problem solving. This was done by overlaying visual cues on conceptual physics problem diagrams to direct participants' attention to relevant areas to facilitate problem solving. Participants (N = 80) individually worked through four problem sets, each containing a diagram, while their eye movements were recorded. Each diagram contained regions that were relevant to solving the problem correctly and separate regions related to common incorrect responses. Problem sets contained an initial problem, six isomorphic training problems, and a transfer problem. The cued condition saw visual cues overlaid on the training problems. Participants' verbal responses were used to determine their accuracy. This study produced two major findings. First, short duration visual cues which draw attention to solution-relevant information and aid in the organizing and integrating of it, facilitate both immediate problem solving and generalization of that ability to new problems. Thus, visual cues can facilitate re-representing a problem and overcoming impasse, enabling a correct solution. Importantly, these cueing effects on problem solving did not involve the solvers' attention necessarily embodying the solution to the problem, but were instead caused by solvers attending to and integrating relevant information in the problems into a solution path. Second, this study demonstrates that when such cues are used across multiple problems, solvers can automatize the extraction of problem-relevant information extraction. These results suggest that low-level attentional selection processes provide a necessary gateway for relevant information to be used in problem solving, but are generally not sufficient for correct problem solving. Instead, factors that lead a solver to an impasse and to organize and integrate problem information also greatly facilitate arriving at correct solutions.

  16. Integrated Information Management (IIM)

    National Research Council Canada - National Science Library

    McIlvain, Jason

    2007-01-01

    Information Technology is the core capability required to align our resources and increase our effectiveness on the battlefield by integrating and coordinating our preventative measures and responses...

  17. Information Extraction with Character-level Neural Networks and Free Noisy Supervision

    OpenAIRE

    Meerkamp, Philipp; Zhou, Zhengyi

    2016-01-01

    We present an architecture for information extraction from text that augments an existing parser with a character-level neural network. The network is trained using a measure of consistency of extracted data with existing databases as a form of noisy supervision. Our architecture combines the ability of constraint-based information extraction systems to easily incorporate domain knowledge and constraints with the ability of deep neural networks to leverage large amounts of data to learn compl...

  18. Unsupervised information extraction by text segmentation

    CERN Document Server

    Cortez, Eli

    2013-01-01

    A new unsupervised approach to the problem of Information Extraction by Text Segmentation (IETS) is proposed, implemented and evaluated herein. The authors' approach relies on information available on pre-existing data to learn how to associate segments in the input string with attributes of a given domain relying on a very effective set of content-based features. The effectiveness of the content-based features is also exploited to directly learn from test data structure-based features, with no previous human-driven training, a feature unique to the presented approach. Based on the approach, a

  19. Integrated Reporting Information System -

    Data.gov (United States)

    Department of Transportation — The Integrated Reporting Information System (IRIS) is a flexible and scalable web-based system that supports post operational analysis and evaluation of the National...

  20. Integration of Information Technologies in Enterprise Application Development

    Directory of Open Access Journals (Sweden)

    Iulia SURUGIU

    2012-05-01

    Healthcare enterprises are disconnected. In the era of integrated information systems and the Internet explosion, the necessity of information systems integration derives from business process evolution, on the one hand, and from information technology tendencies, on the other. In order to become more efficient and adaptive to change, healthcare organizations are tremendously preoccupied with business process automation, flexibility and complexity. The need for information systems integration arises from these goals, explaining, at the same time, the special interest in EAI. Extensible software integration architectures and the business orientation of process modeling and information systems functionalities, together with open connectivity, accessibility and virtualization, lead to the most suitable integration solutions: SOA and BPM architectural styles in a cloud computing environment.

  1. Extraction of CT dose information from DICOM metadata: automated Matlab-based approach.

    Science.gov (United States)

    Dave, Jaydev K; Gingold, Eric L

    2013-01-01

    The purpose of this study was to extract exposure parameters and dose-relevant indexes of CT examinations from information embedded in DICOM metadata. DICOM dose report files were identified and retrieved from a PACS. An automated software program was used to extract, from these files, the exposure-relevant information held in the structured elements of the DICOM metadata. Extracting information from DICOM metadata eliminated potential errors inherent in techniques based on optical character recognition, yielding 100% accuracy.
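
    A comparable extraction can be sketched in Python with pydicom in place of the Matlab tooling used in the study; the file name is hypothetical, and real radiation dose structured reports nest their content items, hence the recursive walk.

    ```python
    # Collect named numeric values (e.g. dose indexes) from a DICOM dose
    # structured report by walking its nested content sequences.
    import pydicom

    def walk_content(items, found):
        for item in items:
            name = item.ConceptNameCodeSequence[0].CodeMeaning
            if "MeasuredValueSequence" in item:
                found[name] = float(item.MeasuredValueSequence[0].NumericValue)
            if "ContentSequence" in item:
                walk_content(item.ContentSequence, found)

    ds = pydicom.dcmread("dose_report.dcm")   # hypothetical RDSR file
    values = {}
    walk_content(ds.ContentSequence, values)
    for name, value in values.items():
        print(f"{name}: {value}")
    ```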

  2. Integration of Information Technologies in Enterprise Application Development

    OpenAIRE

    Iulia SURUGIU

    2012-01-01

    Healthcare enterprises are disconnected. In the era of integrated information systems and the Internet explosion, the necessity of information systems integration derives from business process evolution, on the one hand, and from information technology tendencies, on the other. In order to become more efficient and adaptive to change, healthcare organizations are tremendously preoccupied with business process automation, flexibility and complexity. The need for information systems integration ar...

  3. MEASURING INFORMATION INTEGRATION MODEL FOR CAD/CMM

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    A CAD/CMM workpiece modeling system based on the IGES file format is proposed. The modeling system is implemented using a new method for labelling the tolerance items of a 3D workpiece; the concept of a "feature face" is used in the method. First, the CAD data of the workpiece are extracted and recognized automatically. Then a workpiece model is generated, which is the integration of the pure 3D geometric form with its corresponding inspection items. The principle of workpiece modeling is also presented. Finally, the experimental results are shown and the correctness of the model is certified.

  4. Advanced applications of natural language processing for performing information extraction

    CERN Document Server

    Rodrigues, Mário

    2015-01-01

    This book explains how to create information extraction (IE) applications that are able to tap the vast amount of relevant information available in natural language sources: Internet pages, official documents such as laws and regulations, books and newspapers, and the social web. Readers are introduced to the problem of IE and its current challenges and limitations, supported with examples. The book discusses the need to fill the gap between documents, data, and people, and provides a broad overview of the technology supporting IE. The authors present a generic architecture for developing systems that are able to learn how to extract relevant information from natural language documents, and illustrate how to implement working systems using state-of-the-art and freely available software tools. The book also discusses concrete applications illustrating IE uses.   ·         Provides an overview of state-of-the-art technology in information extraction (IE), discussing achievements and limitations for t...

  5. Post-processing of Deep Web Information Extraction Based on Domain Ontology

    Directory of Open Access Journals (Sweden)

    PENG, T.

    2013-11-01

    Many methods are utilized to extract and process query results in the deep Web, relying on the different structures of Web pages and the various design modes of databases. However, some semantic meanings and relations are ignored. In this paper, we present an approach for post-processing deep Web query results based on domain ontology, which can utilize these semantic meanings and relations. A block identification model (BIM) based on node similarity is defined to extract data blocks that are relevant to a specific domain after reducing noisy nodes. The feature vector of domain books is obtained by a result set extraction model (RSEM) based on the vector space model (VSM). RSEM, in combination with BIM, builds the domain ontology on books, which not only removes the limitation of Web page structures when extracting data information, but also makes use of the semantic meanings of the domain ontology. After extracting basic information from Web pages, a ranking algorithm is adopted to offer an ordered list of data records to users. Experimental results show that BIM and RSEM extract data blocks and build the domain ontology accurately. In addition, relevant data records and basic information are extracted and ranked. The precision and recall achieved show that our proposed method is feasible and efficient.

  6. An Effective Approach to Biomedical Information Extraction with Limited Training Data

    Science.gov (United States)

    Jonnalagadda, Siddhartha

    2011-01-01

    In the current millennium, extensive use of computers and the internet caused an exponential increase in information. Few research areas are as important as information extraction, which primarily involves extracting concepts and the relations between them from free text. Limitations in the size of training data, lack of lexicons and lack of…

  7. A Statistical Texture Feature for Building Collapse Information Extraction of SAR Image

    Science.gov (United States)

    Li, L.; Yang, H.; Chen, Q.; Liu, X.

    2018-04-01

    Synthetic Aperture Radar (SAR) has become one of the most important ways to extract post-disaster collapsed building information, due to its extreme versatility and almost all-weather, day-and-night working capability. In view of the fact that the inherent statistical distribution of speckle in SAR images has not been used to extract collapsed building information, this paper proposes a novel texture feature based on statistical models of SAR images to extract collapsed buildings. In the proposed feature, the texture parameter of the G0 distribution of SAR images is used to reflect the uniformity of the target and thereby extract collapsed buildings. This feature not only considers the statistical distribution of SAR images, providing a more accurate description of object texture, but can also be applied to extract collapsed building information from single-, dual- or full-polarization SAR data. RADARSAT-2 data of the Yushu earthquake acquired on April 21, 2010 are used to present and analyze the performance of the proposed method. In addition, the applicability of this feature to SAR data with different polarizations is also analysed, which provides decision support for data selection in collapsed building information extraction.

  8. DKIE: Open Source Information Extraction for Danish

    DEFF Research Database (Denmark)

    Derczynski, Leon; Field, Camilla Vilhelmsen; Bøgh, Kenneth Sejdenfaden

    2014-01-01

    Danish is a major Scandinavian language spoken daily by around six million people. However, it lacks a unified, open set of NLP tools. This demonstration will introduce DKIE, an extensible open-source toolkit for processing Danish text. We implement an information extraction architecture for Danish...

  9. Transliteration normalization for Information Extraction and Machine Translation

    Directory of Open Access Journals (Sweden)

    Yuval Marton

    2014-12-01

    Foreign name transliterations typically include multiple spelling variants. These variants cause data sparseness and inconsistency problems, increase the Out-of-Vocabulary (OOV) rate, and present challenges for Machine Translation, Information Extraction and other natural language processing (NLP) tasks. This work aims to identify and cluster name spelling variants using a Statistical Machine Translation method: word alignment. The variants are identified by being aligned to the same "pivot" name in another language (the source language in Machine Translation settings). Based on word-to-word translation and transliteration probabilities, as well as the string edit distance metric, names with similar spellings in the target language are clustered and then normalized to a canonical form. With this approach, tens of thousands of high-precision name transliteration spelling variants are extracted from sentence-aligned bilingual corpora in Arabic and English (in both languages). When these normalized name spelling variants are applied to Information Extraction tasks, improvements over strong baseline systems are observed. When applied to Machine Translation tasks, a large improvement potential is shown.
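
    A minimal sketch of the normalization step: cluster spelling variants by string similarity and map each variant to its cluster's most frequent form. difflib's ratio stands in for the combined translation-probability and edit-distance scores used in the paper, and the counts are invented.

    ```python
    # Greedy similarity clustering of name variants; the canonical form of a
    # cluster is its most frequent member.
    from difflib import SequenceMatcher

    def normalize_variants(name_counts, threshold=0.7):
        clusters = []                     # each cluster: list of (name, count)
        for name, count in name_counts.items():
            for cluster in clusters:
                if SequenceMatcher(None, name, cluster[0][0]).ratio() >= threshold:
                    cluster.append((name, count))
                    break
            else:
                clusters.append([(name, count)])
        return {n: max(c, key=lambda x: x[1])[0] for c in clusters for n, _ in c}

    print(normalize_variants({"Gadhafi": 20, "Qaddafi": 35, "Gaddafi": 50, "Omar": 9}))
    ```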

  10. Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory

    Science.gov (United States)

    Kitazono, Jun; Kanai, Ryota; Oizumi, Masafumi

    2018-03-01

    The ability to integrate information in the brain is considered to be an essential property for cognition and consciousness. Integrated Information Theory (IIT) hypothesizes that the amount of integrated information ($\Phi$) in the brain is related to the level of consciousness. IIT proposes that to quantify information integration in a system as a whole, integrated information should be measured across the partition of the system at which information loss caused by partitioning is minimized, called the Minimum Information Partition (MIP). The computational cost of exhaustively searching for the MIP grows exponentially with system size, making it difficult to apply IIT to real neural data. It has been previously shown that if a measure of $\Phi$ satisfies a mathematical property, submodularity, the MIP can be found in polynomial time by an optimization algorithm. However, although the first version of $\Phi$ is submodular, the later versions are not. In this study, we empirically explore to what extent the algorithm can be applied to the non-submodular measures of $\Phi$ by evaluating the accuracy of the algorithm on simulated data and real neural data. We find that the algorithm identifies the MIP in a nearly perfect manner even for the non-submodular measures. Our results show that the algorithm allows us to measure $\Phi$ in large systems within a practical amount of time.
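
    For small systems, the exhaustive search that such algorithms avoid can be written directly: enumerate every bipartition and keep the one with minimal information loss. The `phi` callable below is a placeholder for any concrete measure of integrated information, not IIT's actual definition.

    ```python
    # Brute-force Minimum Information Partition over bipartitions; exponential
    # in system size, which is exactly why polynomial-time algorithms matter.
    from itertools import combinations

    def minimum_information_partition(elements, phi):
        """phi(part_a, part_b) -> information lost by cutting between the parts."""
        elements = list(elements)
        best_loss, best_partition = float("inf"), None
        for size in range(1, len(elements) // 2 + 1):
            for part_a in combinations(elements, size):
                part_b = tuple(e for e in elements if e not in part_a)
                loss = phi(part_a, part_b)
                if loss < best_loss:
                    best_loss, best_partition = loss, (part_a, part_b)
        return best_loss, best_partition

    # Toy example: loss = number of 'connections' severed in a 4-cycle.
    links = {(0, 1), (1, 2), (2, 3), (3, 0)}
    phi = lambda a, b: sum((i, j) in links or (j, i) in links for i in a for j in b)
    print(minimum_information_partition(range(4), phi))
    ```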

  11. End-to-end information extraction without token-level supervision

    DEFF Research Database (Denmark)

    Palm, Rasmus Berg; Hovy, Dirk; Laws, Florian

    2017-01-01

    Most state-of-the-art information extraction approaches rely on token-level labels to find the areas of interest in text. Unfortunately, these labels are time-consuming and costly to create, and consequently, not available for many real-life IE tasks. To make matters worse, token-level labels...... and output text. We evaluate our model on the ATIS data set, MIT restaurant corpus and the MIT movie corpus and compare to neural baselines that do use token-level labels. We achieve competitive results, within a few percentage points of the baselines, showing the feasibility of E2E information extraction...

  12. From remote sensing data about information extraction for 3D geovisualization - Development of a workflow

    International Nuclear Information System (INIS)

    Tiede, D.

    2010-01-01

    With the increased availability of high (spatial) resolution remote sensing imagery since the late nineties, the need to develop operative workflows for the automated extraction, provision and communication of information from such data has grown. Monitoring requirements aimed at the implementation of environmental or conservation targets, the management of (environmental) resources, and regional planning, as well as international initiatives, especially the joint initiative of the European Commission and ESA (European Space Agency) for Global Monitoring for Environment and Security (GMES), also play a major part. This thesis addresses the development of an integrated workflow for the automated provision of information derived from remote sensing data. Considering the applied data and fields of application, this work aims to design the workflow as generically as possible. The following research questions are discussed: What are the requirements of a workflow architecture that seamlessly links the individual workflow elements in a timely manner and effectively secures the accuracy of the extracted information? How can the workflow retain its efficiency if mounds of data are processed? How can the workflow be improved with regard to automated object-based image analysis (OBIA)? Which recent developments could be of use? What are the limitations, or which workarounds could be applied, in order to generate relevant results? How can relevant information be prepared in a target-oriented way and communicated effectively? How can the more recently developed, freely available virtual globes be used for the delivery of conditioned information, under consideration of the third dimension as an additional, explicit carrier of information? Based on case studies comprising different data sets and fields of application, it is demonstrated how methods to extract and process information, as well as to effectively communicate results, can be improved and successfully combined within one workflow. It is shown that (1

  13. SpecOp: Optimal Extraction Software for Integral Field Unit Spectrographs

    Science.gov (United States)

    McCarron, Adam; Ciardullo, Robin; Eracleous, Michael

    2018-01-01

    The Hobby-Eberly Telescope’s new low resolution integral field spectrographs, LRS2-B and LRS2-R, each cover a 12”x6” area on the sky with 280 fibers and generate spectra with resolutions between R=1100 and R=1900. To extract 1-D spectra from the instrument’s 3D data cubes, a program is needed that is flexible enough to work for a wide variety of targets, including continuum point sources, emission line sources, and compact sources embedded in complex backgrounds. We therefore introduce SpecOp, a user-friendly python program for optimally extracting spectra from integral-field unit spectrographs. As input, SpecOp takes a sky-subtracted data cube consisting of images at each wavelength increment set by the instrument’s spectral resolution, and an error file for each count measurement. All of these files are generated by the current LRS2 reduction pipeline. The program then collapses the cube in the image plane using the optimal extraction algorithm detailed by Keith Horne (1986). The various user-selected options include the fraction of the total signal enclosed in a contour-defined region, the wavelength range to analyze, and the precision of the spatial profile calculation. SpecOp can output the weighted counts and errors at each wavelength in various table formats using python’s astropy package. We outline the algorithm used for extraction and explain how the software can be used to easily obtain high-quality 1-D spectra. We demonstrate the utility of the program by applying it to spectra of a variety of quasars and AGNs. In some of these targets, we extract the spectrum of a nuclear point source that is superposed on a spatially extended galaxy.
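
    The core of the Horne (1986) weighting that SpecOp applies can be sketched per wavelength slice as follows, assuming a known, normalized spatial profile and a per-pixel variance cube (both of which the reduction pipeline supplies in some form).

    ```python
    # Optimal extraction: weight each spatial pixel by profile/variance, which
    # maximizes S/N for a known profile (Horne 1986).
    import numpy as np

    def optimal_extract(cube, var, profile):
        """cube, var: (n_wave, n_y, n_x); profile: (n_y, n_x), sums to 1."""
        p = profile[None, :, :]
        num = np.sum(p * cube / var, axis=(1, 2))
        den = np.sum(p * p / var, axis=(1, 2))
        return num / den, 1.0 / den   # flux and its variance per wavelength
    ```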

  14. Mining knowledge from text repositories using information extraction ...

    Indian Academy of Sciences (India)

    Information extraction (IE); text mining; text repositories; knowledge discovery from text … general purpose English words. However … of precision and recall, as extensive experimentation is required due to lack of public tagged corpora.

  15. Autonomous Preference-Aware Information Services Integration for High Response in Integrated Faded Information Field Systems

    Science.gov (United States)

    Lu, Xiaodong; Mori, Kinji

    The market and users' requirements have been rapidly changing and diversifying. Under these heterogeneous and dynamic conditions, not only the system structure itself but also the accessible information services change constantly. To cope with the continuously changing conditions of service provision and utilization, the Faded Information Field (FIF) has been proposed, an agent-based distributed information service system architecture. In the case of mono-service requests, the system is designed to improve users' access time and preserve load balancing through the information structure. However, as interdependent multi-service requests increase, adaptability and timeliness have to be assured by the system. In this paper, the relationships that exist among correlated services and users' preferences for separate and integrated services are clarified. Based on these factors, an autonomous preference-aware information service integration technology is proposed to provide one-stop service for users' multi-service requests. Compared to the conventional system, we show that the proposed technology is able to reduce the total access time.

  16. Towards an information extraction and knowledge formation framework based on Shannon entropy

    Directory of Open Access Journals (Sweden)

    Iliescu Dragoș

    2017-01-01

    The subject of information quantity is approached in this paper, considering the specific domain of nonconforming product management as the information source. This work represents a case study: raw data were gathered from a heavy industrial works company, and information extraction and knowledge formation are considered herein. The method used for estimating information quantity is based on the Shannon entropy formula. The information and entropy spectra are decomposed and analysed to extract specific information and form the corresponding knowledge. The results of the entropy analysis point out the information that needs to be acquired by the organisation involved, presented as a specific knowledge type.
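
    A minimal sketch of the underlying estimate, assuming the information source is a log of nonconformity categories: the Shannon entropy H = -Σ p_i log2 p_i of the category frequencies gives the information quantity in bits per observation. The category names are invented.

    ```python
    # Shannon entropy of an observed category distribution.
    from collections import Counter
    from math import log2

    def shannon_entropy(observations):
        counts = Counter(observations)
        total = sum(counts.values())
        return -sum((c / total) * log2(c / total) for c in counts.values())

    defects = ["scratch", "dent", "scratch", "porosity", "scratch", "dent"]
    print(f"{shannon_entropy(defects):.3f} bits per observation")
    ```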

  17. Tagline: Information Extraction for Semi-Structured Text Elements in Medical Progress Notes

    Science.gov (United States)

    Finch, Dezon Kile

    2012-01-01

    Text analysis has become an important research activity in the Department of Veterans Affairs (VA). Statistical text mining and natural language processing have been shown to be very effective for extracting useful information from medical documents. However, neither of these techniques is effective at extracting the information stored in…

  18. Mars Target Encyclopedia: Information Extraction for Planetary Science

    Science.gov (United States)

    Wagstaff, K. L.; Francis, R.; Gowda, T.; Lu, Y.; Riloff, E.; Singh, K.

    2017-06-01

    Mars surface targets / and published compositions / Seek and ye will find. We used text mining methods to extract information from LPSC abstracts about the composition of Mars surface targets. Users can search by element, mineral, or target.

  19. Environment, safety, and health information technology systems integration.

    Energy Technology Data Exchange (ETDEWEB)

    Hendrickson, David A.; Bayer, Gregory W.

    2006-02-01

    The ES&H Information Systems department, motivated by the numerous isolated information technology systems under its control, undertook a significant integration effort. This effort was planned and executed over the course of several years and parts of it still continue today. The effect was to help move the ES&H Information Systems department toward integration with the corporate Information Solutions and Services center.

  20. Optimum detection for extracting maximum information from symmetric qubit sets

    International Nuclear Information System (INIS)

    Mizuno, Jun; Fujiwara, Mikio; Sasaki, Masahide; Akiba, Makoto; Kawanishi, Tetsuya; Barnett, Stephen M.

    2002-01-01

    We demonstrate a class of optimum detection strategies for extracting the maximum information from sets of equiprobable real symmetric qubit states of a single photon. These optimum strategies have been predicted by Sasaki et al. [Phys. Rev. A 59, 3325 (1999)]. The peculiar aspect is that the detections with at least three outputs suffice for optimum extraction of information regardless of the number of signal elements. The cases of ternary (or trine), quinary, and septenary polarization signals are studied where a standard von Neumann detection (a projection onto a binary orthogonal basis) fails to access the maximum information. Our experiments demonstrate that it is possible with present technologies to attain about 96% of the theoretical limit

  1. Study on methods and techniques of aeroradiometric weak information extraction for sandstone-hosted uranium deposits based on GIS

    International Nuclear Information System (INIS)

    Han Shaoyang; Ke Dan; Hou Huiqun

    2005-01-01

    Weak information extraction is one of the important research topics in current sandstone-type uranium prospecting in China. This paper introduces the connotation of aeroradiometric weak information extraction, discusses the formation theories of aeroradiometric weak information, and establishes some effective mathematical models for weak information extraction. The models are implemented on a GIS software platform, and application tests of weak information extraction are completed in known uranium mineralized areas. Research results prove that the prospective areas of sandstone-type uranium deposits can be rapidly delineated by extracting aeroradiometric weak information. (authors)

  2. Information Integration Technology Demonstration (IITD)

    National Research Council Canada - National Science Library

    Loe, Richard

    2001-01-01

    The objectives of the Information Integration Technology Demonstration (IITD) were to investigate, design a software architecture and demonstrate a capability to display intelligence data from multiple disciplines...

  3. Information extraction from muon radiography data

    International Nuclear Information System (INIS)

    Borozdin, K.N.; Asaki, T.J.; Chartrand, R.; Hengartner, N.W.; Hogan, G.E.; Morris, C.L.; Priedhorsky, W.C.; Schirato, R.C.; Schultz, L.J.; Sottile, M.J.; Vixie, K.R.; Wohlberg, B.E.; Blanpied, G.

    2004-01-01

    Scattering muon radiography was proposed recently as a technique for the detection and 3-d imaging of dense high-Z objects. High-energy cosmic ray muons are deflected in matter in the process of multiple Coulomb scattering. By measuring the deflection angles we are able to reconstruct the configuration of high-Z material in the object. We discuss methods for information extraction from muon radiography data. Tomographic methods widely used in medical imaging have been applied to this specific muon radiography information source. An alternative simple technique, based on counting highly scattered muons in voxels, seems to be efficient in many simulated scenes. SVM-based classifiers and clustering algorithms may allow detection of compact high-Z objects without full image reconstruction. The efficiency of muon radiography can be increased using additional information sources, such as momentum estimation, stopping power measurement, and detection of muonic atom emission.
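
    The simple counting technique can be sketched as follows, reducing each muon track to a single point of closest approach (an assumed simplification) and accumulating, per voxel, the number of muons scattered beyond a threshold angle; dense high-Z regions then appear as hot voxels.

    ```python
    # Voxel counts of strongly scattered muons over a regular 3-D grid.
    import numpy as np

    def hot_voxels(poca_points, scatter_angles, grid_shape, bounds, thresh=0.02):
        """poca_points: (n, 3) points of closest approach; angles in radians."""
        selected = poca_points[scatter_angles > thresh]
        counts, _ = np.histogramdd(selected, bins=grid_shape, range=bounds)
        return counts

    rng = np.random.default_rng(0)
    pts = rng.uniform(0.0, 1.0, size=(10000, 3))        # synthetic tracks
    angles = rng.exponential(0.01, size=10000)          # synthetic scattering
    print(hot_voxels(pts, angles, (10, 10, 10), [(0.0, 1.0)] * 3).max())
    ```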

  4. Integrated Risk Information System (IRIS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — EPA?s Integrated Risk Information System (IRIS) is a compilation of electronic reports on specific substances found in the environment and their potential to cause...

  5. Definition of information technology architectures for continuous data management and medical device integration in diabetes.

    Science.gov (United States)

    Hernando, M Elena; Pascual, Mario; Salvador, Carlos H; García-Sáez, Gema; Rodríguez-Herrero, Agustín; Martínez-Sarriegui, Iñaki; Gómez, Enrique J

    2008-09-01

    The growing availability of continuous data from medical devices in diabetes management makes it crucial to define novel information technology architectures for efficient data storage, data transmission, and data visualization. The new paradigm of care demands the sharing of information in interoperable systems as the only way to support patient care in a continuum of care scenario. The technological platforms should support all the services required by the actors involved in the care process, located in different scenarios and managing diverse information for different purposes. This article presents basic criteria for defining flexible and adaptive architectures that are capable of interoperating with external systems, and integrating medical devices and decision support tools to extract all the relevant knowledge to support diabetes care.

  6. Recognition techniques for extracting information from semistructured documents

    Science.gov (United States)

    Della Ventura, Anna; Gagliardi, Isabella; Zonta, Bruna

    2000-12-01

    Archives of optical documents are employed more and more massively, the demand driven also by new norms sanctioning the legal value of digital documents, provided they are stored on supports that are physically unalterable. On the supply side there is now a vast and technologically advanced market, where optical memories have solved the problem of the duration and permanence of data at costs comparable to those of magnetic memories. The remaining bottleneck in these systems is the indexing. The indexing of documents with a variable structure, while still not completely automated, can be machine-supported to a large degree, with evident advantages both in the organization of the work and in extracting information, providing data that is much more detailed and potentially significant for the user. We present here a system for the automatic registration of correspondence to and from a public office. The system is based on a general methodology for the extraction, indexing, archiving, and retrieval of significant information from semi-structured documents. This information, in our prototype application, is distributed among the database fields of sender, addressee, subject, date, and body of the document.
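
    The field-oriented indexing described above can be pictured with a few regular expressions. This is only a hedged sketch, assuming English header labels ("From", "To", "Subject", "Date") rather than the labels actually recognized by the prototype:

        import re

        FIELD_PATTERNS = {
            "sender":    re.compile(r"^From:\s*(.+)$", re.MULTILINE),
            "addressee": re.compile(r"^To:\s*(.+)$", re.MULTILINE),
            "subject":   re.compile(r"^Subject:\s*(.+)$", re.MULTILINE),
            "date":      re.compile(r"^Date:\s*(\d{1,2}[./-]\d{1,2}[./-]\d{2,4})",
                                    re.MULTILINE),
        }

        def index_document(text):
            """Map a semi-structured letter onto the database fields named above."""
            record = {name: (m.group(1).strip() if (m := pat.search(text)) else None)
                      for name, pat in FIELD_PATTERNS.items()}
            record["body"] = text.split("\n\n", 1)[-1]   # everything after the header
            return record

        letter = "From: Mayor's Office\nTo: Registry\nSubject: Permit\nDate: 12/05/2000\n\nDear Sir..."
        print(index_document(letter))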

  7. Integration issues of information engineering based I-CASE tools

    OpenAIRE

    Kurbel, Karl; Schnieder, Thomas

    1994-01-01

    Problems and requirements regarding integration of methods and tools across phases of the software-development life cycle are discussed. Information engineering (IE) methodology and I-CASE (integrated CASE) tools supporting IE claim to have an integrated view across major stages of enterprise-wide information-system development: information strategy planning, business area analysis, system design, and construction. In the main part of this paper, two comprehensive I-CASE tools, ADW (Applicati...

  8. Knowledge and information management for integrated water resource management

    Science.gov (United States)

    Watershed information systems that integrate data and analytical tools are critical enabling technologies to support Integrated Water Resource Management (IWRM) by converting data into information, and information into knowledge. Many factors bring people to the table to participate in an IWRM fra...

  9. Note: Fully integrated 3.2 Gbps quantum random number generator with real-time extraction

    International Nuclear Information System (INIS)

    Zhang, Xiao-Guang; Nie, You-Qi; Liang, Hao; Zhang, Jun; Pan, Jian-Wei; Zhou, Hongyi; Ma, Xiongfeng

    2016-01-01

    We present a real-time and fully integrated quantum random number generator (QRNG) by measuring laser phase fluctuations. The QRNG scheme based on laser phase fluctuations is notable for its capability of generating ultra-high-speed random numbers. However, the speed bottleneck of a practical QRNG lies in the limited speed of randomness extraction. To close the gap between the fast randomness generation and the slow post-processing, we propose a pipeline extraction algorithm based on Toeplitz matrix hashing and implement it in a high-speed field-programmable gate array. Further, all the QRNG components are integrated into a module, including a compact and actively stabilized interferometer, high-speed data acquisition, and real-time data post-processing and transmission. The final generation rate of the QRNG module with real-time extraction can reach 3.2 Gbps.
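
    Toeplitz-matrix hashing itself is compact enough to sketch in software. The following Python fragment is a plain matrix-vector version, not the pipelined FPGA implementation described above; it compresses n raw bits into m nearly uniform bits using a random seed of m + n - 1 bits:

        import numpy as np
        from scipy.linalg import toeplitz

        def toeplitz_extract(raw_bits, out_len, seed_bits):
            """Hash n raw bits down to out_len bits with a binary Toeplitz matrix."""
            n, m = len(raw_bits), out_len
            assert len(seed_bits) == m + n - 1
            first_col = seed_bits[:m]
            first_row = np.concatenate((seed_bits[:1], seed_bits[m:]))  # shares T[0, 0]
            T = toeplitz(first_col, first_row)           # m x n binary Toeplitz matrix
            return (T @ raw_bits) % 2                    # matrix product over GF(2)

        rng = np.random.default_rng(1)
        raw = rng.integers(0, 2, size=1024)              # raw, partially random samples
        seed = rng.integers(0, 2, size=256 + 1024 - 1)
        print(toeplitz_extract(raw, 256, seed)[:16])

    The output length m is chosen from the estimated min-entropy of the raw samples; the pipeline trick in the paper amounts to applying such multiplications block by block so that extraction keeps up with acquisition.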

  10. Note: Fully integrated 3.2 Gbps quantum random number generator with real-time extraction

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xiao-Guang; Nie, You-Qi; Liang, Hao; Zhang, Jun, E-mail: zhangjun@ustc.edu.cn; Pan, Jian-Wei [Hefei National Laboratory for Physical Sciences at the Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei, Anhui 230026 (China); CAS Center for Excellence and Synergetic Innovation Center in Quantum Information and Quantum Physics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Zhou, Hongyi; Ma, Xiongfeng [Center for Quantum Information, Institute for Interdisciplinary Information Sciences, Tsinghua University, Beijing 100084 (China)

    2016-07-15

    We present a real-time and fully integrated quantum random number generator (QRNG) by measuring laser phase fluctuations. The QRNG scheme based on laser phase fluctuations is notable for its capability of generating ultra-high-speed random numbers. However, the speed bottleneck of a practical QRNG lies in the limited speed of randomness extraction. To close the gap between the fast randomness generation and the slow post-processing, we propose a pipeline extraction algorithm based on Toeplitz matrix hashing and implement it in a high-speed field-programmable gate array. Further, all the QRNG components are integrated into a module, including a compact and actively stabilized interferometer, high-speed data acquisition, and real-time data post-processing and transmission. The final generation rate of the QRNG module with real-time extraction can reach 3.2 Gbps.

  11. Integrated Information Systems Across the Weather-Climate Continuum

    Science.gov (United States)

    Pulwarty, R. S.; Higgins, W.; Nierenberg, C.; Trtanj, J.

    2015-12-01

    The increasing demand for well-organized (integrated) end-to-end research-based information has been highlighted in several National Academy studies, in IPCC Reports (such as the SREX and Fifth Assessment) and by public and private constituents. Such information constitutes a significant component of the "environmental intelligence" needed to address myriad societal needs for early warning and resilience across the weather-climate continuum. The next generation of climate research in service to the nation requires an even more visible, authoritative and robust commitment to scientific integration in support of adaptive information systems that address emergent risks and inform longer-term resilience strategies. A proven mechanism for resourcing such requirements is to demonstrate vision, purpose, support, connection to constituencies, and prototypes of desired capabilities. In this presentation we will discuss efforts at NOAA, and elsewhere, that: improve information on how changes in extremes in key phenomena such as drought, floods, and heat stress impact management decisions for resource planning and disaster risk reduction; and develop regional integrated information systems to address these emergent challenges, which integrate observations, monitoring and prediction, impacts assessments and scenarios, preparedness and adaptation, and coordination and capacity-building. Such systems, as illustrated through efforts such as NIDIS, have strengthened integration across the foundational research enterprise (through, for instance, RISAs and Modeling Analysis Predictions and Projections) by increasing agility for responding to emergent risks. The recently-initiated Climate Services Information System, in support of the WMO Global Framework for Climate Services, draws on the above models and will be introduced during the presentation.

  12. EFFICIENCY INDICATORS INFORMATION MANAGEMENT IN INTEGRATED SECURITY SYSTEMS

    Directory of Open Access Journals (Sweden)

    N. S. Rodionova

    2014-01-01

    Full Text Available Summary. The introduction of information technology to improve the efficiency of security activity creates a need to consider a number of negative factors that follow from the use of these technologies as key elements of modern security systems. One of the most notable factors is the exposure of information processes in protection systems to security threats. This largely relates to integrated security systems (ISS), the class of protection systems with the highest level of informatization of security functions. The significant damage that protected objects could potentially incur as a result of abnormal ISS operation makes it a very topical problem to assess the factors that reduce ISS efficiency and to justify ways and methods of improving it. Because of the nature of threats and the blocking and distortion of information in the ISS, the parameters of interest are: the volume of undistorted information in the ISS working environment, as a characteristic of data integrity; and the time of access to information, as a characteristic of its availability. This in turn leads to the need to use these parameters as performance characteristics of information processes in the ISS - the completeness and timeliness of information processing. The article proposes performance indicators of information processes in integrated security systems in terms of optimal control procedures to protect information from unauthorized access. The considered set of parameters allows comprehensive security analysis of integrated security systems and provides recommendations to improve the management of information security procedures in them.

  13. Risk Informed Structural Systems Integrity Management

    DEFF Research Database (Denmark)

    Nielsen, Michael Havbro Faber

    2017-01-01

    The present paper is predominantly a conceptual contribution with an appraisal of major developments in risk informed structural integrity management for offshore installations together with a discussion of their merits and the challenges which still lie ahead. Starting point is taken in a selected overview of research and development contributions which have formed the basis for Risk Based Inspection Planning (RBI) as we know it today. Thereafter an outline of the methodical basis for risk informed structural systems integrity management, i.e. the Bayesian decision analysis, is provided in summary. The main focus is here directed on RBI for offshore facilities subject to fatigue damage. New ideas and methodical frameworks in the area of robustness and resilience modeling of structural systems are then introduced, and it is outlined how these may adequately be utilized to enhance Structural Integrity

  14. CSIR's new integrated electronic library information-system

    CSIR Research Space (South Africa)

    Michie, A

    1995-08-01

    Full Text Available The CSIR has developed a CDROM-based electronic library information system which provides the ability to reproduce and search for published information and colour brochures on the computer screen. The system integrates this information with online...

  15. RESEARCH ON REMOTE SENSING GEOLOGICAL INFORMATION EXTRACTION BASED ON OBJECT ORIENTED CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    H. Gao

    2018-04-01

    Full Text Available Northern Tibet belongs to the sub-cold arid climate zone of the plateau. It is rarely visited by people and geological working conditions are very poor. However, the stratum exposures are good and human interference is very small. Therefore, research on the automatic classification and extraction of remote sensing geological information there has typical significance and good application prospects. Based on object-oriented classification in northern Tibet, using Worldview2 high-resolution remote sensing data combined with tectonic information and image enhancement, the lithological spectral features, shape features, spatial locations and topological relations of various geological information are mined. By setting thresholds within a hierarchical classification, eight kinds of geological information were classified and extracted. Compared with the existing geological maps, the accuracy analysis shows that the overall accuracy reached 87.8561 %, indicating that the object-oriented method is effective and feasible for this study area and provides a new idea for the automatic extraction of remote sensing geological information.

  16. Information Security Management - Part Of The Integrated Management System

    Science.gov (United States)

    Manea, Constantin Adrian

    2015-07-01

    The international management standards allow an integrated approach, thereby combining aspects of particular importance to the activity of any organization, from quality management systems and environmental management systems to information security management systems and business continuity management systems. Although there is no national or international regulation, nor a defined standard, for the Integrated Management System, the need to implement an integrated system occurs within the organization, which sees the opportunity to integrate the management components into a cohesive system, in agreement with its publicly stated purpose and mission. The issues relating to information security in the organization, from the perspective of the management system, raise serious questions for any organization in the current context of electronic information, which is why we consider it not only appropriate but necessary to promote and implement an Integrated Management System for Quality - Environment - Health and Operational Security - Information Security.

  17. Ontology Based Resolution of Semantic Conflicts in Information Integration

    Institute of Scientific and Technical Information of China (English)

    LU Han; LI Qing-zhong

    2004-01-01

    Semantic conflict is the conflict caused when heterogeneous systems use different ways to express the same real-world entity. This prevents information integration from achieving semantic coherence. Since ontologies help to solve semantic problems, this area has become a hot topic in information integration. In this paper, we introduce semantic conflict into the information integration of heterogeneous applications. We discuss the origins and categories of the conflict, and present an ontology-based schema mapping approach to eliminate semantic conflicts.

  18. Road Network Extraction from VHR Satellite Images Using Context Aware Object Feature Integration and Tensor Voting

    Directory of Open Access Journals (Sweden)

    Mehdi Maboudi

    2016-08-01

    Full Text Available Road networks are very important features in geospatial databases. Even though high-resolution optical satellite images have already been acquired for more than a decade, tools for automated extraction of road networks from these images are still rare. One consequence of this is the need for manual interaction which, in turn, is time and cost intensive. In this paper, a multi-stage approach is proposed which integrates structural, spectral, textural, as well as contextual information of objects to extract road networks from very high resolution satellite images. Highlights of the approach are a novel linearity index employed for the discrimination of elongated road segments from other objects and customized tensor voting which is utilized to fill missing parts of the network. Experiments are carried out with different datasets. Comparison of the achieved results with the results of seven state-of-the-art methods demonstrated the efficiency of the proposed approach.
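
    The discrimination of elongated segments can be illustrated with a simple eigenvalue-based elongation measure. The paper's own linearity index is not reproduced in this record; the Python function below is a generic stand-in that scores a segmented object by the ratio of the principal-axis variances of its pixel coordinates:

        import numpy as np

        def elongation_index(pixel_coords):
            """Return a value in [0, 1); near 1 indicates a thin, road-like segment.

            pixel_coords : (N, 2) array of (row, col) positions of the object's pixels.
            """
            cov = np.cov(pixel_coords, rowvar=False)
            minor, major = np.sort(np.linalg.eigvalsh(cov))   # ascending eigenvalues
            return 1.0 - minor / major

        # A thin 100 x 3 strip scores far higher than a 20 x 20 blob.
        strip = np.argwhere(np.ones((100, 3), dtype=bool))
        blob = np.argwhere(np.ones((20, 20), dtype=bool))
        print(elongation_index(strip), elongation_index(blob))

    Segments passing such a linearity test would then feed the tensor-voting stage that bridges gaps in the network.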

  19. Extracting Semantic Information from Visual Data: A Survey

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2016-03-01

    Full Text Available The traditional environment maps built by mobile robots include both metric ones and topological ones. These maps are navigation-oriented and not adequate for service robots to interact with or serve human users who normally rely on the conceptual knowledge or semantic contents of the environment. Therefore, the construction of semantic maps becomes necessary for building an effective human-robot interface for service robots. This paper reviews recent research and development in the field of visual-based semantic mapping. The main focus is placed on how to extract semantic information from visual data in terms of feature extraction, object/place recognition and semantic representation methods.

  20. Depth extraction method with high accuracy in integral imaging based on moving array lenslet technique

    Science.gov (United States)

    Wang, Yao-yao; Zhang, Juan; Zhao, Xue-wei; Song, Li-pei; Zhang, Bo; Zhao, Xing

    2018-03-01

    In order to improve depth extraction accuracy, a method using the moving array lenslet technique (MALT) in the pickup stage is proposed, which can decrease the depth interval caused by pixelation. In this method, the lenslet array is moved along the horizontal and vertical directions simultaneously N times within a pitch to get N sets of elemental images. The computational integral imaging reconstruction method for MALT is used to obtain the slice images of the 3D scene, and the sum modulus difference (SMD) blur metric is computed on these slice images to obtain the depth information of the 3D scene. Simulation and optical experiments are carried out to verify the feasibility of this method.
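
    The SMD focus measure that drives the depth search is easy to state in code. Below is a minimal Python sketch (the function names are ours; the paper's reconstruction pipeline is not reproduced) that scores each reconstructed slice and picks the depth whose slice is sharpest:

        import numpy as np

        def smd(image):
            """Sum-modulus-difference focus measure: large when the slice is in focus."""
            image = image.astype(float)
            dx = np.abs(np.diff(image, axis=1)).sum()   # horizontal neighbour differences
            dy = np.abs(np.diff(image, axis=0)).sum()   # vertical neighbour differences
            return dx + dy

        def estimate_depth(slice_images, depths):
            """Return the reconstruction depth whose slice maximises the SMD metric."""
            scores = [smd(img) for img in slice_images]
            return depths[int(np.argmax(scores))]

    Because MALT supplies N elemental-image sets per pitch, the reconstructed slices sample depth more finely, and the argmax above lands closer to the true object depth than with a single static lenslet array.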

  1. KneeTex: an ontology-driven system for information extraction from MRI reports.

    Science.gov (United States)

    Spasić, Irena; Zhao, Bo; Jones, Christopher B; Button, Kate

    2015-01-01

    In the realm of knee pathology, magnetic resonance imaging (MRI) has the advantage of visualising all structures within the knee joint, which makes it a valuable tool for increasing diagnostic accuracy and planning surgical treatments. Therefore, clinical narratives found in MRI reports convey valuable diagnostic information. A range of studies have proven the feasibility of natural language processing for information extraction from clinical narratives. However, no study focused specifically on MRI reports in relation to knee pathology, possibly due to the complexity of knee anatomy and a wide range of conditions that may be associated with different anatomical entities. In this paper we describe KneeTex, an information extraction system that operates in this domain. As an ontology-driven information extraction system, KneeTex makes active use of an ontology to strongly guide and constrain text analysis. We used automatic term recognition to facilitate the development of a domain-specific ontology with sufficient detail and coverage for text mining applications. In combination with the ontology, high regularity of the sublanguage used in knee MRI reports allowed us to model its processing by a set of sophisticated lexico-semantic rules with minimal syntactic analysis. The main processing steps involve named entity recognition combined with coordination, enumeration, ambiguity and co-reference resolution, followed by text segmentation. Ontology-based semantic typing is then used to drive the template filling process. We adopted an existing ontology, TRAK (Taxonomy for RehAbilitation of Knee conditions), for use within KneeTex. The original TRAK ontology expanded from 1,292 concepts, 1,720 synonyms and 518 relationship instances to 1,621 concepts, 2,550 synonyms and 560 relationship instances. This provided KneeTex with a very fine-grained lexico-semantic knowledge base, which is highly attuned to the given sublanguage. Information extraction results were evaluated

  2. Information Integration Architecture Development

    OpenAIRE

    Faulkner, Stéphane; Kolp, Manuel; Nguyen, Duy Thai; Coyette, Adrien; Do, Thanh Tung; 16th International Conference on Software Engineering and Knowledge Engineering

    2004-01-01

    Multi-Agent Systems (MAS) architectures are gaining popularity for building open, distributed, and evolving software required by systems such as information integration applications. Unfortunately, despite considerable work in software architecture during the last decade, few research efforts have aimed at truly defining patterns and languages for designing such multiagent architectures. We propose a modern approach based on organizational structures and architectural description lan...

  3. Fabricating and Characterizing the Microfluidic Solid Phase Extraction Module Coupling with Integrated ESI Emitters

    Directory of Open Access Journals (Sweden)

    Hangbin Tang

    2018-05-01

    Full Text Available Microfluidic chips coupled with mass spectrometry (MS) will be of great significance to the development of relevant instruments involving chemical and bio-chemical analysis, drug detection, food and environmental applications and so on. In our previous works, we proposed two types of microfluidic electrospray ionization (ESI) chip coupled with MS: the two-phase flow focusing (FF) ESI microfluidic chip and the corner-integrated ESI emitter, respectively. However, the pretreatment module integrated with these ESI emitters is still a challenging problem. In this paper, we concentrated on integrating the solid phase micro-extraction (SPME) module with our previously proposed on-chip ESI emitters; the fabrication processes of such an SPME module are fully compatible with our previously proposed ESI emitters based on multi-layer soft lithography. We optimized the structure of the integrated chip and characterized its performance using standard samples. Furthermore, we verified its abilities of salt removal, extraction of multiple analytes and separation through on-chip elution using simulated biological urine spiked with different drugs. The results indicated that our proposed integrated module with ESI emitters is practical and effective for real biological sample pretreatment and MS detection.

  4. Extracting the information backbone in online system.

    Directory of Open Access Journals (Sweden)

    Qian-Ming Zhang

    Full Text Available Information overload is a serious problem in modern society and many solutions such as recommender systems have been proposed to filter out irrelevant information. In the literature, researchers have been mainly dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms, while they have overlooked the influence of the topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such "less can be more" feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both their effectiveness and efficiency.
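
    To make the link-removal idea concrete, here is a small illustrative Python sketch. The exact scoring rules of the paper's time-aware and topology-aware algorithms are not given in this record, so the score below (a weighted mix of link recency and object popularity) is an assumption for demonstration only:

        import numpy as np

        def extract_backbone(links, timestamps, keep_fraction=0.7, w_time=0.5):
            """Keep the highest-scoring fraction of user-object links as the backbone.

            links      : list of (user, object) pairs from the bipartite network.
            timestamps : array of link creation times (larger = more recent).
            """
            objects = [obj for _, obj in links]
            degree = {obj: objects.count(obj) for obj in set(objects)}
            recency = np.argsort(np.argsort(timestamps)) / max(len(links) - 1, 1)
            popularity = np.array([degree[obj] for obj in objects], dtype=float)
            popularity /= popularity.max()
            score = w_time * recency + (1 - w_time) * popularity
            keep = np.argsort(score)[-int(keep_fraction * len(links)):]
            return [links[i] for i in sorted(keep)]

        links = [("u1", "a"), ("u2", "a"), ("u1", "b"), ("u3", "c")]
        print(extract_backbone(links, timestamps=np.array([1, 2, 3, 4])))

    A recommender trained on the kept subset then runs on fewer links, which is where the reported savings in computational time come from.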

  5. Extracting the information backbone in online system.

    Science.gov (United States)

    Zhang, Qian-Ming; Zeng, An; Shang, Ming-Sheng

    2013-01-01

    Information overload is a serious problem in modern society and many solutions such as recommender system have been proposed to filter out irrelevant information. In the literature, researchers have been mainly dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms while they have overlooked the influence of topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such "less can be more" feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both of their effectiveness and efficiency.

  6. Extracting the Information Backbone in Online System

    Science.gov (United States)

    Zhang, Qian-Ming; Zeng, An; Shang, Ming-Sheng

    2013-01-01

    Information overload is a serious problem in modern society and many solutions such as recommender system have been proposed to filter out irrelevant information. In the literature, researchers have been mainly dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms while they have overlooked the influence of topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such “less can be more” feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both of their effectiveness and efficiency. PMID:23690946

  7. Information Management Processes for Extraction of Student Dropout Indicators in Courses in Distance Mode

    Directory of Open Access Journals (Sweden)

    Renata Maria Abrantes Baracho

    2016-04-01

    Full Text Available This research addresses the use of information management processes to extract student dropout indicators in distance mode courses. Distance education in Brazil aims to facilitate access to information. The MEC (Ministry of Education) announced, in the second semester of 2013, that the main obstacles faced by institutions offering courses in this mode were students dropping out and the resistance of both educators and students to this mode. The research used a mixed methodology, qualitative and quantitative, to obtain student dropout indicators. The factors found and validated in this research were: lack of interest from students, insufficient training in the use of the virtual learning environment for students, structural problems in the schools that were chosen to offer the course, students without e-mail, incoherent answers to course activities, and lack of knowledge on the part of the student when using the computer tool. The scenario considered was a course offered in distance mode called Aluno Integrado (Integrated Student).

  8. The Dilution Effect and Information Integration in Perceptual Decision Making.

    Science.gov (United States)

    Hotaling, Jared M; Cohen, Andrew L; Shiffrin, Richard M; Busemeyer, Jerome R

    2015-01-01

    In cognitive science there is a seeming paradox: On the one hand, studies of human judgment and decision making have repeatedly shown that people systematically violate optimal behavior when integrating information from multiple sources. On the other hand, optimal models, often Bayesian, have been successful at accounting for information integration in fields such as categorization, memory, and perception. This apparent conflict could be due, in part, to different materials and designs that lead to differences in the nature of processing. Stimuli that require controlled integration of information, such as the quantitative or linguistic information (commonly found in judgment studies), may lead to suboptimal performance. In contrast, perceptual stimuli may lend themselves to automatic processing, resulting in integration that is closer to optimal. We tested this hypothesis with an experiment in which participants categorized faces based on resemblance to a family patriarch. The amount of evidence contained in the top and bottom halves of each test face was independently manipulated. These data allow us to investigate a canonical example of sub-optimal information integration from the judgment and decision making literature, the dilution effect. Splitting the top and bottom halves of a face, a manipulation meant to encourage controlled integration of information, produced farther from optimal behavior and larger dilution effects. The Multi-component Information Accumulation model, a hybrid optimal/averaging model of information integration, successfully accounts for key accuracy, response time, and dilution effects.

  9. The Dilution Effect and Information Integration in Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Jared M Hotaling

    Full Text Available In cognitive science there is a seeming paradox: On the one hand, studies of human judgment and decision making have repeatedly shown that people systematically violate optimal behavior when integrating information from multiple sources. On the other hand, optimal models, often Bayesian, have been successful at accounting for information integration in fields such as categorization, memory, and perception. This apparent conflict could be due, in part, to different materials and designs that lead to differences in the nature of processing. Stimuli that require controlled integration of information, such as the quantitative or linguistic information (commonly found in judgment studies), may lead to suboptimal performance. In contrast, perceptual stimuli may lend themselves to automatic processing, resulting in integration that is closer to optimal. We tested this hypothesis with an experiment in which participants categorized faces based on resemblance to a family patriarch. The amount of evidence contained in the top and bottom halves of each test face was independently manipulated. These data allow us to investigate a canonical example of sub-optimal information integration from the judgment and decision making literature, the dilution effect. Splitting the top and bottom halves of a face, a manipulation meant to encourage controlled integration of information, produced farther from optimal behavior and larger dilution effects. The Multi-component Information Accumulation model, a hybrid optimal/averaging model of information integration, successfully accounts for key accuracy, response time, and dilution effects.

  10. Integrating the Supervised Information into Unsupervised Learning

    Directory of Open Access Journals (Sweden)

    Ping Ling

    2013-01-01

    Full Text Available This paper presents an assembling unsupervised learning framework that adopts the information coming from the supervised learning process and gives the corresponding implementation algorithm. The algorithm consists of two phases: extracting and clustering data representatives (DRs) firstly to obtain labeled training data, and then classifying non-DRs based on the labeled DRs. The implementation algorithm is called SDSN since it employs the tuning-scaled Support vector domain description to collect DRs, uses a spectrum-based method to cluster DRs, and adopts the nearest neighbor classifier to label non-DRs. The validity of the first-phase clustering procedure is analyzed theoretically. A new metric is defined in a data-dependent way in the second phase to allow the nearest neighbor classifier to work with the informed information. A fast training approach for DR extraction is provided to bring more efficiency. Experimental results on synthetic and real datasets verify the correctness and performance of the proposed idea, and show that SDSN outperforms the traditional pure clustering procedure in practice.
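
    The two-phase pipeline can be approximated with off-the-shelf components. In the hedged Python sketch below, scikit-learn's OneClassSVM stands in for the tuning-scaled support vector domain description, spectral clustering groups the representatives, and a 1-nearest-neighbor classifier (with the default metric rather than the paper's data-dependent one) labels the rest:

        import numpy as np
        from sklearn.datasets import make_blobs
        from sklearn.svm import OneClassSVM
        from sklearn.cluster import SpectralClustering
        from sklearn.neighbors import KNeighborsClassifier

        X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

        # Phase 1: collect data representatives (DRs) and cluster them.
        svdd = OneClassSVM(nu=0.2, gamma=0.5).fit(X)
        drs = X[svdd.support_]                    # support vectors serve as DRs
        dr_labels = SpectralClustering(n_clusters=3, random_state=0).fit_predict(drs)

        # Phase 2: classify all remaining points against the labelled DRs.
        clf = KNeighborsClassifier(n_neighbors=1).fit(drs, dr_labels)
        print(np.bincount(clf.predict(X)))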

  11. An integrated healthcare enterprise information portal and healthcare information system framework.

    Science.gov (United States)

    Hsieh, S L; Lai, Feipei; Cheng, P H; Chen, J L; Lee, H H; Tsai, W N; Weng, Y C; Hsieh, S H; Hsu, K P; Ko, L F; Yang, T H; Chen, C H

    2006-01-01

    The paper presents an integrated, distributed Healthcare Enterprise Information Portal (HEIP) and Hospital Information Systems (HIS) framework over wireless/wired infrastructure at National Taiwan University Hospital (NTUH). A single sign-on solution for the hospital customer relationship management (CRM) in HEIP has been established. The outcomes of the newly developed Outpatient Information Systems (OIS) in HIS are discussed. The future HEIP blueprints with CRM oriented features: e-Learning, Remote Consultation and Diagnosis (RCD), as well as on-Line Vaccination Services are addressed. Finally, the integrated HEIP and HIS architectures based on the middleware technologies are proposed along with the feasible approaches. The preliminary performance of multi-media, time-based data exchanges over the wireless HEIP side is collected to evaluate the efficiency of the architecture.

  12. A Quality-Driven Methodology for Information Systems Integration

    Directory of Open Access Journals (Sweden)

    Iyad Zikra

    2017-10-01

    Full Text Available Information systems integration is an essential instrument for organizations to attain advantage in today’s growing and fast changing business and technology landscapes. Integration solutions generate added value by combining the functionality and services of heterogeneous and diverse systems. Existing integration environments tend to rely heavily on technical, platform-dependent skills. Consequently, the solutions that they enable are not optimally aligned with the envisioned business goals of the organization. Furthermore, the gap between the goals and the solutions complicates the task of evaluating the quality of integration solutions. To address these challenges, we propose a quality-driven, model-driven methodology for designing and developing integration solutions. The methodology spans organizational and systems design details, providing a holistic view of the integration solution and its underlying business goals. A multi-view meta-model provides the basis for the integration design. Quality factors that affect various aspects of the integration solution guide and inform the progress of the methodology. An example business case is presented to demonstrate the application of the methodology.

  13. Integrated risk information system (IRIS)

    Energy Technology Data Exchange (ETDEWEB)

    Tuxen, L. [Environmental Protection Agency, Washington, DC (United States)

    1990-12-31

    The Integrated Risk Information System (IRIS) is an electronic information system developed by the US Environmental Protection Agency (EPA) containing information related to health risk assessment. IRIS is the Agency's primary vehicle for communication of chronic health hazard information that represents Agency consensus following comprehensive review by intra-Agency work groups. The original purpose for developing IRIS was to provide guidance to EPA personnel in making risk management decisions. This role has expanded and evolved with wider access and use of the system. IRIS contains chemical-specific information in summary format for approximately 500 chemicals. IRIS is available to the general public on the National Library of Medicine's Toxicology Data Network (TOXNET) and on diskettes through the National Technical Information Service (NTIS).

  14. Information Systems Integration and Enterprise Application Integration (EAI) Adoption: A Case from Financial Services

    Science.gov (United States)

    Lam, Wing

    2007-01-01

    Increasingly, organizations find that they need to integrate large number of information systems in order to support enterprise-wide business initiatives such as e-business, supply chain management and customer relationship management. To date, organizations have largely tended to address information systems (IS) integration in an ad-hoc manner.…

  15. Development of an integrated medical supply information system

    Science.gov (United States)

    Xu, Eric; Wermus, Marek; Blythe Bauman, Deborah

    2011-08-01

    The integrated medical supply inventory control system introduced in this study is a hybrid system that is shaped by the nature of medical supply, usage and storage capacity limitations of health care facilities. The system links demand, service provided at the clinic, health care service provider's information, inventory storage data and decision support tools into an integrated information system. ABC analysis method, economic order quantity model, two-bin method and safety stock concept are applied as decision support models to tackle inventory management issues at health care facilities. In the decision support module, each medical item and storage location has been scrutinised to determine the best-fit inventory control policy. The pilot case study demonstrates that the integrated medical supply information system holds several advantages for inventory managers, since it entails benefits of deploying enterprise information systems to manage medical supply and better patient services.
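
    Two of the decision-support models named above, the economic order quantity and safety stock, reduce to one-line formulas. A minimal Python sketch follows (the parameter values are invented for illustration):

        from math import sqrt

        def economic_order_quantity(annual_demand, order_cost, holding_cost):
            """EOQ: the order size minimising total ordering plus holding cost."""
            return sqrt(2 * annual_demand * order_cost / holding_cost)

        def safety_stock(z, demand_std, lead_time):
            """Buffer stock for a service-level factor z over the supply lead time."""
            return z * demand_std * sqrt(lead_time)

        # e.g., 1,200 units/year demand, $50 per order, $4 holding cost per unit-year
        print(round(economic_order_quantity(1200, 50, 4)))   # about 173 units per order
        print(round(safety_stock(1.65, 12, 4)))              # about 40 units of buffer

    In a two-bin arrangement, the reorder point is reached when the first bin empties; the EOQ then sizes the replenishment order and the safety stock sizes the second bin.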

  16. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    International Nuclear Information System (INIS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-01-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise and efficient information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information, based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask, layer by layer, all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably faster computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three criteria: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problems of "same object, different spectra" and "same spectrum, different objects". With an overall accuracy of 90.38%, the method achieves more scientific and accurate results compared with the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing

  17. An information integration theory of consciousness

    Directory of Open Access Journals (Sweden)

    Tononi Giulio

    2004-11-01

    Full Text Available Abstract Background Consciousness poses two main problems. The first is understanding the conditions that determine to what extent a system has conscious experience. For instance, why is our consciousness generated by certain parts of our brain, such as the thalamocortical system, and not by other parts, such as the cerebellum? And why are we conscious during wakefulness and much less so during dreamless sleep? The second problem is understanding the conditions that determine what kind of consciousness a system has. For example, why do specific parts of the brain contribute specific qualities to our conscious experience, such as vision and audition? Presentation of the hypothesis This paper presents a theory about what consciousness is and how it can be measured. According to the theory, consciousness corresponds to the capacity of a system to integrate information. This claim is motivated by two key phenomenological properties of consciousness: differentiation – the availability of a very large number of conscious experiences; and integration – the unity of each such experience. The theory states that the quantity of consciousness available to a system can be measured as the Φ value of a complex of elements. Φ is the amount of causally effective information that can be integrated across the informational weakest link of a subset of elements. A complex is a subset of elements with Φ>0 that is not part of a subset of higher Φ. The theory also claims that the quality of consciousness is determined by the informational relationships among the elements of a complex, which are specified by the values of effective information among them. Finally, each particular conscious experience is specified by the value, at any given time, of the variables mediating informational interactions among the elements of a complex. Testing the hypothesis The information integration theory accounts, in a principled manner, for several neurobiological observations
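
    The quantity Φ can be illustrated, in a much-simplified form, on a toy system. The Python sketch below is not Tononi's effective-information measure (which involves perturbing the system); it substitutes plain mutual information across bipartitions of a small joint distribution and takes the minimum as the "informational weakest link":

        import numpy as np
        from itertools import combinations

        def mutual_info(joint, axes_a):
            """I(A;B) in bits for a joint pmf over n variables; A = the given axes."""
            n = joint.ndim
            axes_b = tuple(i for i in range(n) if i not in axes_a)
            h = lambda p: -(p[p > 0] * np.log2(p[p > 0])).sum()
            return h(joint.sum(axis=axes_b)) + h(joint.sum(axis=tuple(axes_a))) - h(joint)

        def weakest_link_info(joint):
            """Minimum mutual information over all bipartitions of the elements."""
            n = joint.ndim
            parts = [c for k in range(1, n // 2 + 1) for c in combinations(range(n), k)]
            return min(mutual_info(joint, a) for a in parts)

        # Three perfectly correlated binary elements: every bipartition carries 1 bit.
        joint = np.zeros((2, 2, 2))
        joint[0, 0, 0] = joint[1, 1, 1] = 0.5
        print(weakest_link_info(joint))   # -> 1.0

    Independent elements would instead give 0 across every bipartition, matching the intuition that a system of disconnected parts integrates no information.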

  18. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown... To improve the recall, while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of the pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity
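
    As a hedged illustration of such a probability-based identifier (the paper's own model and training data are not described in this record), the sketch below scores candidate spans by their character n-grams, so unseen spelling variations can still be recognized instead of requiring a pre-defined variation list:

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.pipeline import make_pipeline

        # Toy training spans: label 1 = target entity mentions (with a misspelling),
        # label 0 = other engineering phrases. All examples are invented.
        train_spans = ["fatigue test", "fatigue testing", "fatige test",
                       "design review", "load case", "project meeting"]
        train_labels = [1, 1, 1, 0, 0, 0]

        model = make_pipeline(
            CountVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
            MultinomialNB(),
        )
        model.fit(train_spans, train_labels)

        # An unseen variation still receives a high entity probability.
        print(model.predict_proba(["fatigue tests"])[0][1])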

  19. INEL Waste and Environmental Information Integration Project approach and concepts

    International Nuclear Information System (INIS)

    Dean, L.A.; Fairbourn, P.J.; Randall, V.C.; Riedesel, A.M.

    1994-06-01

    The Idaho National Engineering, Laboratory (INEL) Waste and Environmental Information integration Project (IWEIIP) was established in December 1993 to address issues related to INEL waste and environmental information including: Data quality; Data redundancy; Data accessibility; Data integration. This effort includes existing information, new development, and acquisition activities. Existing information may not be a database record; it may be an entire document (electronic, scanned, or hard-copy), a video clip, or a file cabinet of information. The IWEIIP will implement an effective integrated information framework to manage INEL waste and environmental information as an asset. This will improve data quality, resolve data redundancy, and increase data accessibility; therefore, providing more effective utilization of the dollars spent on waste and environmental information

  20. Semi-automatic building extraction in informal settlements from high-resolution satellite imagery

    Science.gov (United States)

    Mayunga, Selassie David

    The extraction of man-made features from digital remotely sensed images is considered an important step underpinning management of human settlements in any country. Man-made features, and buildings in particular, are required for a variety of applications such as urban planning, creation of geographical information system (GIS) databases, and urban city models. Traditional man-made feature extraction methods are very expensive in terms of equipment, labour intensive, need well-trained personnel, and cannot cope with changing environments, particularly in dense urban settlement areas. This research presents an approach for extracting buildings in dense informal settlement areas using high-resolution satellite imagery. The proposed system uses a novel strategy of extracting a building by measuring a single point at the approximate centre of the building. The fine measurement of the building outline is then effected using a modified snake model. The original snake model on which this framework is based incorporates an external constraint energy term tailored to preserving the convergence properties of the snake model; applying it to unstructured objects would negatively affect their actual shapes. The external constraint energy term was therefore removed from the original snake model formulation, giving the model the ability to cope with the high variability of building shapes in informal settlement areas. The proposed building extraction system was tested on two areas with different situations. The first area was Tungi in Dar Es Salaam, Tanzania, where three sites were tested. This area is characterized by informal settlements, which are illegally established within the city boundaries. The second area was Oromocto in New Brunswick, Canada, where two sites were tested. The Oromocto area is mostly flat and the buildings are constructed using similar materials. Qualitative and quantitative measures were employed to evaluate the accuracy of the results as well as the performance
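
    The seed-and-refine workflow can be sketched with the classic snake available in scikit-image. Note that this is the standard active contour, not the thesis's modified formulation without the external constraint term; the image, seed point and parameter values are stand-ins:

        import numpy as np
        from skimage import data, filters, segmentation

        image = filters.gaussian(data.coins(), sigma=2)   # stand-in for a satellite image
        center = np.array([60.0, 100.0])                  # operator-measured centre (row, col)

        # Initialise the snake as a circle around the measured centre point.
        theta = np.linspace(0, 2 * np.pi, 200)
        init = center + 30 * np.column_stack([np.sin(theta), np.cos(theta)])

        # The snake contracts onto nearby intensity edges, refining the outline.
        outline = segmentation.active_contour(image, init,
                                              alpha=0.015, beta=10, gamma=0.001)
        print(outline.shape)                              # (200, 2) refined vertices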

  1. Using integrated information systems in supply chain management

    Science.gov (United States)

    Gonzálvez-Gallego, Nicolás; Molina-Castillo, Francisco-Jose; Soto-Acosta, Pedro; Varajao, Joao; Trigo, Antonio

    2015-02-01

    The aim of this paper is to empirically test not only the direct effects of information and communication technology (ICT) capabilities and integrated information systems (IS) on firm performance, but also the moderating role of IS integration along the supply chain in the relationship between ICT capabilities and business performance. Data collected from 102 large Iberian firms from Spain and Portugal are used to test the research model. Hierarchical multiple regression analysis is employed to test the direct effects and the moderating relationships proposed. Results show that external and internal ICT capabilities are important drivers of firm performance, while merely having integrated IS does not lead to better firm performance. In addition, a moderating effect of IS integration in the relationship between ICT capabilities and business performance is found, although this integration only contributes to firm performance when it is directed to connect with suppliers or customers rather than when integrating the whole supply chain.
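
    A moderation test of this kind is typically run as a hierarchical regression with an interaction term. The Python sketch below uses synthetic data and invented variable names standing in for the study's constructs:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 102
        df = pd.DataFrame({"ict": rng.normal(size=n), "integration": rng.normal(size=n)})
        df["performance"] = (0.5 * df.ict + 0.2 * df.integration
                             + 0.4 * df.ict * df.integration + rng.normal(size=n))

        base = smf.ols("performance ~ ict + integration", data=df).fit()
        moderated = smf.ols("performance ~ ict * integration", data=df).fit()

        # The R^2 increase across steps, and the interaction coefficient, test moderation.
        print(base.rsquared, moderated.rsquared)
        print(moderated.params["ict:integration"])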

  2. Standards to support information systems integration in anatomic pathology.

    Science.gov (United States)

    Daniel, Christel; García Rojo, Marcial; Bourquard, Karima; Henin, Dominique; Schrader, Thomas; Della Mea, Vincenzo; Gilbertson, John; Beckwith, Bruce A

    2009-11-01

    Integrating anatomic pathology information (text and images) into electronic health care records is a key challenge for enhancing clinical information exchange between anatomic pathologists and clinicians. The aim of the Integrating the Healthcare Enterprise (IHE) international initiative is precisely to ensure interoperability of clinical information systems by using existing widespread industry standards such as Digital Imaging and Communication in Medicine (DICOM) and Health Level Seven (HL7). To define standard-based informatics transactions to integrate anatomic pathology information into the Healthcare Enterprise, we used the methodology of the IHE initiative. Working groups from IHE, HL7, and DICOM, with special interest in anatomic pathology, defined consensual technical solutions to provide end-users with improved access to consistent information across multiple information systems. The IHE anatomic pathology technical framework describes a first integration profile, "Anatomic Pathology Workflow," dedicated to the diagnostic process, including basic image acquisition and reporting solutions. This integration profile relies on 10 transactions based on HL7 or DICOM standards. A common specimen model was defined to consistently identify and describe specimens in both HL7 and DICOM transactions. The IHE anatomic pathology working group has defined standard-based informatics transactions to support the basic diagnostic workflow in anatomic pathology laboratories. In further stages, the technical framework will be completed to manage whole-slide images and semantically rich structured reports in the diagnostic workflow and to integrate systems used for patient care and those used for research activities (such as tissue bank databases or tissue microarrayers).

  3. GeoDeepDive: Towards a Machine Reading-Ready Digital Library and Information Integration Resource

    Science.gov (United States)

    Husson, J. M.; Peters, S. E.; Livny, M.; Ross, I.

    2015-12-01

    Recent developments in machine reading and learning approaches to text and data mining hold considerable promise for accelerating the pace and quality of literature-based data synthesis, but these advances have outpaced even basic levels of access to the published literature. For many geoscience domains, particularly those based on physical samples and field-based descriptions, this limitation is significant. Here we describe a general infrastructure to support published literature-based machine reading and learning approaches to information integration and knowledge base creation. This infrastructure supports rate-controlled automated fetching of original documents, along with full bibliographic citation metadata, from remote servers, the secure storage of original documents, and the utilization of considerable high-throughput computing resources for the pre-processing of these documents by optical character recognition, natural language parsing, and other document annotation and parsing software tools. New tools and versions of existing tools can be automatically deployed against original documents when they are made available. The products of these tools (text/XML files) are managed by MongoDB and are available for use in data extraction applications. Basic search and discovery functionality is provided by ElasticSearch, which is used to identify documents of potential relevance to a given data extraction task. Relevant files derived from the original documents are then combined into basic starting points for application building; these starting points are kept up-to-date as new relevant documents are incorporated into the digital library. Currently, our digital library contains more than 360K documents supplied by Elsevier and the USGS and we are actively seeking additional content providers. By focusing on building a dependable infrastructure to support the retrieval, storage, and pre-processing of published content, we are establishing a foundation for

  4. Accurate facade feature extraction method for buildings from three-dimensional point cloud data considering structural information

    Science.gov (United States)

    Wang, Yongzhi; Ma, Yuqing; Zhu, A.-xing; Zhao, Hui; Liao, Lixia

    2018-05-01

    Facade features represent segmentations of building surfaces and can serve as a building framework. Extracting facade features from three-dimensional (3D) point cloud data (3D PCD) is an efficient method for 3D building modeling. By combining the advantages of 3D PCD and two-dimensional optical images, this study describes the creation of a highly accurate building facade feature extraction method from 3D PCD with a focus on structural information. The new extraction method involves three major steps: image feature extraction, exploration of the mapping method between the image features and 3D PCD, and optimization of the initial 3D PCD facade features considering structural information. Results show that the new method can extract the 3D PCD facade features of buildings more accurately and continuously. The new method is validated using a case study. In addition, the effectiveness of the new method is demonstrated by comparing it with the range image-extraction method and the optical image-extraction method in the absence of structural information. The 3D PCD facade features extracted by the new method can be applied in many fields, such as 3D building modeling and building information modeling.

  5. Information analysis of iris biometrics for the needs of cryptology key extraction

    Directory of Open Access Journals (Sweden)

    Adamović Saša

    2013-01-01

    Full Text Available The paper presents a rigorous analysis of iris biometric information for the synthesis of an optimized system for the extraction of a high quality cryptology key. Estimations of local entropy and mutual information identified the segments of the iris most suitable for this purpose. In order to optimize parameters, the corresponding wavelet transforms were tuned to obtain the highest possible entropy and the lowest possible mutual information in the transformation domain, which set the framework for the synthesis of systems for the extraction of truly random sequences from iris biometrics without compromising authentication properties. [Project of the Ministry of Science of the Republic of Serbia, no. TR32054 and no. III44006]
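
    Local entropy, the first of the two selection criteria, is straightforward to estimate per image block. The following Python sketch (the block size and the synthetic image are arbitrary choices for illustration) computes Shannon entropy over patches of an 8-bit iris image; high-entropy, low-mutual-information regions are the candidates for key extraction:

        import numpy as np

        def local_entropy(image, block=16):
            """Shannon entropy (bits) of each block x block patch of an 8-bit image."""
            h, w = image.shape
            out = np.zeros((h // block, w // block))
            for i in range(out.shape[0]):
                for j in range(out.shape[1]):
                    patch = image[i * block:(i + 1) * block, j * block:(j + 1) * block]
                    p = np.bincount(patch.ravel(), minlength=256) / patch.size
                    out[i, j] = -(p[p > 0] * np.log2(p[p > 0])).sum()
            return out

        img = np.random.default_rng(2).integers(0, 256, (128, 128)).astype(np.uint8)
        print(local_entropy(img).round(2))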

  6. [Research on medical instrument information integration technology based on IHE PCD].

    Science.gov (United States)

    Zheng, Jianli; Liao, Yun; Yang, Yongyong

    2014-06-01

    Integrating medical instruments with medical information systems becomes more and more important in the healthcare industry. To give medical instruments without a standard communication interface the capability of interoperating and sharing information with medical information systems, we developed a medical instrument integration gateway based on the Integrating the Healthcare Enterprise Patient Care Device (IHE PCD) integration profiles. The core component is an integration engine implemented according to the integration profiles and Health Level Seven (HL7) messages defined in IHE PCD. Working with instrument-specific JavaScript, the engine transforms medical instrument data into HL7 ORU messages. This research enables medical instruments to interoperate and exchange medical data with information systems in a standardized way, and is valuable for medical instrument integration, especially for traditional instruments.
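
    The gateway's target format can be pictured with a minimal ORU^R01 message. The Python sketch below is a simplification: a conformant IHE PCD-01 message carries more segments and fields (and standardized observation codes) than shown, and all identifiers here are invented:

        from datetime import datetime

        def device_to_oru(patient_id, device_id, observations):
            """Assemble a minimal HL7 v2 ORU^R01 message from device readings.

            observations : list of (code, value, unit) tuples from the device script.
            """
            ts = datetime.now().strftime("%Y%m%d%H%M%S")
            segments = [
                f"MSH|^~\\&|GATEWAY|HOSPITAL|EHR|HOSPITAL|{ts}||ORU^R01|{ts}|P|2.6",
                f"PID|||{patient_id}",
                f"OBR|1|||{device_id}^MONITORING",
            ]
            for i, (code, value, unit) in enumerate(observations, start=1):
                segments.append(f"OBX|{i}|NM|{code}||{value}|{unit}|||||F")
            return "\r".join(segments)   # HL7 v2 segments end with carriage returns

        print(device_to_oru("12345", "BEDSIDE-07", [("HR", 72, "bpm"), ("SPO2", 98, "%")]))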

  7. Information retrieval and terminology extraction in online resources for patients with diabetes.

    Science.gov (United States)

    Seljan, Sanja; Baretić, Maja; Kucis, Vlasta

    2014-06-01

    Terminology use, as a means for information retrieval or document indexing, plays an important role in health literacy. Specific types of users, i.e. patients with diabetes, need access to various online resources (in foreign and/or native languages) when searching for information on self-education in basic diabetic knowledge, on self-care activities regarding the importance of dietetic food, medications and physical exercise, and on self-management of insulin pumps. Automatic extraction of corpus-based terminology from online texts, manuals or professional papers can help in building terminology lists or lists of "browsing phrases" useful in information retrieval or document indexing. Specific terminology lists represent an intermediate step between free-text search and controlled vocabulary, between users' demands and existing online resources in native and foreign languages. The research, aiming to detect the role of terminology in online resources, is conducted on English and Croatian manuals and Croatian online texts, and divided into three interrelated parts: i) comparison of professional and popular terminology use; ii) evaluation of automatic statistically-based terminology extraction on English and Croatian texts; and iii) comparison and evaluation of extracted terminology performed on an English manual using statistical and hybrid approaches. Extracted terminology candidates are evaluated by comparison with three types of reference lists: a list created by a professional medical person, a list of highly professional vocabulary contained in MeSH, and a list created by non-medical persons, made as the intersection of 15 lists. Results report on the use of popular and professional terminology in online diabetes resources, on the evaluation of automatically extracted terminology candidates in English and Croatian texts, and on the comparison of statistical and hybrid extraction methods in English text. Evaluation of automatic and semi-automatic terminology extraction methods is performed by recall
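
    A frequency-based baseline for statistical terminology extraction, with recall against a reference list, fits in a few lines of Python. This is a generic baseline for illustration, not the exact extractor evaluated in the study; the example text and reference terms are invented:

        import re
        from collections import Counter

        def term_candidates(text, max_len=3, min_freq=2):
            """Collect recurring word n-grams (n <= max_len) as candidate terms."""
            words = re.findall(r"[a-zA-Z]+", text.lower())
            grams = Counter()
            for n in range(1, max_len + 1):
                grams.update(" ".join(words[i:i + n]) for i in range(len(words) - n + 1))
            return {g for g, c in grams.items() if c >= min_freq}

        def recall(candidates, reference_terms):
            """Share of reference-list terms recovered by the extractor."""
            return len(candidates & reference_terms) / len(reference_terms)

        text = ("Insulin pump users should check blood glucose before meals. "
                "The insulin pump delivers basal insulin; blood glucose targets vary.")
        print(recall(term_candidates(text), {"insulin pump", "blood glucose", "basal rate"}))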

  8. Clinical Information Systems Integration in New York City's First Mobile Stroke Unit.

    Science.gov (United States)

    Kummer, Benjamin R; Lerario, Michael P; Navi, Babak B; Ganzman, Adam C; Ribaudo, Daniel; Mir, Saad A; Pishanidar, Sammy; Lekic, Tim; Williams, Olajide; Kamel, Hooman; Marshall, Randolph S; Hripcsak, George; Elkind, Mitchell S V; Fink, Matthew E

    2018-01-01

    Mobile stroke units (MSUs) reduce time to thrombolytic therapy in acute ischemic stroke. These units are widely used, but the clinical information systems underlying MSU operations are understudied. The first MSU on the East Coast of the United States was established at New York Presbyterian Hospital (NYP) in October 2016. We describe our program's 7-month pilot, focusing on the integration of our hospital's clinical information systems into our MSU to support patient care and research efforts. NYP's MSU was staffed by two paramedics, one radiology technologist, and a vascular neurologist. The unit was equipped with four laptop computers and networking infrastructure enabling all staff to access the hospital intranet and clinical applications during operating hours. A telephone-based registration procedure registered patients from the field into our admit/discharge/transfer system, which interfaced with the institutional electronic health record (EHR). We developed and implemented a computerized physician order entry set in our EHR with prefilled values to permit quick ordering of medications, imaging, and laboratory testing. We also developed and implemented a structured clinician note to facilitate care documentation and clinical data extraction. Our MSU began operating on October 3, 2016. As of April 27, 2017, the MSU transported 49 patients, of whom 16 received tissue plasminogen activator (t-PA). Zero technical problems impacting patient care were reported around registration, order entry, or intranet access. Two onboard network failures occurred, resulting in computed tomography scanner malfunctions, although no patients became ineligible for time-sensitive treatment as a result. Thirteen (26.5%) clinical notes contained at least one incomplete time field. The main technical challenges encountered during the integration of our hospital's clinical information systems into our MSU were onboard network failures and incomplete clinical documentation. Future

  9. Integrating Information & Communications Technologies into the Classroom

    Science.gov (United States)

    Tomei, Lawrence, Ed.

    2007-01-01

    "Integrating Information & Communications Technologies Into the Classroom" examines topics critical to business, computer science, and information technology education, such as: school improvement and reform, standards-based technology education programs, data-driven decision making, and strategic technology education planning. This book also…

  10. Portable blood extraction device integrated with biomedical monitoring system

    Science.gov (United States)

    Khumpuang, S.; Horade, M.; Fujioka, K.; Sugiyama, S.

    2006-01-01

    A painless, portable blood extraction device has emerged from the miniaturization of biomedical research, particularly in the manufacture of point-of-care systems. The fabrication of a blood extraction device integrated with an electrolyte-monitoring system is reported in this paper. The device has the advantages of precisely controlled dosage of the extracted blood and only slight damage to blood vessels and the nervous system. In-house blood diagnostics thus become simple for patients. The main components of the portable system are the blood extraction device and the electrolyte-monitoring system. The monitoring system consists of an ISFET (Ion Selective Field Effect Transistor) for measuring the concentration levels of minerals in blood. In this work, we measured the levels of three ions: Na+, K+ and Cl-. Measurement of these ions is frequently required, since their concentration levels in the blood can indicate whether the kidney, pancreas, liver or heart is malfunctioning. The fabrication of the whole system and experiments on each ISM (Ion Sensitive Membrane) are described. Taking advantage of LIGA technology, 100 hollow microneedles fabricated by synchrotron-radiation deep X-ray lithography through the PCT (Plane-pattern to Cross-section Transfer) technique were arrayed in a 5x5 mm2 area. Each microneedle is 300 μm in base diameter, with 500 μm pitch, 800 μm height and a 50 μm hole diameter. The total size of the blood extraction device is 2x2x2 cm3. The package is made from a plastic socket with slots for inserting the microneedle array and the ISFET, connected to an electrical circuit for monitoring. Through a dimensional design for simple handling and the selection of disposable materials, patients can self-evaluate critical levels of body minerals anywhere and anytime.

  11. [Extraction of buildings three-dimensional information from high-resolution satellite imagery based on Barista software].

    Science.gov (United States)

    Zhang, Pei-feng; Hu, Yuan-man; He, Hong-shi

    2010-05-01

    The demand for accurate and up-to-date spatial information on urban buildings is becoming more and more important for urban planning, environmental protection, and other vocations. Today's commercial high-resolution satellite imagery offers the potential to extract three-dimensional information on urban buildings. This paper extracted the three-dimensional information of urban buildings from QuickBird imagery and validated the precision of the extraction based on Barista software. It was shown that the extraction of three-dimensional building information from high-resolution satellite imagery based on Barista software has the advantages of low demands on professional expertise, broad applicability, simple operation, and high precision. Point positioning and height determination accuracy at the one-pixel level could be achieved if the digital elevation model (DEM) and sensor orientation model had high precision and the off-nadir view angle was favorable.

  12. Broad knowledge of information technologies: a prerequisite for the effective management of the integrated information system

    Energy Technology Data Exchange (ETDEWEB)

    Landau, H.B.

    1980-09-01

    There is a trend towards bringing various information technologies together into integrated information systems. The managers of these total systems must therefore be familiar with each of the component technologies and with how they may be combined into a total information system. To accomplish this, the effective manager should first define the overall system as an integrated flow of information with each step identified; then, the alternative technologies applicable to each step may be selected. Methods of becoming technologically aware are suggested and examples of integrated systems are discussed.

  13. Information Extraction of High Resolution Remote Sensing Images Based on the Calculation of Optimal Segmentation Parameters

    Science.gov (United States)

    Zhu, Hongchun; Cai, Lijie; Liu, Haiying; Huang, Wei

    2016-01-01

    Multi-scale image segmentation and the selection of optimal segmentation parameters are the key processes in object-oriented information extraction from high-resolution remote sensing images. The accuracy of thematic remote sensing information depends on this extraction. On the basis of WorldView-2 high-resolution data, this study developed a method for calculating optimal segmentation parameters for object-oriented image segmentation and high-resolution image information extraction, as follows. Firstly, the best combination of bands and weights was determined for the information extraction of the high-resolution remote sensing image. An improved weighted mean-variance method was proposed and used to calculate the optimal segmentation scale. Thereafter, the best shape factor and compactness factor parameters were computed with the use of control variables and a combination of heterogeneity and homogeneity indexes. Different types of image segmentation parameters were obtained according to the surface features. The high-resolution remote sensing images were multi-scale segmented with the optimal segmentation parameters. A hierarchical network structure was established by setting information extraction rules to achieve object-oriented information extraction. This study presents an effective and practical method that can explain expert judgment through reproducible quantitative measurements. Furthermore, the results of this procedure may be incorporated into a classification scheme. PMID:27362762
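
    A minimal sketch of the scale-selection idea: for each candidate scale, compute an area-weighted mean of within-segment spectral variance and look for the scale at which the criterion changes most sharply. The weighting scheme below is one plausible reading of the abstract's "weighted mean-variance" criterion, not the authors' exact formula.

```python
# Minimal sketch of area-weighted mean-variance scale selection.
# `segmentations` maps a candidate scale to a label image; the weighting
# is a plausible reading of the abstract, not the authors' exact formula.
import numpy as np

def weighted_mean_variance(image: np.ndarray, labels: np.ndarray) -> float:
    total = 0.0
    for lab in np.unique(labels):
        pixels = image[labels == lab]
        total += pixels.size * pixels.var()   # area-weighted variance
    return total / image.size

def pick_scale(image: np.ndarray, segmentations: dict) -> float:
    scores = {s: weighted_mean_variance(image, lab)
              for s, lab in segmentations.items()}
    scales = sorted(scores)
    # Choose the scale just before the sharpest jump in the criterion.
    jumps = {scales[i]: scores[scales[i + 1]] - scores[scales[i]]
             for i in range(len(scales) - 1)}
    return max(jumps, key=jumps.get)
```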

  14. Principles and core functions of integrated child health information systems.

    Science.gov (United States)

    Hinman, Alan R; Atkinson, Delton; Diehn, Tonya Norvell; Eichwald, John; Heberer, Jennifer; Hoyle, Therese; King, Pam; Kossack, Robert E; Williams, Donna C; Zimmerman, Amy

    2004-11-01

    Infants undergo a series of preventive and therapeutic health interventions and activities. Typically, each activity includes collection and submission of data to a dedicated information system. Subsequently, health care providers, families, and health programs must query each information system to determine the child's status in a given area. Efforts are underway to integrate information in these separate information systems. This requires specifying the core functions that integrated information systems must perform.

  15. An integrated organisation-wide data quality management and information governance framework: theoretical underpinnings.

    Science.gov (United States)

    Liaw, Siaw-Teng; Pearce, Christopher; Liyanage, Harshana; Liaw, Gladys S S; de Lusignan, Simon

    2014-01-01

    Increasing investment in eHealth aims to improve cost effectiveness and safety of care. Data extraction and aggregation can create new data products to improve professional practice and provide feedback to improve the quality of source data. A previous systematic review concluded that locally relevant clinical indicators and use of clinical record systems could support clinical governance. We aimed to extend and update the review with a theoretical framework. We searched PubMed, Medline, Web of Science, ABI Inform (Proquest) and Business Source Premier (EBSCO) using the terms curation, information ecosystem, data quality management (DQM), data governance, information governance (IG) and data stewardship. We focused on and analysed the scope of DQM and IG processes, theoretical frameworks, and determinants of the processing, quality assurance, presentation and sharing of data across the enterprise. There are good theoretical reasons for integrated governance, but there is variable alignment of DQM, IG and health system objectives across the health enterprise. Ethical constraints exist that require health information ecosystems to process data in ways that are aligned with improving health and system efficiency and ensuring patient safety. Despite an increasingly 'big-data' environment, DQM and IG in health services are still fragmented across the data production cycle. We extend current work on DQM and IG with a theoretical framework for integrated IG across the data cycle. The dimensions of this theory-based framework would require testing with qualitative and quantitative studies to examine the applicability and utility, along with an evaluation of its impact on data quality across the health enterprise.

  16. A cascade of classifiers for extracting medication information from discharge summaries

    Directory of Open Access Journals (Sweden)

    Halgrim Scott

    2011-07-01

    Full Text Available Abstract Background Extracting medication information from clinical records has many potential applications, and recently published research, systems, and competitions reflect an interest therein. Much of the early extraction work involved rules and lexicons, but more recently machine learning has been applied to the task. Methods We present a hybrid system consisting of two parts. The first part, field detection, uses a cascade of statistical classifiers to identify medication-related named entities. The second part uses simple heuristics to link those entities into medication events. Results The system achieved performance that is comparable to other approaches to the same task. This performance is further improved by adding features that reference external medication name lists. Conclusions This study demonstrates that our hybrid approach outperforms purely statistical or rule-based systems. The study also shows that a cascade of classifiers works better than a single classifier in extracting medication information. The system is available as is upon request from the first author.
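
    To illustrate the cascade idea in general terms, the sketch below chains a binary relevance filter with a downstream multiclass field classifier, so the second stage only sees tokens the first stage passes. The feature extraction and models are placeholder choices, not the paper's system.

```python
# Generic sketch of a two-stage classifier cascade: a binary filter
# followed by a multiclass field classifier that only sees tokens the
# filter passed. Models and features are placeholders, not the paper's.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

stage1 = make_pipeline(CountVectorizer(ngram_range=(1, 2)),
                       LogisticRegression(max_iter=1000))
stage2 = make_pipeline(CountVectorizer(ngram_range=(1, 2)),
                       LogisticRegression(max_iter=1000))

def train(tokens, is_med_related, field_labels):
    stage1.fit(tokens, is_med_related)
    related = [t for t, y in zip(tokens, is_med_related) if y]
    fields = [f for f, y in zip(field_labels, is_med_related) if y]
    stage2.fit(related, fields)

def predict(token: str) -> str:
    if stage1.predict([token])[0]:
        return stage2.predict([token])[0]   # e.g. DRUG, DOSE, FREQUENCY
    return "O"                              # not medication-related
```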

  17. Curriculum integrated information literacy: a challenge

    DEFF Research Database (Denmark)

    Bønløkke, Mette; Kobow, Else; Kristensen, Anne-Kirstine Østergaard

    2012-01-01

    Information literacy is a competence needed for students and for practitioners in the nursing profession. A curriculum integrated intervention was qualitatively evaluated by focus group interviews of students, lecturers and the university librarian. Information literacy makes sense for students...... when it is linked to assignments, timed right, prepared, systematic and continuous. Support is needed to help students understand the meaning of seeking information, to focus their problem and to make them reflect on their search and its results. Feedback on materials used is also asked for...

  18. Rapid automatic keyword extraction for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J [Richland, WA]; Cowley, Wendy E [Richland, WA]; Crow, Vernon L [Richland, WA]; Cramer, Nicholas O [Richland, WA]

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
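
    The scoring the abstract describes (word scores from co-occurrence degree and frequency, keyword scores as the sum of member word scores) can be sketched directly; the delimiter and stop-word handling below are simplified.

```python
# Sketch of RAKE-style scoring as described in the abstract: split text
# into candidate phrases at stop words/delimiters, score each word by
# degree/frequency, and score a candidate by the sum of its word scores.
import re
from collections import defaultdict

STOP = {"the", "a", "an", "of", "and", "or", "in", "on", "to", "is", "for"}

def rake(text: str, top_k: int = 5):
    words = re.findall(r"[a-z]+|[.,;:!?]", text.lower())
    phrases, current = [], []
    for w in words:
        if w in STOP or not w.isalpha():
            if current:
                phrases.append(current)
            current = []
        else:
            current.append(w)
    if current:
        phrases.append(current)

    freq, degree = defaultdict(int), defaultdict(int)
    for p in phrases:
        for w in p:
            freq[w] += 1
            degree[w] += len(p)          # co-occurrence degree
    score = {w: degree[w] / freq[w] for w in freq}
    ranked = sorted(phrases, key=lambda p: sum(score[w] for w in p),
                    reverse=True)
    return [" ".join(p) for p in ranked[:top_k]]

print(rake("Rapid automatic keyword extraction supports information "
           "retrieval and analysis of individual documents."))
```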

  19. Association of Informal Clinical Integration of Physicians With Cardiac Surgery Payments.

    Science.gov (United States)

    Funk, Russell J; Owen-Smith, Jason; Kaufman, Samuel A; Nallamothu, Brahmajee K; Hollingsworth, John M

    2018-05-01

    To reduce inefficiency and waste associated with care fragmentation, many current programs target greater clinical integration among physicians. However, these programs have led to only modest Medicare spending reductions. Most programs focus on formal integration, which often bears little resemblance to actual physician interaction patterns. This study examines how physician interaction patterns vary between health systems and assesses whether variation in informal integration is associated with care delivery payments. National Medicare data from January 1, 2008, through December 31, 2011, identified 253 545 Medicare beneficiaries (aged ≥66 years) from 1186 health systems where Medicare beneficiaries underwent coronary artery bypass grafting (CABG) procedures. Interactions were mapped between all physicians who treated these patients, including primary care physicians and surgical and medical specialists, within a health system during their surgical episode. The level of informal integration was measured in these networks of interacting physicians. Multivariate regression models were fitted to evaluate associations between payments for each surgical episode made on a beneficiary's behalf and the level of informal integration in the health system where the patient was treated. The exposure was the informal integration level of a health system; the outcomes were price-standardized total surgical episode and component payments. The total 253 545 study participants included 175 520 men (69.2%; mean [SD] age, 74.51 [5.75] years) and 78 024 women (34.3%; 75.67 [5.91] years). One beneficiary of the 253 545 participants did not have sex information. The low level of informal clinical integration included 84 598 patients (33.4%; mean [SD] age, 75.00 [5.93] years); medium level, 84 442 (33.30%; 74.94 [5.87] years); and high level, 84 505 (33.34%; 74.66 [5.72] years). Informal integration levels varied across health systems. After adjusting for patient, health-system, and community factors, higher levels

  20. Health Information Infrastructure for People with Intellectual and Developmental Disabilities (I/DD) Living in Supported Accommodation: Communication, Co-Ordination and Integration of Health Information.

    Science.gov (United States)

    Dahm, Maria R; Georgiou, Andrew; Balandin, Susan; Hill, Sophie; Hemsley, Bronwyn

    2017-10-25

    People with intellectual and/or developmental disability (I/DD) commonly have complex health care needs, but little is known about how their health information is managed in supported accommodation and across health service providers. This study aimed to describe the current health information infrastructure (i.e., how data and information are collected, stored, communicated, and used) for people with I/DD living in supported accommodation in Australia. It involved a scoping review and synthesis of research, policies, and health documents relevant to this setting. Iterative database and hand searches were conducted across peer-reviewed articles internationally in English and grey literature in Australia (New South Wales) up to September 2015. Data were extracted from the selected relevant literature and analyzed for content themes. Expert stakeholders were consulted to verify the authors' interpretations of the information and content categories. The 286 included sources (peer-reviewed n = 27; grey literature n = 259) reflect that the health information for people with I/DD in supported accommodation is poorly communicated, coordinated and integrated across isolated systems. 'Work-as-imagined', as outlined in policies, does not align with 'work-as-done' in reality. This gap threatens the quality of care and safety of people with I/DD in these settings. The effectiveness of the health information infrastructure and services for people with I/DD can be improved by integrating the information sources and placing people with I/DD and their supporters at the centre of the information exchange process.

  1. Integrated information in discrete dynamical systems: motivation and theoretical framework.

    Directory of Open Access Journals (Sweden)

    David Balduzzi

    2008-06-01

    Full Text Available This paper introduces a time- and state-dependent measure of integrated information, phi, which captures the repertoire of causal states available to a system as a whole. Specifically, phi quantifies how much information is generated (uncertainty is reduced) when a system enters a particular state through causal interactions among its elements, above and beyond the information generated independently by its parts. Such mathematical characterization is motivated by the observation that integrated information captures two key phenomenological properties of consciousness: (i) there is a large repertoire of conscious experiences so that, when one particular experience occurs, it generates a large amount of information by ruling out all the others; and (ii) this information is integrated, in that each experience appears as a whole that cannot be decomposed into independent parts. This paper extends previous work on stationary systems and applies integrated information to discrete networks as a function of their dynamics and causal architecture. An analysis of basic examples indicates the following: (i) phi varies depending on the state entered by a network, being higher if active and inactive elements are balanced and lower if the network is inactive or hyperactive. (ii) phi varies for systems with identical or similar surface dynamics depending on the underlying causal architecture, being low for systems that merely copy or replay activity states. (iii) phi varies as a function of network architecture. High phi values can be obtained by architectures that conjoin functional specialization with functional integration. Strictly modular and homogeneous systems cannot generate high phi because the former lack integration, whereas the latter lack information. Feedforward and lattice architectures are capable of generating high phi but are inefficient. (iv) In Hopfield networks, phi is low for attractor states and neutral states, but increases if the networks

  2. Integrated information in discrete dynamical systems: motivation and theoretical framework.

    Science.gov (United States)

    Balduzzi, David; Tononi, Giulio

    2008-06-13

    This paper introduces a time- and state-dependent measure of integrated information, phi, which captures the repertoire of causal states available to a system as a whole. Specifically, phi quantifies how much information is generated (uncertainty is reduced) when a system enters a particular state through causal interactions among its elements, above and beyond the information generated independently by its parts. Such mathematical characterization is motivated by the observation that integrated information captures two key phenomenological properties of consciousness: (i) there is a large repertoire of conscious experiences so that, when one particular experience occurs, it generates a large amount of information by ruling out all the others; and (ii) this information is integrated, in that each experience appears as a whole that cannot be decomposed into independent parts. This paper extends previous work on stationary systems and applies integrated information to discrete networks as a function of their dynamics and causal architecture. An analysis of basic examples indicates the following: (i) phi varies depending on the state entered by a network, being higher if active and inactive elements are balanced and lower if the network is inactive or hyperactive. (ii) phi varies for systems with identical or similar surface dynamics depending on the underlying causal architecture, being low for systems that merely copy or replay activity states. (iii) phi varies as a function of network architecture. High phi values can be obtained by architectures that conjoin functional specialization with functional integration. Strictly modular and homogeneous systems cannot generate high phi because the former lack integration, whereas the latter lack information. Feedforward and lattice architectures are capable of generating high phi but are inefficient. (iv) In Hopfield networks, phi is low for attractor states and neutral states, but increases if the networks are optimized
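
    In symbols, the measure described here can be sketched as follows; the notation is a reconstruction consistent with the abstract (effective information of a partition as a divergence between whole-system and partitioned repertoires), not a verbatim quotation of the paper's definitions.

```latex
% Sketch of the quantities described in the abstract (notation assumed).
% Effective information of state x_1 across a partition P of the system
% into parts M^1, ..., M^r: how much the whole's a posteriori repertoire
% differs from the product of its parts' repertoires.
\[
  \mathrm{ei}\bigl(x_1 ; P\bigr)
    = D_{\mathrm{KL}}\!\Bigl( p\bigl(X_0 \mid x_1\bigr)
      \,\Big\|\, \prod_{k=1}^{r} p\bigl(M_0^k \mid \mu_1^k\bigr) \Bigr)
\]
% Integrated information is effective information across the minimum
% information partition (MIP), with N_P a normalization for partition
% size (details in the paper):
\[
  \varphi(x_1) = \mathrm{ei}\bigl(x_1 ; P^{\mathrm{MIP}}\bigr),
  \qquad
  P^{\mathrm{MIP}} = \arg\min_{P} \frac{\mathrm{ei}(x_1 ; P)}{N_P}
\]
```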

  3. Earth science information: Planning for the integration and use of global change information

    Science.gov (United States)

    Lousma, Jack R.

    1992-01-01

    Activities and accomplishments of the first six months of the Consortium for International Earth Science Information Network's (CIESIN) 1992 technical program have focused on four main missions: (1) the development and implementation of plans for initiation of the Socioeconomic Data and Applications Center (SEDAC) as part of the EOSDIS Program; (2) the pursuit and development of a broad-based global change information cooperative by providing systems analysis and integration between natural science and social science data bases held by numerous federal agencies and other sources; (3) the fostering of scientific research into the human dimensions of global change and providing integration between natural science and social science data and information; and (4) the serving of CIESIN as a gateway for global change data and information distribution through development of the Global Change Research Information Office and other comprehensive knowledge sharing systems.

  4. Information delivery manuals to integrate building product information into design

    DEFF Research Database (Denmark)

    Berard, Ole Bengt; Karlshøj, Jan

    2011-01-01

    Despite continuing BIM progress, professionals in the AEC industry often lack the information they need to perform their work. Although this problem could be alleviated by information systems similar to those in other industries, companies struggle to model processes and information needs...... them in information systems. BIM implies that objects are bearers of information and logic. The present study has three main aims: (1) to explore IDM's capability to capture all four perspectives, (2) to determine whether an IDM's collaborative methodology is valid for developing standardized processes......, and (3) to ascertain whether IDM's business rules can support the development of information and logic-bearing BIM objects. The research is based on a case study of re-engineering the bidding process for a design-build project to integrate building product manufacturers, subcontractors

  5. Integrated plant information technology design support functionality

    International Nuclear Information System (INIS)

    Kim, Yeon Seung; Kim, Dae Jin; Barber, P. W.; Goland, D.

    1996-06-01

    This technical report was written as a result of the Integrated Plant Information System (IPIS) feasibility study on the CANDU 9 project, carried out from January 1994 to March 1994 at AECL (Atomic Energy of Canada Limited) in Canada. Since 1987, AECL had endeavoured to change its engineering work process from a paper-based to a computer-based one through the CANDU 3 project. Although AECL obtained many good results from computerizing process engineering, instrumentation, control and electrical engineering, mechanical engineering, computer-aided design and drafting, and document management, the problem of information isolation and integration remained. In this feasibility study, an IPIS design support functionality guideline was suggested by evaluating current AECL CAE tools, analyzing computer-aided engineering tasks and work flow, investigating requirements for implementing integrated computer-aided engineering, and describing Korean requirements for future CANDU designs, including CANDU 9. 6 figs. (Author)

  6. Integrated plant information technology design support functionality

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yeon Seung; Kim, Dae Jin [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of); Barber, P W; Goland, D [Atomic Energy Canada Ltd., (Canada)

    1996-06-01

    This technical report was written as a result of the Integrated Plant Information System (IPIS) feasibility study on the CANDU 9 project, carried out from January 1994 to March 1994 at AECL (Atomic Energy of Canada Limited) in Canada. Since 1987, AECL had endeavoured to change its engineering work process from a paper-based to a computer-based one through the CANDU 3 project. Although AECL obtained many good results from computerizing process engineering, instrumentation, control and electrical engineering, mechanical engineering, computer-aided design and drafting, and document management, the problem of information isolation and integration remained. In this feasibility study, an IPIS design support functionality guideline was suggested by evaluating current AECL CAE tools, analyzing computer-aided engineering tasks and work flow, investigating requirements for implementing integrated computer-aided engineering, and describing Korean requirements for future CANDU designs, including CANDU 9. 6 figs. (Author).

  7. Integrated occupational radiation exposure information system

    International Nuclear Information System (INIS)

    Hunt, H.W.

    1983-06-01

    The integrated Occupational Radiation Exposure (ORE) database information system has many advantages. Radiation exposure information is available to operating management in a more timely manner and in a more flexible mode. The ORE system has permitted scattered files and data to be integrated and stored in a more cost-effective way that permits easy and simultaneous access by a variety of users with different data needs. The external storage needs of the radiation exposure source documents are several orders of magnitude smaller through the use of the computer-assisted retrieval techniques employed in the ORE system. Groundwork is being laid to automate the historical files, which are maintained to help describe the radiation protection programs and policies at any one point in time. The file unit will be microfilmed for topical indexing in the ORE database.

  8. Cortical integrity of the inferior alveolar canal as a predictor of paresthesia after third-molar extraction.

    Science.gov (United States)

    Park, Wonse; Choi, Ji-Wook; Kim, Jae-Young; Kim, Bong-Chul; Kim, Hyung Jun; Lee, Sang-Hwy

    2010-03-01

    Paresthesia is a well-known complication of extraction of mandibular third molars (MTMs). The authors evaluated the relationship between paresthesia after MTM extraction and the cortical integrity of the inferior alveolar canal (IAC) by using computed tomography (CT). The authors designed a retrospective cohort study involving participants considered, on the basis of panoramic imaging, to be at high risk of experiencing injury of the inferior alveolar nerve who subsequently underwent CT imaging and extraction of the MTMs. The primary predictor variable was the contact relationship between the IAC and the MTM as viewed on a CT image, classified into three groups: group 1, no contact; group 2, contact between the MTM and the intact IAC cortex; group 3, contact between the MTM and the interrupted IAC cortex. The secondary predictor variable was the number of CT image slices showing the cortical interruption around the MTM. The outcome variable was the presence or absence of postoperative paresthesia after MTM extraction. The study sample comprised 179 participants who underwent MTM extraction (a total of 259 MTMs). Their mean age was 23.6 years, and 85 (47.5 percent) were male. The overall prevalence of paresthesia was 4.2 percent (11 of 259 teeth). The prevalence of paresthesia in group 3 (involving an interrupted IAC cortex) was 11.8 percent (10 of 85 cases), while for group 2 (involving an intact IAC cortex) and group 1 (involving no contact) it was 1.0 percent (1 of 98 cases) and 0.0 percent (no cases), respectively. The frequency of nerve damage increased with the number of CT image slices showing loss of cortical integrity (P=.043). The results of this study indicate that loss of IAC cortical integrity is associated with an increased risk of experiencing paresthesia after MTM extraction.

  9. Distant supervision for neural relation extraction integrated with word attention and property features.

    Science.gov (United States)

    Qu, Jianfeng; Ouyang, Dantong; Hua, Wen; Ye, Yuxin; Li, Ximing

    2018-04-01

    Distant supervision for neural relation extraction is an efficient approach to extracting large numbers of relations from plain text. However, the existing neural methods fail to capture the critical words in sentence encoding and meanwhile lack useful sentence information for some positive training instances. To address the above issues, we propose a novel neural relation extraction model. First, we develop a word-level attention mechanism to distinguish the importance of each individual word in a sentence, increasing the attention weights for those critical words. Second, we investigate the semantic information from word embeddings of target entities, which can be developed as a supplementary feature for the extractor. Experimental results show that our model outperforms previous state-of-the-art baselines. Copyright © 2018 Elsevier Ltd. All rights reserved.
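
    The word-level attention the abstract describes can be sketched as a learned scoring of each token representation followed by a softmax-weighted sum; the dimensions and scoring form below (plain numpy) are assumptions for illustration, not the paper's exact architecture.

```python
# Sketch of word-level attention over token representations: score each
# word, softmax the scores, and build the sentence vector as the
# weighted sum. The scoring form and dimensions are illustrative.
import numpy as np

rng = np.random.default_rng(0)
d = 8                                   # token embedding size (assumed)
H = rng.normal(size=(5, d))             # 5 token vectors for one sentence

W = rng.normal(size=(d, d))             # attention parameters (learned
w = rng.normal(size=d)                  # in a real model, random here)

scores = np.tanh(H @ W) @ w             # one scalar score per word
alpha = np.exp(scores) / np.exp(scores).sum()   # softmax weights
sentence_vec = alpha @ H                # critical words weigh more

print(np.round(alpha, 3), sentence_vec.shape)
```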

  10. Robust Vehicle and Traffic Information Extraction for Highway Surveillance

    Directory of Open Access Journals (Sweden)

    Yeh Chia-Hung

    2005-01-01

    Full Text Available A robust vision-based traffic monitoring system for vehicle and traffic information extraction is developed in this research. It is challenging to maintain detection robustness at all times in a highway surveillance system. There are three major problems in detecting and tracking a vehicle: (1) the moving cast shadow effect, (2) the occlusion effect, and (3) nighttime detection. For moving cast shadow elimination, a 2D joint vehicle-shadow model is employed. For occlusion detection, a multiple-camera system is used to detect occlusion so as to extract the exact location of each vehicle. For vehicle nighttime detection, a rear-view monitoring technique is proposed to maintain tracking and detection accuracy. Furthermore, we propose a method to improve the accuracy of background extraction, which usually serves as the first step in any vehicle detection processing. Experimental results are given to demonstrate that the proposed techniques are effective and efficient for vision-based highway surveillance.
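
    Since background extraction "usually serves as the first step in any vehicle detection processing," a minimal running-average background model is sketched below; the learning rate and threshold are placeholder values, not the paper's improved method.

```python
# Minimal running-average background model with frame differencing.
# Learning rate and threshold are placeholders; the paper proposes an
# improved background-extraction method, not reproduced here.
import numpy as np

def update_background(bg: np.ndarray, frame: np.ndarray,
                      alpha: float = 0.05) -> np.ndarray:
    return (1 - alpha) * bg + alpha * frame

def foreground_mask(bg: np.ndarray, frame: np.ndarray,
                    thresh: float = 25.0) -> np.ndarray:
    return np.abs(frame - bg) > thresh   # True where a vehicle may be

bg = np.zeros((240, 320))
for frame in np.random.default_rng(1).uniform(0, 255, (10, 240, 320)):
    mask = foreground_mask(bg, frame)
    bg = update_background(bg, frame)
```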

  11. 48 CFR 9.104-6 - Federal Awardee Performance and Integrity Information System.

    Science.gov (United States)

    2010-10-01

    ... Performance and Integrity Information System. 9.104-6 Section 9.104-6 Federal Acquisition Regulations System... Contractors 9.104-6 Federal Awardee Performance and Integrity Information System. (a) Before awarding a... Federal Awardee Performance and Integrity Information System (FAPIIS), (available at www.ppirs.gov, then...

  12. Assessment of Integrated Information System (IIS) in organization ...

    African Journals Online (AJOL)

    Assessment of Integrated Information System (IIS) in organization. ... to enable the Information System (IS) managers, as well as top management to understand the ... since organisational and strategic aspects in IIS should also be considered.

  13. Design of the Hospital Integrated Information Management System Based on Cloud Platform.

    Science.gov (United States)

    Aijing, L; Jin, Y

    2015-12-01

    At present, the outdated information management style cannot meet the needs of hospital management and has become a bottleneck for hospital management and development. In order to improve the integrated management of information, hospitals have increased their investment in integrated information management systems. Owing to a lack of reasonable and scientific design, some hospital integrated information management systems have common problems, such as unfriendly interfaces, poor portability and maintainability, low security and efficiency, and a lack of interactivity and information sharing. To solve these problems, this paper carries out the research and design of a hospital information management system based on a cloud platform, which can realize the optimized integration of hospital information resources and save money.

  14. Development of Integrated Information System for Travel Bureau Company

    Science.gov (United States)

    Karma, I. G. M.; Susanti, J.

    2018-01-01

    In decision-making by the management of a travel bureau company, especially by managers, information is frequently delayed or incomplete. Although already computer-assisted, each existing application handles only one particular activity and is not integrated. This research is intended to produce an integrated information system that handles the overall operational activities of the company. By applying the object-oriented system development approach, the system is built with the Visual Basic .NET programming language and the MySQL database package. The result is a system that consists of 4 (four) separate program packages: a Reservation System, an AR System, an AP System and an Accounting System. Based on the output, we can conclude that this system is able to produce integrated information on reservation, operational and financial matters, providing up-to-date information to support operational activities and the decision-making process by related parties.

  15. Review of Statistical Learning Methods in Integrated Omics Studies (An Integrated Information Science).

    Science.gov (United States)

    Zeng, Irene Sui Lan; Lumley, Thomas

    2018-01-01

    Integrated omics is becoming a new channel for investigating the complex molecular system in modern biological science and sets a foundation for systematic learning for precision medicine. The statistical/machine learning methods that have emerged in the past decade for integrated omics are not only innovative but also multidisciplinary, with integrated knowledge in biology, medicine, statistics, machine learning, and artificial intelligence. Here, we review the nontrivial classes of learning methods from a statistical perspective and streamline these learning methods within the statistical learning framework. The intriguing findings from the review are that the methods used are generalizable to other disciplines with complex systematic structure, and that integrated omics is part of an integrated information science which has collated and integrated different types of information for inferences and decision making. We review the statistical learning methods of exploratory and supervised learning from 42 publications. We also discuss the strengths and limitations of the extended principal component analysis, cluster analysis, network analysis, and regression methods. Statistical techniques such as penalization for sparsity induction when there are fewer observations than the number of features, and the use of Bayesian approaches when there is prior knowledge to be integrated, are also included in the commentary. For the completeness of the review, a table of currently available software and packages from 23 publications for omics is provided in the appendix.

  16. Research of building information extraction and evaluation based on high-resolution remote-sensing imagery

    Science.gov (United States)

    Cao, Qiong; Gu, Lingjia; Ren, Ruizhi; Wang, Lang

    2016-09-01

    Building extraction is currently important in applications of high-resolution remote sensing imagery. At present, quite a few algorithms are available for detecting building information; however, most of them still have obvious disadvantages, such as ignoring spectral information and the trade-off between extraction rate and extraction accuracy. The purpose of this research is to develop an effective method to detect building information in Chinese GF-1 data. Firstly, image preprocessing is used to normalize the image, and image enhancement is used to highlight the useful information in the image. Secondly, multi-spectral information is analyzed. Subsequently, an improved morphological building index (IMBI) based on remote sensing imagery is proposed to obtain the candidate building objects. Furthermore, in order to refine the building objects and further remove false objects, post-processing (e.g., shape features, the vegetation index and the water index) is employed. To validate the effectiveness of the proposed algorithm, the omission errors (OE), commission errors (CE), overall accuracy (OA) and Kappa are used. The proposed method can not only effectively use spectral information and other basic features, but also avoid extracting excessive interference details from high-resolution remote sensing images. Compared to the original MBI algorithm, the proposed method reduces the OE by 33.14%. At the same time, the Kappa increases by 16.09%. In experiments, IMBI achieved satisfactory results and outperformed other algorithms in terms of both accuracy and visual inspection.
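
    A simplified sketch of the morphological-index idea behind MBI/IMBI: bright, compact structures respond strongly to white top-hat transforms across several structuring-element sizes. The scales and the use of scipy here are illustrative; the paper's IMBI adds spectral post-processing (vegetation and water indexes) not reproduced below.

```python
# Simplified morphological building index: average the white top-hat
# response of the brightness image over several structuring-element
# sizes. Scales are illustrative; IMBI's spectral post-processing
# (vegetation/water indexes, shape filtering) is omitted.
import numpy as np
from scipy import ndimage

def simple_mbi(bands: np.ndarray, sizes=(3, 7, 11, 15)) -> np.ndarray:
    brightness = bands.max(axis=0)      # max over spectral bands
    responses = [ndimage.white_tophat(brightness, size=s) for s in sizes]
    return np.mean(responses, axis=0)   # high values suggest buildings

bands = np.random.default_rng(2).uniform(0, 1, (4, 64, 64))
candidates = simple_mbi(bands) > 0.1    # threshold is a placeholder
```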

  17. Extracting Social Networks and Contact Information From Email and the Web

    National Research Council Canada - National Science Library

    Culotta, Aron; Bekkerman, Ron; McCallum, Andrew

    2005-01-01

    ...-suited for such information extraction tasks. By recursively calling itself on new people discovered on the Web, the system builds a social network with multiple degrees of separation from the user...

  18. An integrated organisation-wide data quality management and information governance framework: theoretical underpinnings

    Directory of Open Access Journals (Sweden)

    Siaw-Teng Liaw

    2014-10-01

    Full Text Available Introduction Increasing investment in eHealth aims to improve cost effectiveness and safety of care. Data extraction and aggregation can create new data products to improve professional practice and provide feedback to improve the quality of source data. A previous systematic review concluded that locally relevant clinical indicators and use of clinical record systems could support clinical governance. We aimed to extend and update the review with a theoretical framework. Methods We searched PubMed, Medline, Web of Science, ABI Inform (Proquest) and Business Source Premier (EBSCO) using the terms curation, information ecosystem, data quality management (DQM), data governance, information governance (IG) and data stewardship. We focused on and analysed the scope of DQM and IG processes, theoretical frameworks, and determinants of the processing, quality assurance, presentation and sharing of data across the enterprise. Findings There are good theoretical reasons for integrated governance, but there is variable alignment of DQM, IG and health system objectives across the health enterprise. Ethical constraints exist that require health information ecosystems to process data in ways that are aligned with improving health and system efficiency and ensuring patient safety. Despite an increasingly 'big-data' environment, DQM and IG in health services are still fragmented across the data production cycle. We extend current work on DQM and IG with a theoretical framework for integrated IG across the data cycle. Conclusions The dimensions of this theory-based framework would require testing with qualitative and quantitative studies to examine the applicability and utility, along with an evaluation of its impact on data quality across the health enterprise.

  19. Nuclear plants gain integrated information systems

    International Nuclear Information System (INIS)

    Villavicencio-Ramirez, A.; Rodriquez-Alvarez, J.M.

    1994-01-01

    With the objective of simplifying the complex mesh of computing devices employed within nuclear power plants, modern technology and integration techniques are being used to form centralized (but backed up) databases and distributed processing and display networks. Benefits are immediate as a result of the integration and the use of standards. The use of a unique data acquisition and database subsystem optimizes the high costs of engineering, as this task is done only once for the life span of the system. This also contributes towards a uniform user interface and allows for graceful expansion and maintenance. This article features an integrated information system, Sistema Integral de Informacion de Proceso (SIIP). The development of this system enabled the Laguna Verde Nuclear Power plant to fully use the already existing universe of signals and its related engineering during all plant conditions, namely, start up, normal operation, transient analysis, and emergency operation. Integrated systems offer many advantages over segregated systems, and this experience should benefit similar development efforts in other electric power utilities, not only for nuclear but also for other types of generating plants

  20. The impact of IAIMS on the work of information experts. Integrated Advanced Information Management Systems.

    Science.gov (United States)

    Ash, J

    1995-10-01

    Integrated Advanced Information Management Systems (IAIMS) programs differ but have certain characteristics in common. Technological and organizational integration are universal goals. As integration takes place, what happens to those implementing the vision? A survey of 125 staff members, or information experts, involved in information or informatics at an IAIMS-funded institution was conducted during the last year of the implementation phase. The purpose was to measure the impact of IAIMS on the jobs of those in the library and related service units, and the computing, telecommunications, and health informatics divisions. The researchers used newly developed scales measuring levels of integration (knowledge of and involvement with other departments), customer orientation (focus on the user), and informatedness (changes in the nature of work beyond automation of former routines). Ninety-four percent of respondents indicated that their jobs had changed a great deal; the changes were similar regardless of division. To further investigate the impact of IAIMS on librarians in particular, a separate skills survey was conducted. The IAIMS librarians indicated that technology and training skills are especially needed in the new, integrated environment.

  1. Automated microfluidic devices integrating solid-phase extraction, fluorescent labeling, and microchip electrophoresis for preterm birth biomarker analysis.

    Science.gov (United States)

    Sahore, Vishal; Sonker, Mukul; Nielsen, Anna V; Knob, Radim; Kumar, Suresh; Woolley, Adam T

    2018-01-01

    We have developed multichannel integrated microfluidic devices for automated preconcentration, labeling, purification, and separation of preterm birth (PTB) biomarkers. We fabricated multilayer poly(dimethylsiloxane)-cyclic olefin copolymer (PDMS-COC) devices that perform solid-phase extraction (SPE) and microchip electrophoresis (μCE) for automated PTB biomarker analysis. The PDMS control layer had a peristaltic pump and pneumatic valves for flow control, while the PDMS fluidic layer had five input reservoirs connected to microchannels and a μCE system. The COC layers had a reversed-phase octyl methacrylate porous polymer monolith for SPE and fluorescent labeling of PTB biomarkers. We determined μCE conditions for two PTB biomarkers, ferritin (Fer) and corticotropin-releasing factor (CRF). We used these integrated microfluidic devices to preconcentrate and purify off-chip-labeled Fer and CRF in an automated fashion. Finally, we performed a fully automated on-chip analysis of unlabeled PTB biomarkers, involving SPE, labeling, and μCE separation with 1 h total analysis time. These integrated systems have strong potential to be combined with upstream immunoaffinity extraction, offering a compact sample-to-answer biomarker analysis platform. Graphical abstract Pressure-actuated integrated microfluidic devices have been developed for automated solid-phase extraction, fluorescent labeling, and microchip electrophoresis of preterm birth biomarkers.

  2. Information Science and integrative Science. A sistemic approach to information units

    Directory of Open Access Journals (Sweden)

    Rita Dolores Santaella Ruiz

    2006-01-01

    Full Text Available Structured in two parts, "Documentation as an integrating science" and "A systems approach to documentary units", this work understands Documentation from an integrating perspective, derived from the kinship implied by a shared modus operandi across information systems through the use of communication technologies. Drawing on the General Theory of Systems, it interprets this multidisciplinary science as a system formed by technical subsystems, elements and individuals.

  3. Information Extraction of High-Resolution Remotely Sensed Image Based on Multiresolution Segmentation

    Directory of Open Access Journals (Sweden)

    Peng Shao

    2014-08-01

    Full Text Available The principle of multiresolution segmentation is presented in detail in this study, and the Canny algorithm was applied for edge detection of a remotely sensed image based on this principle. The target image was divided into regions based on object-oriented multiresolution segmentation and edge detection. Furthermore, an object hierarchy was created, and a series of features (water bodies, vegetation, roads, residential areas, bare land) and other information were extracted using spectral and geometrical features. The results indicate that edge detection has a positive effect on multiresolution segmentation, and the overall accuracy of information extraction reaches 94.6% according to the confusion matrix.
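
    Edge detection as a constraint on segmentation can be sketched with OpenCV's Canny implementation; the thresholds below are placeholders, and feeding the edges into a multiresolution segmenter is only indicated schematically.

```python
# Sketch: Canny edge detection as a precursor/constraint for
# segmentation. Thresholds are placeholders, not tuned values.
import cv2
import numpy as np

image = np.random.default_rng(3).integers(0, 255, (256, 256),
                                          dtype=np.uint8)
blurred = cv2.GaussianBlur(image, (5, 5), 1.5)   # suppress noise first
edges = cv2.Canny(blurred, 50, 150)              # hysteresis thresholds

# A segmenter could forbid region merges across detected edges, e.g. by
# masking the similarity of neighboring segments where edges lie.
edge_fraction = edges.mean() / 255
print(f"{edge_fraction:.1%} of pixels marked as edges")
```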

  4. The integrated approach methodology for operator information evaluation

    International Nuclear Information System (INIS)

    Stroube, K.; Modarres, M.; Roush, M.; Hunt, N.; Pearce, R.

    1986-01-01

    The Integrated Approach has developed a complete method for evaluating the relative importance of operator information improvements. By use of decision trees, the impact of information on the success probability of a function or system can be evaluated. This approach couples goal trees and human success likelihoods to estimate the anticipated consequences of a given information system.

  5. Metaproteomics: extracting and mining proteome information to characterize metabolic activities in microbial communities.

    Science.gov (United States)

    Abraham, Paul E; Giannone, Richard J; Xiong, Weili; Hettich, Robert L

    2014-06-17

    Contemporary microbial ecology studies usually employ one or more "omics" approaches to investigate the structure and function of microbial communities. Among these, metaproteomics aims to characterize the metabolic activities of the microbial membership, providing a direct link between the genetic potential and functional metabolism. The successful deployment of metaproteomics research depends on the integration of high-quality experimental and bioinformatic techniques for uncovering the metabolic activities of a microbial community in a way that is complementary to other "meta-omic" approaches. The essential, quality-defining informatics steps in metaproteomics investigations are: (1) construction of the metagenome, (2) functional annotation of predicted protein-coding genes, (3) protein database searching, (4) protein inference, and (5) extraction of metabolic information. In this article, we provide an overview of current bioinformatic approaches and software implementations in metaproteome studies in order to highlight the key considerations needed for successful implementation of this powerful community-biology tool. Copyright © 2014 John Wiley & Sons, Inc.

  6. Conceptual information processing: A robust approach to KBS-DBMS integration

    Science.gov (United States)

    Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond

    1987-01-01

    Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivates the investigation of information processing paradigms. One such paradigm is concept based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.

  7. Development of the Integrated Information Technology System

    National Research Council Canada - National Science Library

    2005-01-01

    The Integrated Medical Information Technology System (IMITS) Program is focused on implementation of advanced technology solutions that eliminate inefficiencies, increase utilization and improve quality of care for active duty forces...

  8. Extracting information from multiplex networks

    Science.gov (United States)

    Iacovacci, Jacopo; Bianconi, Ginestra

    2016-06-01

    Multiplex networks are generalized network structures that are able to describe networks in which the same set of nodes are connected by links that have different connotations. Multiplex networks are ubiquitous since they describe social, financial, engineering, and biological networks as well. Extending our ability to analyze complex networks to multiplex network structures increases greatly the level of information that is possible to extract from big data. For these reasons, characterizing the centrality of nodes in multiplex networks and finding new ways to solve challenging inference problems defined on multiplex networks are fundamental questions of network science. In this paper, we discuss the relevance of the Multiplex PageRank algorithm for measuring the centrality of nodes in multilayer networks and we characterize the utility of the recently introduced indicator function Θ̃^S for describing their mesoscale organization and community structure. As working examples for studying these measures, we consider three multiplex network datasets coming from social science.
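
    One simple variant of Multiplex PageRank couples the layers by biasing the teleportation of one layer with the centrality computed on another; the coupling below is an illustrative choice, not necessarily the exact definition used in the paper.

```python
# Sketch of a coupled PageRank across two layers of a multiplex: the
# scores from layer A bias the teleportation vector of layer B. This
# coupling is one simple variant, not necessarily the paper's exact one.
import numpy as np

def pagerank(A: np.ndarray, v: np.ndarray, d: float = 0.85,
             iters: int = 100) -> np.ndarray:
    rowsum = A.sum(axis=1, keepdims=True)
    P = np.divide(A, rowsum, out=np.zeros_like(A, dtype=float),
                  where=rowsum > 0)       # row-stochastic where possible
    x = np.full(len(A), 1 / len(A))
    for _ in range(iters):
        x = d * (x @ P) + (1 - d) * v
    return x / x.sum()

A_layer = np.array([[0, 1, 1], [1, 0, 0], [1, 1, 0]], dtype=float)
B_layer = np.array([[0, 0, 1], [1, 0, 1], [0, 1, 0]], dtype=float)

uniform = np.full(3, 1 / 3)
x_A = pagerank(A_layer, uniform)
x_B = pagerank(B_layer, x_A)   # layer A's centrality biases layer B
print(np.round(x_B, 3))
```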

  9. Management of information in development projects – a proposed integrated model

    Directory of Open Access Journals (Sweden)

    C. Bester

    2008-11-01

    Full Text Available The first section of the article focuses on the need for development in Africa and the specific challenges of development operations. It describes the need for a holistic and integrated information management model as part of the project management body of knowledge aimed at managing the information flow between communities and development project teams. It is argued that information, and access to information, is crucial in development projects and can therefore be seen as a critical success factor in any development project. In the second section of the article, the three information areas of the holistic and integrated information management model are described. In the section thereafter we suggest roles and actions for information managers to facilitate information processes integral to the model. These processes seek to create a developing information community that aligns itself with the development project, and supports and sustains it.

  10. OpenCV-Based Nanomanipulation Information Extraction and the Probe Operation in SEM

    Directory of Open Access Journals (Sweden)

    Dongjie Li

    2015-02-01

    Full Text Available Aimed at an established telenanomanipulation system, methods for extracting location information and strategies for probe operation were studied in this paper. First, machine learning algorithms from OpenCV were used to extract location information from SEM images, so that nanowires and the probe in SEM images can be automatically tracked and the region of interest (ROI) can be marked quickly. The locations of the nanowire and the probe can then be extracted from the ROI. To study the probe operation strategy, the van der Waals force between the probe and a nanowire was computed to obtain the relevant operating parameters. With these operating parameters, the nanowire can be pre-operated in a 3D virtual environment and an optimal path for the probe can be obtained. The actual probe runs automatically under the telenanomanipulation system's control. Finally, experiments were carried out to verify the above methods, and the results show that the designed methods achieve the expected effect.
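
    Locating the probe in successive SEM frames can be sketched with OpenCV template matching; the template and threshold below are placeholders, and the paper's actual learning-based tracker is not reproduced.

```python
# Sketch: locate a probe tip in an SEM frame via template matching and
# mark the ROI. Template and threshold are placeholders; the paper's
# learning-based tracking is not reproduced here.
import cv2
import numpy as np

frame = np.random.default_rng(4).integers(0, 255, (480, 640),
                                          dtype=np.uint8)
template = frame[100:140, 200:240].copy()   # pretend saved probe patch

result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)

if max_val > 0.8:                           # confidence threshold
    x, y = max_loc
    h, w = template.shape
    cv2.rectangle(frame, (x, y), (x + w, y + h), 255, 2)  # mark ROI
    print(f"probe at ({x}, {y}), score {max_val:.2f}")
```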

  11. Microscope-integrated intraoperative optical coherence tomography-guided small-incision lenticule extraction: New surgical technique.

    Science.gov (United States)

    Sharma, Namrata; Urkude, Jayanand; Chaniyara, Manthan; Titiyal, Jeewan S

    2017-10-01

    We describe the surgical technique of microscope-integrated intraoperative optical coherence tomography (OCT)-guided small-incision lenticule extraction. The technique enables manual tracking of surgical instruments and identification of the desired dissection plane. It also helps discern the relation between the dissector and the intrastromal lenticule. The dissection plane becomes hyperreflective on dissection, ensuring complete separation of the intrastromal lenticule from the overlying and underlying stroma. Inadvertent posterior plane entry, cap-lenticule adhesion, incomplete separation of the lenticule, creation of a false plane, and lenticule remnants may be recognized intraoperatively so corrective steps can be taken immediately. In cases with a hazy overlying cap, microscope-integrated intraoperative OCT enables localization and extraction of the lenticule. The technique is helpful for inexperienced surgeons, especially in cases with low amplitudes of refractive errors, ie, thin lenticules. Copyright © 2017 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  12. An integrated process for the extraction of fuel and chemicals from marine macroalgal biomass

    Science.gov (United States)

    Trivedi, Nitin; Baghel, Ravi S.; Bothwell, John; Gupta, Vishal; Reddy, C. R. K.; Lali, Arvind M.; Jha, Bhavanath

    2016-07-01

    We describe an integrated process that can be applied to biomass of the green seaweed, Ulva fasciata, to allow the sequential recovery of four economically important fractions: mineral-rich liquid extract (MRLE), lipid, ulvan, and cellulose. The main benefits of our process are: a) its simplicity and b) the consistent yields obtained from the residual biomass after each successive extraction step. For example, dry Ulva biomass yields ~26% of its starting mass as MRLE, ~3% as lipid, ~25% as ulvan, and ~11% as cellulose, with the enzymatic hydrolysis and fermentation of the final cellulose fraction under optimized conditions producing ethanol at a competitive 0.45 g/g reducing sugar. These yields are comparable to those obtained by direct processing of the individual components from primary biomass. We propose that this integration of ethanol production and chemical feedstock recovery from macroalgal biomass could substantially enhance the sustainability of marine biomass use.
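
    Taking the quoted yields at face value, the cascade can be summarized with a quick mass balance; the sketch below is a back-of-the-envelope illustration only, and the hydrolysis yield of sugar from cellulose is an assumed figure, since the abstract reports only the 0.45 g ethanol per g reducing sugar conversion.

    ```python
    # Mass balance per 1 kg of dry Ulva fasciata biomass, using the
    # yields quoted in the abstract (percent of starting dry mass).
    dry_biomass_g = 1000.0
    yields = {"MRLE": 0.26, "lipid": 0.03, "ulvan": 0.25, "cellulose": 0.11}

    fractions = {name: dry_biomass_g * f for name, f in yields.items()}
    for name, grams in fractions.items():
        print(f"{name}: {grams:.0f} g")

    # Ethanol from the cellulose fraction at the reported 0.45 g ethanol
    # per g reducing sugar. The hydrolysis efficiency (g sugar per g
    # cellulose) is NOT given in the abstract; 0.9 is an assumed value.
    assumed_hydrolysis_yield = 0.9
    sugar_g = fractions["cellulose"] * assumed_hydrolysis_yield
    ethanol_g = sugar_g * 0.45
    print(f"ethanol (assuming {assumed_hydrolysis_yield} g sugar/g cellulose): {ethanol_g:.0f} g")
    ```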

  13. A method for automating the extraction of specialized information from the web

    NARCIS (Netherlands)

    Lin, L.; Liotta, A.; Hippisley, A.; Hao, Y.; Liu, J.; Wang, Y.; Cheung, Y-M.; Yin, H.; Jiao, L.; Ma, j.; Jiao, Y-C.

    2005-01-01

    The World Wide Web can be viewed as a gigantic distributed database comprising millions of interconnected hosts, some of which publish information via web servers or peer-to-peer systems. We present here a novel method for the extraction of semantically rich information from the web in a fully

  14. Classification Method in Integrated Information Network Using Vector Image Comparison

    Directory of Open Access Journals (Sweden)

    Zhou Yuan

    2014-05-01

    Full Text Available A Wireless Integrated Information Network (WMN) consists of integrated information nodes that gather data from their surroundings, such as images and voice. Transmitting this information requires large resources, which decreases the service time of the network. In this paper we present a Classification Approach based on Vector Image Comparison (VIC) for WMN that improves the service time of the network. Methods for sub-region selection and conversion are also proposed.

  15. Integrated prediction based on GIS for sandstone-type uranium deposits in the northwest of Ordos Basin

    International Nuclear Information System (INIS)

    Han Shaoyang; Ke Dan; Hu Shuiqing; Guo Qingyin; Hou Huiqun

    2005-01-01

    The integrated prediction model of sandstone-type uranium deposits, its integrated evaluation methods, and the GIS-based workflow are studied. Software for extracting metallogenic information has also been developed. A multi-source exploration information database was established for the northwest of the Ordos Basin, and an integrated digital prospecting model of sandstone-type uranium deposits was designed based on GIS. The authors have completed GIS-based metallogenic information extraction and integrated evaluation of sandstone-type uranium deposits in the study area. The results prove that GIS-based integrated prediction of sandstone-type uranium deposits can rapidly delineate prospective target areas and improve predictive precision. (authors)

  16. Extraction of land cover change information from ENVISAT-ASAR data in Chengdu Plain

    Science.gov (United States)

    Xu, Wenbo; Fan, Jinlong; Huang, Jianxi; Tian, Yichen; Zhang, Yong

    2006-10-01

    Land cover data are essential to most global change research objectives, including the assessment of current environmental conditions and the simulation of future environmental scenarios that ultimately lead to public policy development. The Chinese Academy of Sciences generated a nationwide land cover database in order to quantify and spatially characterize land use/cover changes (LUCC) in the 1990s. To maintain the reliability of the database, it must be updated regularly, but it is difficult to obtain remote sensing data for extracting land cover change information at large scale. In particular, optical remote sensing data are hard to acquire over the Chengdu Plain, so the objective of this research was to evaluate multitemporal ENVISAT advanced synthetic aperture radar (ASAR) data for extracting land cover change information. Based on fieldwork and the nationwide 1:100000 land cover database, the paper assesses several land cover changes in the Chengdu Plain, for example: crop to buildings, forest to buildings, and forest to bare land. The results show that ENVISAT ASAR data have great potential for extracting land cover change information.
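
    As a concrete, if generic, illustration of how change information is commonly extracted from two co-registered SAR acquisitions, the sketch below applies a log-ratio operator followed by thresholding to synthetic data; this is a textbook baseline, not the classification scheme actually used in the study.

    ```python
    import numpy as np

    def log_ratio_change_map(img_t1, img_t2, threshold=1.0):
        """Generic log-ratio change detection for two co-registered,
        speckle-filtered SAR intensity images (NumPy arrays)."""
        eps = 1e-6  # avoid division by zero in dark pixels
        ratio = np.log((img_t2 + eps) / (img_t1 + eps))
        # Flag pixels whose backscatter changed strongly in either direction.
        return np.abs(ratio) > threshold

    # Example with synthetic data standing in for two ASAR scenes.
    rng = np.random.default_rng(0)
    t1 = rng.gamma(shape=4.0, scale=25.0, size=(100, 100))
    t2 = t1.copy()
    t2[40:60, 40:60] *= 5.0  # simulated land cover change
    changed = log_ratio_change_map(t1, t2)
    print("changed pixels:", int(changed.sum()))
    ```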

  17. A RuleML Study on Integrating Geographical and Health Information

    DEFF Research Database (Denmark)

    Gao, Sheng; Mioc, Darka; Boley, Harold

    2008-01-01

    To facilitate health surveillance, flexible ways to represent, integrate, and deduce health information become increasingly important. In this paper, an ontology is used to support the semantic definition of spatial, temporal and thematic factors of health information. The ontology is realized as an interchangeable RuleML knowledge base, consisting of facts and rules. Rules are also used for integrating geographical and health information. The implemented eHealthGeo system uses the OO jDREW reasoning engine to deduce implicit information such as spatial relationships. The system combines this with spatial...

  18. Research on monitoring and management information integration technique in waste treatment and management

    International Nuclear Information System (INIS)

    Kong Jinsong; Yu Ren; Mao Wei

    2013-01-01

    Integrating waste treatment process and device status monitoring information with management information is a key problem to be solved in the information integration of waste treatment and management. The main content of monitoring and management information integration is discussed in the paper. Data exchange techniques based on OPC, FTP and data push technology are applied to the different monitoring systems, according to their development platforms, to realize the integration of waste treatment process and device status monitoring information with management information in a waste treatment center. (authors)

  19. The Effect of Information Security Management on Organizational Processes Integration in Supply Chain

    Directory of Open Access Journals (Sweden)

    Mohsen Shafiei Nikabadi

    2012-03-01

    Full Text Available The major purpose of this article was to examine how information security management affects supply chain integration, and the effect of implementing an "information security management system" on enhancing supply chain integration. In this respect, the research sought a combined overview of these two approaches (information security management, and the integration of organizational processes through an enterprise resource planning system) and then determined the factors of these two issues by factor analysis. The researchers gathered comments from automotive experts (production planning, management, and supply chain experts) at car makers and at first- and second-tier suppliers in the supply chain. The impact of information security management on enterprise supply chain process integration was then examined with the help of statistical correlation analysis. The results indicated that various dimensions of an "information security management system" (coordination of information, prevention of human and hardware errors, accuracy of information, and user education) affect both the internal and external integration of business processes in the supply chain and can increase that integration. Owing to these results, deployment of an "information security management system" increases the integration of organizational processes in the supply chain, as demonstrated by the relation of organizational process integration to the level of information coordination, error prevention, and information accuracy throughout the supply chain.

  20. Obtaining bixin from semi-defatted annatto seeds by a mechanical method and solvent extraction: Process integration and economic evaluation.

    Science.gov (United States)

    Alcázar-Alay, Sylvia C; Osorio-Tobón, J Felipe; Forster-Carneiro, Tânia; Meireles, M Angela A

    2017-09-01

    This work involves the application of physical separation methods to concentrate the pigment of semi-defatted annatto seeds, a noble vegetal biomass rich in bixin pigments. Semi-defatted annatto seeds are the residue produced after extraction of the lipid fraction from annatto seeds using supercritical fluid extraction (SFE). Semi-defatted annatto seeds are used in this work for three important reasons: i) prior lipid extraction is necessary to recover the tocotrienol-rich oil present in the annatto seeds, ii) initial removal of the oil via the SFE process favors bixin separation, and iii) the cost of the raw material is null. Physical methods including i) the mechanical fractionation method and ii) an integrated process of mechanical fractionation and low-pressure solvent extraction (LPSE) were studied. The integrated process was proposed for processing two different semi-defatted annatto materials, denoted Batches 1 and 2. The cost of manufacture (COM) was calculated for two different production scales (5 and 50 L), considering the integrated process vs. the mechanical fractionation method alone. The integrated process showed a significantly higher COM than the mechanical fractionation method. This work suggests that the mechanical fractionation method is an adequate, low-cost process for obtaining a pigment-rich product from semi-defatted annatto seeds. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Information Security and Integrity Systems

    Science.gov (United States)

    1990-01-01

    Viewgraphs from the Information Security and Integrity Systems seminar held at the University of Houston-Clear Lake on May 15-16, 1990 are presented. A tutorial on computer security is presented. The goals of this tutorial are the following: to review security requirements imposed by government and by common sense; to examine risk analysis methods to help keep sight of forest while in trees; to discuss the current hot topic of viruses (which will stay hot); to examine network security, now and in the next year to 30 years; to give a brief overview of encryption; to review protection methods in operating systems; to review database security problems; to review the Trusted Computer System Evaluation Criteria (Orange Book); to comment on formal verification methods; to consider new approaches (like intrusion detection and biometrics); to review the old, low tech, and still good solutions; and to give pointers to the literature and to where to get help. Other topics covered include security in software applications and development; risk management; trust: formal methods and associated techniques; secure distributed operating system and verification; trusted Ada; a conceptual model for supporting a B3+ dynamic multilevel security and integrity in the Ada runtime environment; and information intelligence sciences.

  2. Integrated Reporting and Assurance of Sustainability Information: An Experimental Study on Professional Investors’ Information Processing

    NARCIS (Netherlands)

    Reimsbach, D.; Hahn, R.; Gürtürk, A.

    2018-01-01

    Sustainability-related non-financial information is increasingly deemed value relevant. Against this background, two recent trends in non-financial reporting are frequently discussed: integrated reporting and assurance of sustainability information. Using an established framework of information

  3. A Comparative Analysis of Extract, Transformation and Loading (ETL) Process

    Science.gov (United States)

    Runtuwene, J. P. A.; Tangkawarow, I. R. H. T.; Manoppo, C. T. M.; Salaki, R. J.

    2018-02-01

    Data and information currently grow rapidly in volume and across media. This development eventually produces very large data collections, better known as Big Data. Business Intelligence (BI) utilizes large amounts of data and information for analysis so that important insights can be obtained. This type of information can be used to support the decision-making process. In practice, a process integrating existing data and information into a data warehouse is needed. This data integration process is known as Extract, Transformation and Loading (ETL). Many applications have been developed to carry out the ETL process, but selecting the application that is most effective and efficient in terms of time, cost and effort can be a challenge. Therefore, the objective of the study was to provide a comparative analysis of the ETL process using Microsoft SQL Server Integration Services (SSIS) and Pentaho Data Integration (PDI).
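
    For readers unfamiliar with the pattern being compared across SSIS and PDI, a toy ETL job fits in a few lines of Python; the CSV layout, file names and exchange rates below are invented for the example.

    ```python
    import csv
    import sqlite3

    # Extract: read raw rows from a hypothetical source file.
    with open("sales.csv", newline="") as f:
        rows = list(csv.DictReader(f))  # columns: id, amount, currency

    # Transform: normalize amounts to a single currency (toy rates).
    rates = {"USD": 1.0, "EUR": 1.1}
    for row in rows:
        row["amount_usd"] = float(row["amount"]) * rates[row["currency"]]

    # Load: write the cleaned records into a warehouse table.
    con = sqlite3.connect("warehouse.db")
    con.execute("CREATE TABLE IF NOT EXISTS sales (id TEXT, amount_usd REAL)")
    con.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        [(r["id"], r["amount_usd"]) for r in rows],
    )
    con.commit()
    con.close()
    ```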

  4. Understanding Information Systems Integration Deficiencies in Mergers and Acquisitions

    DEFF Research Database (Denmark)

    Henningsson, Stefan; Kettinger, William J.

    2017-01-01

    Information systems (IS) integration is a critical challenge for value-creating mergers and acquisitions. Appropriate design and implementation of IS integration is typically a precondition for enabling a majority of the anticipated business benefits of a combined organization. Often...

  5. Extraction of Graph Information Based on Image Contents and the Use of Ontology

    Science.gov (United States)

    Kanjanawattana, Sarunya; Kimura, Masaomi

    2016-01-01

    A graph is an effective form of data representation used to summarize complex information. Explicit information such as the relationship between the X- and Y-axes can be easily extracted from a graph by applying human intelligence. However, implicit knowledge such as information obtained from other related concepts in an ontology also resides in…

  6. Integrating Semantic Information in Metadata Descriptions for a Geoscience-wide Resource Inventory.

    Science.gov (United States)

    Zaslavsky, I.; Richard, S. M.; Gupta, A.; Valentine, D.; Whitenack, T.; Ozyurt, I. B.; Grethe, J. S.; Schachne, A.

    2016-12-01

    Integrating semantic information into legacy metadata catalogs is a challenging issue and so far has been mostly done on a limited scale. We present experience of CINERGI (Community Inventory of Earthcube Resources for Geoscience Interoperability), an NSF Earthcube Building Block project, in creating a large cross-disciplinary catalog of geoscience information resources to enable cross-domain discovery. The project developed a pipeline for automatically augmenting resource metadata, in particular generating keywords that describe metadata documents harvested from multiple geoscience information repositories or contributed by geoscientists through various channels including surveys and domain resource inventories. The pipeline examines available metadata descriptions using text parsing, vocabulary management and semantic annotation and graph navigation services of GeoSciGraph. GeoSciGraph, in turn, relies on a large cross-domain ontology of geoscience terms, which bridges several independently developed ontologies or taxonomies including SWEET, ENVO, YAGO, GeoSciML, GCMD, SWO, and CHEBI. The ontology content enables automatic extraction of keywords reflecting science domains, equipment used, geospatial features, measured properties, methods, processes, etc. We specifically focus on issues of cross-domain geoscience ontology creation, resolving several types of semantic conflicts among component ontologies or vocabularies, and constructing and managing facets for improved data discovery and navigation. The ontology and keyword generation rules are iteratively improved as pipeline results are presented to data managers for selective manual curation via a CINERGI Annotator user interface. We present lessons learned from applying CINERGI metadata augmentation pipeline to a number of federal agency and academic data registries, in the context of several use cases that require data discovery and integration across multiple earth science data catalogs of varying quality
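
    The heart of such a pipeline, matching metadata text against a curated vocabulary and attaching the hits as faceted keywords, can be caricatured in a few lines; the vocabulary and facet labels below are invented for illustration, whereas the real system delegates this to GeoSciGraph's ontology services.

    ```python
    import re

    # Toy ontology: term -> facet, standing in for GeoSciGraph lookups.
    VOCABULARY = {
        "radar": "equipment",
        "precipitation": "measured property",
        "watershed": "geospatial feature",
        "hydrology": "science domain",
    }

    def augment_keywords(metadata_text: str) -> dict:
        """Return facet -> matched terms for one metadata record."""
        tokens = set(re.findall(r"[a-z]+", metadata_text.lower()))
        keywords = {}
        for term, facet in VOCABULARY.items():
            if term in tokens:
                keywords.setdefault(facet, []).append(term)
        return keywords

    record = "Hourly precipitation measured by radar across the watershed."
    print(augment_keywords(record))
    ```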

  7. [Development method of healthcare information system integration based on business collaboration model].

    Science.gov (United States)

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is the key to hospital information construction, owing to the complexity of the healthcare environment. Currently, during healthcare information system integration, the people participating in an integration project usually communicate via free-format documents, which impairs the efficiency and adaptability of integration. This paper proposes a method that uses business process model and notation (BPMN) to model integration requirements and automatically transforms them into executable integration configurations. Based on the method, a tool was developed to model integration requirements and transform them into integration configurations. An integration case in a radiology scenario was used to verify the method.

  8. Modeling decisions information fusion and aggregation operators

    CERN Document Server

    Torra, Vicenc

    2007-01-01

    Information fusion techniques and aggregation operators produce the most comprehensive, specific datum about an entity using data supplied from different sources, thus enabling us to reduce noise, increase accuracy, summarize and extract information, and make decisions. These techniques are applied in fields such as economics, biology and education, while in computer science they are particularly used in fields such as knowledge-based systems, robotics, and data mining. This book covers the underlying science and application issues related to aggregation operators, focusing on tools used in practical applications that involve numerical information. Starting with detailed introductions to information fusion and integration, measurement and probability theory, fuzzy sets, and functional equations, the authors then cover the following topics in detail: synthesis of judgements, fuzzy measures, weighted means and fuzzy integrals, indices and evaluation methods, model selection, and parameter extraction. The method...

  9. Integrated Information System for Higher Education Qualifications

    Directory of Open Access Journals (Sweden)

    Catalin Ionut SILVESTRU

    2012-10-01

    Full Text Available In the present article we aim to study in detail aspects related to architectures specific to e-learning and to the management of human resources training, interconnected with the management of qualifications. In addition, we consider combining e-learning architectures with software into an e-learning system interconnected with the National Registry of Qualifications of Higher Education, with a view to developing an information system that correlates the educational supply of higher education in Romania with labor market demands through qualifications. The scientific endeavor consists of original architectural solutions that integrate data, systems, processes and services from various sources and use them in the proposed system. The practical result of this endeavor is the design of the architectures required for developing an e-learning system interconnected with the National Registry of Qualifications of Romania, which in a first stage involves the qualifications provided by higher education. The innovation of the proposed solution is that the information system combines the advantages of a content management system (CMS) with those of a learning content management system (LCMS) and reusable learning objects (RLO). Thus, the architecture proposed in the research ensures the integration of a content management system with a portal for information, guidance and support in developing a professional project. The integration enables the correlation of competences with content areas and specific items from various teaching subjects, thus evaluating the usefulness of this registry from a learning/educational perspective. Using the proposed information system enables correlation among qualifications, the content of educational programs and continuous self-evaluation opportunities, which facilitates the monitoring of progress and the adjustment of learning content.

  10. Visualization and Integrated Data Mining of Disparate Information

    Energy Technology Data Exchange (ETDEWEB)

    Saffer, Jeffrey D.(OMNIVIZ, INC); Albright, Cory L.(BATTELLE (PACIFIC NW LAB)); Calapristi, Augustin J.(BATTELLE (PACIFIC NW LAB)); Chen, Guang (OMNIVIZ, INC); Crow, Vernon L.(BATTELLE (PACIFIC NW LAB)); Decker, Scott D.(BATTELLE (PACIFIC NW LAB)); Groch, Kevin M.(BATTELLE (PACIFIC NW LAB)); Havre, Susan L.(BATTELLE (PACIFIC NW LAB)); Malard, Joel (BATTELLE (PACIFIC NW LAB)); Martin, Tonya J.(BATTELLE (PACIFIC NW LAB)); Miller, Nancy E.(BATTELLE (PACIFIC NW LAB)); Monroe, Philip J.(OMNIVIZ, INC); Nowell, Lucy T.(BATTELLE (PACIFIC NW LAB)); Payne, Deborah A.(BATTELLE (PACIFIC NW LAB)); Reyes Spindola, Jorge F.(BATTELLE (PACIFIC NW LAB)); Scarberry, Randall E.(OMNIVIZ, INC); Sofia, Heidi J.(BATTELLE (PACIFIC NW LAB)); Stillwell, Lisa C.(OMNIVIZ, INC); Thomas, Gregory S.(BATTELLE (PACIFIC NW LAB)); Thurston, Sarah J.(OMNIVIZ, INC); Williams, Leigh K.(BATTELLE (PACIFIC NW LAB)); Zabriskie, Sean J.(OMNIVIZ, INC); MG Hicks

    2001-05-11

    The volumes and diversity of information in the discovery, development, and business processes within the chemical and life sciences industries require new approaches for analysis. Traditional list- or spreadsheet-based methods are easily overwhelmed by large amounts of data. Furthermore, generating strong hypotheses and, just as importantly, ruling out weak ones, requires integration across different experimental and informational sources. We have developed a framework for this integration, including common conceptual data models for multiple data types and linked visualizations that provide an overview of the entire data set, a measure of how each data record is related to every other record, and an assessment of the associations within the data set.

  11. Testing can counteract proactive interference by integrating competing information

    Science.gov (United States)

    Wahlheim, Christopher N.

    2015-01-01

    Testing initially learned information before presenting new information has been shown to counteract the deleterious effects of proactive interference by segregating competing sources of information. The present experiments were conducted to demonstrate that testing can also have its effects in part by integrating competing information. Variations of classic A–B, A–D paired-associate learning paradigms were employed that included two lists of word pairs and a cued-recall test. Repeated pairs appeared in both lists (A–B, A–B), control pairs appeared in List 2 only (A–B, C–D), and changed pairs appeared with the same cue in both lists but with different responses (A–B, A–D). The critical manipulation was whether pairs were tested or restudied in an interpolated phase that occurred between Lists 1 and 2. On a final cued-recall test, participants recalled List 2 responses and then indicated when they recollected that responses had earlier changed between lists. The change recollection measure indexed the extent to which competing responses were integrated during List 2. Change was recollected more often for tested than for restudied pairs. Proactive facilitation was obtained in cued recall when change was recollected, whereas proactive interference was obtained when change was not recollected. These results provide evidence that testing counteracted proactive interference in part by making List 1 responses more accessible during List 2, thus promoting integration and increasing later recollection of change. These results have theoretical implications because they show that testing can counteract proactive interference by integrating or segregating competing information. PMID:25120241

  13. Integrating information systems : linking global business goals to local database applications

    NARCIS (Netherlands)

    Dignum, F.P.M.; Houben, G.J.P.M.

    1999-01-01

    This paper describes a new approach to designing modern information systems that offer integrated access to the data and knowledge available in local applications. By integrating the local data management activities into one transparent information distribution process, modern organizations

  14. Information-integration category learning and the human uncertainty response.

    Science.gov (United States)

    Paul, Erick J; Boomer, Joseph; Smith, J David; Ashby, F Gregory

    2011-04-01

    The human response to uncertainty has been well studied in tasks requiring attention and declarative memory systems. However, uncertainty monitoring and control have not been studied in multi-dimensional, information-integration categorization tasks that rely on non-declarative procedural memory. Three experiments are described that investigated the human uncertainty response in such tasks. Experiment 1 showed that following standard categorization training, uncertainty responding was similar in information-integration tasks and rule-based tasks requiring declarative memory. In Experiment 2, however, uncertainty responding in untrained information-integration tasks impaired the ability of many participants to master those tasks. Finally, Experiment 3 showed that the deficit observed in Experiment 2 was not because of the uncertainty response option per se, but rather because the uncertainty response provided participants a mechanism via which to eliminate stimuli that were inconsistent with a simple declarative response strategy. These results are considered in the light of recent models of category learning and metacognition.

  15. Developing an Approach to Prioritize River Restoration using Data Extracted from Flood Risk Information System Databases.

    Science.gov (United States)

    Vimal, S.; Tarboton, D. G.; Band, L. E.; Duncan, J. M.; Lovette, J. P.; Corzo, G.; Miles, B.

    2015-12-01

    Prioritizing river restoration requires information on river geometry. In many US states, detailed river geometry has been collected for floodplain mapping and is available in Flood Risk Information Systems (FRIS). In particular, North Carolina has developed, for its 100 counties, a database of numerous HEC-RAS models which are available through its Flood Risk Information System (FRIS). These models, which include over 260 variables, were developed and updated by numerous contractors. They contain detailed surveyed or LiDAR-derived cross-sections and modeled flood extents for different extreme-event return periods. In this work, data from over 4700 HEC-RAS models were integrated and upscaled to utilize detailed cross-section information and 100-year modeled flood extent information, enabling river restoration prioritization for the entire state of North Carolina. We developed procedures to extract geomorphic properties such as entrenchment ratio and incision ratio from these models. The entrenchment ratio quantifies the vertical containment of rivers, and thereby their vulnerability to flooding, and the incision ratio quantifies depth per unit width. A map of entrenchment ratio for the whole state was derived by linking these model results to a geodatabase. A ranking of highly entrenched counties, enabling prioritization for flood allowance and mitigation, was obtained. The results were shared through HydroShare, and web maps were developed for their visualization using the Google Maps Engine API.
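
    The entrenchment ratio is conventionally defined (after Rosgen) as the flood-prone width divided by the bankfull width, with the flood-prone elevation taken at twice the maximum bankfull depth above the channel bed. The sketch below shows one plausible way to compute it from a cross-section stored as station/elevation pairs; the sample cross-section and the coarse width estimate are illustrative, not data extracted from the FRIS models.

    ```python
    import numpy as np

    def width_at_elevation(stations, elevations, water_level):
        """Total wetted width of a cross-section at a given water level."""
        stations = np.asarray(stations, dtype=float)
        elevations = np.asarray(elevations, dtype=float)
        wet = elevations < water_level
        # Sum station spacing over segments submerged at both ends
        # (a deliberately coarse estimate).
        dx = np.diff(stations)
        return float(np.sum(dx[wet[:-1] & wet[1:]]))

    # Hypothetical surveyed cross-section (station in m, elevation in m).
    sta = [0, 5, 10, 15, 20, 25, 30, 35, 40]
    elev = [12, 10, 8, 6, 6, 7, 9, 11, 12]

    bankfull_elev = 9.0
    # Rosgen convention: flood-prone elevation at twice max bankfull depth.
    flood_prone_elev = min(elev) + 2.0 * (bankfull_elev - min(elev))

    w_bankfull = width_at_elevation(sta, elev, bankfull_elev)
    w_flood_prone = width_at_elevation(sta, elev, flood_prone_elev)
    print("entrenchment ratio:", w_flood_prone / w_bankfull)
    ```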

  16. QUANTITATIVE СHARACTERISTICS OF COMPLEMENTARY INTEGRATED HEALTH CARE SYSTEM AND INTEGRATED MEDICATION MANAGEMENT INFORMATION SYSTEM

    Directory of Open Access Journals (Sweden)

    L. Yu. Babintseva

    2015-05-01

    important elements of state regulation of the pharmaceutical sector of health care. For the first time, the creation of two information systems, an integrated medication management information system and an integrated health care system, in an integrated medical information area, operating based on the principle of complementarity, was justified. Global and technological coefficients of these systems' functioning were introduced.

  17. THE IMPORTANCE OF THE IMPLEMENTATION OF INTEGRATED INFORMATION SYSTEMS IN THE RESTRUCTURING AND EUROPEAN INTEGRATION PROCESS OF ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Steliac Nela

    2010-12-01

    Full Text Available Many organizations in the public and private sectors in Romania have reached the stage in which their existing information systems can no longer meet users' requests. They are therefore compelled to use integrated information systems that can handle all kinds of data, allow access to them, and ensure the coherence and consistency of the stored information. Managers must be aware of the importance of implementing integrated information systems in the underlying restructuring of the organization, which can thus become consistent and competitive with its European Union counterparts, making the integration process real and possible.

  18. Extraction of temporal information in functional MRI

    Science.gov (United States)

    Singh, M.; Sungkarat, W.; Jeong, Jeong-Won; Zhou, Yongxia

    2002-10-01

    The temporal resolution of functional MRI (fMRI) is limited by the shape of the haemodynamic response function (hrf) and the vascular architecture underlying the activated regions. Typically, the temporal resolution of fMRI is on the order of 1 s. We have developed a new data processing approach to extract temporal information on a pixel-by-pixel basis at the level of 100 ms from fMRI data. Instead of correlating or fitting the time-course of each pixel to a single reference function, which is the common practice in fMRI, we correlate each pixel's time-course to a series of reference functions that are shifted with respect to each other by 100 ms. The reference function yielding the highest correlation coefficient for a pixel is then used as a time marker for that pixel. A Monte Carlo simulation and experimental study of this approach were performed to estimate the temporal resolution as a function of signal-to-noise ratio (SNR) in the time-course of a pixel. Assuming a known and stationary hrf, the simulation and experimental studies suggest a lower limit in the temporal resolution of approximately 100 ms at an SNR of 3. The multireference function approach was also applied to extract timing information from an event-related motor movement study where the subjects flexed a finger on cue. The event was repeated 19 times with the event's presentation staggered to yield an approximately 100-ms temporal sampling of the haemodynamic response over the entire presentation cycle. The timing differences among different regions of the brain activated by the motor task were clearly visualized and quantified by this method. The results suggest that it is possible to achieve a temporal resolution of ~200 ms in practice with this approach.
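
    The essence of the multireference approach is to correlate each pixel's time-course against a bank of time-shifted reference functions spaced 100 ms apart and to take the best-matching shift as that pixel's time marker. A minimal NumPy sketch of the idea follows, with a toy Gaussian standing in for the haemodynamic response function.

    ```python
    import numpy as np

    def best_shift(timecourse, t, hrf, shifts):
        """Return the reference shift (s) maximizing correlation with the pixel."""
        best, best_r = None, -np.inf
        for s in shifts:
            ref = hrf(t - s)
            r = np.corrcoef(timecourse, ref)[0, 1]
            if r > best_r:
                best, best_r = s, r
        return best, best_r

    # Toy haemodynamic response: a Gaussian bump (illustrative only).
    hrf = lambda t: np.exp(-((t - 5.0) ** 2) / 2.0)

    t = np.arange(0, 20, 0.1)                   # 100-ms sampling grid
    true_delay = 0.3
    rng = np.random.default_rng(1)
    pixel = hrf(t - true_delay) + rng.normal(0, 0.3, t.size)  # SNR ~ 3

    shifts = np.arange(-1.0, 1.01, 0.1)          # references 100 ms apart
    print(best_shift(pixel, t, hrf, shifts))     # expect a shift near 0.3 s
    ```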

  19. Towards a Unified Approach to Information Integration - A review paper on data/information fusion

    Energy Technology Data Exchange (ETDEWEB)

    Whitney, Paul D.; Posse, Christian; Lei, Xingye C.

    2005-10-14

    Information or data fusion, the integration of data from different sources, is ubiquitous in many applications, from epidemiology, medicine, biology, politics and intelligence to military applications. Data fusion involves the integration of spectral, imaging, text and many other sensor data. For example, in epidemiology, information is often obtained from many studies conducted by different researchers in different regions with different protocols. In the medical field, the diagnosis of a disease is often based on imaging (MRI, X-ray, CT), clinical examination and lab results. In the biological field, information is obtained from studies conducted on many different species. In the military field, information is obtained from radar sensors, text messages, chemical and biological sensors, acoustic sensors, optical warning systems and many other sources. The methodologies used in the data integration process range from classical and Bayesian approaches to evidence-based expert systems. Implementations of data fusion range from pure software designs to mixtures of software and hardware. In this review we summarize the methodologies and implementations of the data fusion process and illustrate in more detail the methodologies involved in three examples. We propose a unified multi-stage and multi-path mapping approach to the data fusion process, and point out future prospects and challenges.
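
    As one concrete formalism among those surveyed, Bayesian fusion of conditionally independent sources d_1, ..., d_n bearing on a common state combines the source likelihoods multiplicatively with the prior:

    ```latex
    p(\theta \mid d_1, \dots, d_n) \;\propto\; p(\theta) \prod_{i=1}^{n} p(d_i \mid \theta)
    ```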

  20. Double path-integral migration velocity analysis: a real data example

    International Nuclear Information System (INIS)

    Costa, Jessé C; Schleicher, Jörg

    2011-01-01

    Path-integral imaging forms an image with no knowledge of the velocity model by summing over the migrated images obtained for a set of migration velocity models. Double path-integral imaging migration extracts the stationary velocities, i.e. those velocities at which common-image gathers align horizontally, as a byproduct. An application of the technique to a real data set demonstrates that quantitative information about the time migration velocity model can be determined by double path-integral migration velocity analysis. Migrated images using interpolations with different regularizations of the extracted velocities prove the high quality of the resulting time-migration velocity information. The so-obtained velocity model can then be used as a starting model for subsequent velocity analysis tools like migration tomography or other tomographic methods
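
    Schematically, the two path integrals can be written as weighted stacks of migrated images over trial velocities, with the stationary velocity recovered from their ratio; the notation below is a generic rendering of the idea, not the exact formulas of the cited paper.

    ```latex
    I(\xi) = \int W(v, \xi)\, I(\xi; v)\, \mathrm{d}v ,
    \qquad
    v^{*}(\xi) = \frac{\int v\, W(v, \xi)\, I(\xi; v)\, \mathrm{d}v}
                      {\int W(v, \xi)\, I(\xi; v)\, \mathrm{d}v}
    ```

    Here I(ξ; v) denotes the image at point ξ migrated with trial velocity v, and W is a weight concentrating the stack near coherent contributions.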

  1. Integrating SAP to Information Systems Curriculum: Design and Delivery

    Science.gov (United States)

    Wang, Ming

    2011-01-01

    Information Systems (IS) education is being transformed from segmented applications toward integrated, enterprise-wide system software: Enterprise Resource Planning (ERP). ERP is a platform that integrates all business functions through a centralized data repository shared by all business operations in the enterprise. This tremendous…

  2. A Concept Lattice for Semantic Integration of Geo-Ontologies Based on Weight of Inclusion Degree Importance and Information Entropy

    Directory of Open Access Journals (Sweden)

    Jia Xiao

    2016-11-01

    Full Text Available Constructing a merged concept lattice with formal concept analysis (FCA) is an important research direction in the field of integrating multi-source geo-ontologies. Extracting essential geographical properties and reducing the concept lattice are two key points of previous research. A formal integration method is proposed to address the challenges in these two areas. We first extract essential properties from multi-source geo-ontologies and use FCA to build a merged formal context. Second, the combined importance weight of each single attribute of the formal context is calculated by introducing the inclusion degree importance from rough set theory and information entropy; a weighted formal context is then built from the merged formal context. Third, a combined weighted concept lattice is established from the weighted formal context with FCA, and the importance weight of every concept is defined as the sum of the weights of the attributes belonging to the concept's intent. Finally, the semantic granularity of a concept is defined by its importance weight; we then gradually reduce the weighted concept lattice by setting diminishing thresholds of semantic granularity. All of the reduced lattices are organized into a regular hierarchical structure based on the threshold of semantic granularity. A workflow is designed to demonstrate this procedure. A case study is conducted to show the feasibility and validity of this method and of the procedure for integrating multi-source geo-ontologies.
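
    As a loose illustration of attribute weighting in a formal context, the sketch below combines a support-based importance score with the Shannon entropy of each attribute column; the paper's actual inclusion-degree-importance formula from rough set theory is more involved, so treat this as a schematic stand-in.

    ```python
    import numpy as np

    def attribute_weights(context: np.ndarray) -> np.ndarray:
        """context: binary objects-by-attributes matrix of a formal context."""
        n_objects = context.shape[0]
        p = context.sum(axis=0) / n_objects          # support of each attribute

        # Shannon entropy of each binary attribute column.
        q = np.clip(p, 1e-12, 1 - 1e-12)
        entropy = -(q * np.log2(q) + (1 - q) * np.log2(1 - q))

        # Combined weight: support-based importance times entropy,
        # normalized to sum to 1 (a schematic stand-in for the paper's
        # inclusion-degree-importance formula).
        w = p * entropy
        return w / w.sum()

    ctx = np.array([[1, 0, 1, 1],
                    [1, 1, 0, 1],
                    [0, 1, 1, 1]])
    print(attribute_weights(ctx))
    ```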

  3. Integrated system of production information processing for surface mines

    Energy Technology Data Exchange (ETDEWEB)

    Li, K.; Wang, S.; Zeng, Z.; Wei, J.; Ren, Z. [China University of Mining and Technology, Xuzhou (China). Dept of Mining Engineering

    2000-09-01

    Based on the concepts of geological statistics, mathematical programming, conditional simulation and systems engineering, and on the features and duties of each main department in surface mine production, an integrated system for surface mine production information was systematically studied and developed using data warehousing, CAD, object-oriented and system integration technology, leading to systematized and automated information management, data processing, optimization computing and plotting. In this paper, its overall objective, system design, structure and functions, and some key techniques are described. 2 refs., 3 figs.

  4. INFORMATION EXTRACTION IN TOMB PIT USING HYPERSPECTRAL DATA

    Directory of Open Access Journals (Sweden)

    X. Yang

    2018-04-01

    Full Text Available Hyperspectral data is characterized by many continuous bands, large data volume, redundancy, and non-destructive acquisition. These characteristics make it possible to use hyperspectral data to study cultural relics. In this paper, hyperspectral imaging technology is adopted to recognize the images on the bottom of an ancient tomb located in Shanxi Province. There are many black remains on the bottom surface of the tomb, which are suspected to be meaningful texts or paintings. First, the hyperspectral data is preprocessed to obtain the reflectance of the region of interest. For convenience of computation and storage, the original reflectance values are multiplied by 10000. Second, this article uses three methods to extract the symbols at the bottom of the ancient tomb. Finally, we tried to use morphology to connect the symbols and produced fifteen reference images. The results show that information extraction based on hyperspectral data provides a better visual experience, which is beneficial to researchers studying ancient tombs and provides references for archaeological research findings.
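
    The preprocessing described, scaling reflectance by 10000 for integer storage and then isolating dark residues in an informative band, can be pictured with a small NumPy sketch; the synthetic cube, band index and cutoff are invented for the example.

    ```python
    import numpy as np

    # Hypothetical reflectance cube: rows x cols x bands, values in [0, 1].
    rng = np.random.default_rng(42)
    reflectance = rng.uniform(0.2, 0.9, size=(64, 64, 50))
    reflectance[20:30, 20:40, :] *= 0.3  # dark "ink" residue

    # Scale by 10000 and store as integers, as described in the abstract.
    cube = (reflectance * 10000).astype(np.int32)

    # Isolate dark symbols in one informative band by simple thresholding
    # (one of several possible extraction methods; band index and cutoff
    # are illustrative assumptions).
    band = cube[:, :, 25]
    symbols = band < 3000
    print("candidate symbol pixels:", int(symbols.sum()))
    ```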

  6. The integration of Information and Communication Technology into nursing.

    Science.gov (United States)

    Lupiáñez-Villanueva, Francisco; Hardey, Michael; Torrent, Joan; Ficapal, Pilar

    2011-02-01

    To identify and characterise different profiles of nurses' utilization of Information and Communication Technology (ICT) and the Internet, and to identify factors that can enhance or inhibit the use of these technologies within nursing. An online survey of the 13,588 members of the Nurses Association of Barcelona who had a registered email account in 2006 was carried out. Factor analysis, cluster analysis and binomial logit modelling were undertaken. Although most of the nurses (76.70%) use the Internet in their daily work, multivariate statistical analysis revealed two profiles of ICT adoption. The first profile (4.58%) represents those nurses who value ICT and the Internet such that they form an integral part of their practice. This group is thus referred to as 'integrated nurses'. The second profile (95.42%) represents those nurses who place less emphasis on ICT and the Internet and are consequently labelled 'non-integrated nurses'. From the statistical modelling, it was observed that undertaking research activities, an emphasis on international information, and a belief that health information available on the Internet is 'very relevant' play a positive and significant role in the probability of being an integrated nurse. The emerging world of the 'integrated nurse' cannot be adequately understood without examining how nurses make use of ICT and the Internet within nursing practice and the way this is shaped by institutional, technical and professional opportunities and constraints. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  7. Information Technology Integration in Higher Education: A Novel Approach for Impact Assessment

    Directory of Open Access Journals (Sweden)

    Abdulkareem Al-Alwani

    2014-12-01

    Full Text Available In the current technological world of information services, academic systems are also adopting information technology solutions. Information systems vary across applications, and in the academic domain specifically, a range of information systems is available to institutions worldwide. Integration of e-learning can optimize the implementation of computer-based and computer-assisted educational processes at all levels. It is therefore imperative to assess and evaluate the integration of these information systems, because they have a serious impact on e-learning processes. In this study, a survey instrument is presented for evaluating the integration of information technology systems and practices in an educational environment. The survey is constructed from descriptive questions related to information technology tools, to assess the qualitative impact and usage of such tools. Critical feedback, analysis and suggestions from 25 educationists played a pivotal role in finalizing the proposed survey questionnaire. A subsequent test evaluation by teachers and students was also carried out to assess the utilization of information systems at Yanbu University College. The results showed that feedback from this survey can help identify technological gaps and facilitate effective integration of information technology in an educational environment. The survey instrument proposed in this research can greatly enhance the integration of IT tools, as it identifies shortcomings by collecting statistical data from the feedback of both faculty and students. Solutions to these problems are deterministic and can easily be implemented to optimize the overall performance of e-learning systems.

  8. Feature Fusion Based Road Extraction for HJ-1-C SAR Image

    Directory of Open Access Journals (Sweden)

    Lu Ping-ping

    2014-06-01

    Full Text Available Road network extraction from SAR images is a key task for military and civilian applications. To address road extraction from HJ-1-C SAR images, a road extraction algorithm based on the integration of ratio and directional information is proposed. Because HJ-1-C SAR images have a characteristically narrow dynamic range and low signal-to-noise ratio, a nonlinear quantization and an image filtering method based on a multi-scale autoregressive model are proposed here. A road extraction algorithm based on information fusion, which considers both ratio and direction information, is also proposed. Main road directions can be extracted by applying the Radon transform. Cross interference can be suppressed, and road continuity can then be improved by main-direction alignment and secondary road extraction. An HJ-1-C SAR image acquired over Wuhan, China was used to evaluate the proposed method. The experimental results show good performance, with correctness (80.5%) and quality (70.1%), when applied to a SAR image with complex content.
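
    The direction-extraction step can be pictured with the Radon transform as implemented in scikit-image: a bright linear feature produces a sharp peak in the sinogram at an angle tied to the line's orientation. The sketch below is a generic demonstration on synthetic data, not the HJ-1-C processing chain.

    ```python
    import numpy as np
    from skimage.transform import radon

    # Synthetic binary "road layer": one straight road at ~30 degrees.
    img = np.zeros((128, 128))
    for x in range(128):
        y = int(0.58 * x)  # slope of tan(30 deg)
        if 0 <= y < 128:
            img[y, x] = 1.0

    theta = np.arange(180.0)
    sinogram = radon(img, theta=theta, circle=False)

    # The projection angle with the strongest peak indicates the
    # dominant linear feature's orientation.
    main_angle = theta[np.argmax(sinogram.max(axis=0))]
    print("dominant direction (deg):", main_angle)
    ```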

  9. The Integrated Information System for Natural Disaster Mitigation

    Directory of Open Access Journals (Sweden)

    Junxiu Wu

    2007-08-01

    Full Text Available Supported by the World Bank, the Integrated Information System for Natural Disaster Mitigation (ISNDM), including the operational service system and the network telecommunication system, has been in development for three years at the Center of Disaster Reduction, Chinese Academy of Sciences, on the platform of the GIS software ArcView. It has five main modules: disaster background information, socio-economic information, a disaster-inducing factors database, a disaster scenarios database, and disaster assessment. ISNDM has several significant functions, including information collection, information processing, data storage, and information distribution. It is a simple but comprehensive demonstration system for our national center for natural disaster reduction.

  10. Automated granularity to integrate digital information: the "Antarctic Treaty Searchable Database" case study

    Directory of Open Access Journals (Sweden)

    Paul Arthur Berkman

    2006-06-01

    Full Text Available Access to information is necessary, but not sufficient in our digital era. The challenge is to objectively integrate digital resources based on user-defined objectives for the purpose of discovering information relationships that facilitate interpretations and decision making. The Antarctic Treaty Searchable Database (http://aspire.nvi.net), which is in its sixth edition, provides an example of digital integration based on the automated generation of information granules that can be dynamically combined to reveal objective relationships within and between digital information resources. This case study further demonstrates that automated granularity and dynamic integration can be accomplished simply by utilizing the inherent structure of the digital information resources. Such information integration is relevant to library and archival programs that require long-term preservation of authentic digital resources.

  11. Integrating hospital information systems in healthcare institutions: a mediation architecture.

    Science.gov (United States)

    El Azami, Ikram; Cherkaoui Malki, Mohammed Ouçamah; Tahon, Christian

    2012-10-01

    Many studies have examined the integration of information systems into healthcare institutions, leading to several standards in the healthcare domain (CORBAmed: Common Object Request Broker Architecture in Medicine; HL7: Health Level Seven International; DICOM: Digital Imaging and Communications in Medicine; and IHE: Integrating the Healthcare Enterprise). Due to the existence of a wide diversity of heterogeneous systems, three essential factors are necessary to fully integrate a system: data, functions and workflow. However, most of the previous studies have dealt with only one or two of these factors and this makes the system integration unsatisfactory. In this paper, we propose a flexible, scalable architecture for Hospital Information Systems (HIS). Our main purpose is to provide a practical solution to insure HIS interoperability so that healthcare institutions can communicate without being obliged to change their local information systems and without altering the tasks of the healthcare professionals. Our architecture is a mediation architecture with 3 levels: 1) a database level, 2) a middleware level and 3) a user interface level. The mediation is based on two central components: the Mediator and the Adapter. Using the XML format allows us to establish a structured, secured exchange of healthcare data. The notion of medical ontology is introduced to solve semantic conflicts and to unify the language used for the exchange. Our mediation architecture provides an effective, promising model that promotes the integration of hospital information systems that are autonomous, heterogeneous, semantically interoperable and platform-independent.

  12. Assessing Extinction Risk: Integrating Genetic Information

    Directory of Open Access Journals (Sweden)

    Jason Dunham

    1999-06-01

    Full Text Available Risks of population extinction have been estimated using a variety of methods incorporating information from different spatial and temporal scales. We briefly consider how several broad classes of extinction risk assessments, including population viability analysis, incidence functions, and ranking methods integrate information on different temporal and spatial scales. In many circumstances, data from surveys of neutral genetic variability within, and among, populations can provide information useful for assessing extinction risk. Patterns of genetic variability resulting from past and present ecological and demographic events, can indicate risks of extinction that are otherwise difficult to infer from ecological and demographic analyses alone. We provide examples of how patterns of neutral genetic variability, both within, and among populations, can be used to corroborate and complement extinction risk assessments.

  13. Dual-wavelength phase-shifting digital holography selectively extracting wavelength information from wavelength-multiplexed holograms.

    Science.gov (United States)

    Tahara, Tatsuki; Mori, Ryota; Kikunaga, Shuhei; Arai, Yasuhiko; Takaki, Yasuhiro

    2015-06-15

    Dual-wavelength phase-shifting digital holography that selectively extracts wavelength information from five wavelength-multiplexed holograms is presented. Specific phase shifts for respective wavelengths are introduced to remove the crosstalk components and extract only the object wave at the desired wavelength from the holograms. Object waves in multiple wavelengths are selectively extracted by utilizing 2π ambiguity and the subtraction procedures based on phase-shifting interferometry. Numerical results show the validity of the proposed technique. The proposed technique is also experimentally demonstrated.
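
    For background, in conventional single-wavelength four-step phase-shifting interferometry, with reference phase shifts of 0, π/2, π and 3π/2, the object phase is recovered from the four recorded intensities as shown below; the five-hologram, wavelength-multiplexed scheme of the paper generalizes this idea with wavelength-specific shifts.

    ```latex
    \varphi(x, y) = \arctan\frac{I_{4}(x, y) - I_{2}(x, y)}{I_{1}(x, y) - I_{3}(x, y)}
    ```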

  14. Mediator infrastructure for information integration and semantic data integration environment for biomedical research.

    Science.gov (United States)

    Grethe, Jeffrey S; Ross, Edward; Little, David; Sanders, Brian; Gupta, Amarnath; Astakhov, Vadim

    2009-01-01

    This paper presents current progress in the development of a semantic data integration environment, which is part of the Biomedical Informatics Research Network (BIRN; http://www.nbirn.net) project. BIRN is sponsored by the National Center for Research Resources (NCRR), a component of the National Institutes of Health (NIH). A goal is the development of a cyberinfrastructure for biomedical research that supports advanced data acquisition, data storage, data management, data integration, data mining, data visualization, and other computing and information processing services over the Internet. Each participating institution maintains storage of its experimental or computationally derived data. A mediator-based data integration system performs semantic integration over the databases to enable researchers to perform analyses based on larger and broader datasets than would be available from any single institution's data. This paper describes a recent revision of the system architecture, implementation, and capabilities of the semantically based data integration environment for BIRN.

  15. A Framework for Understanding Post-Merger Information Systems Integration

    DEFF Research Database (Denmark)

    Alaranta, Maria; Kautz, Karlheinz

    2012-01-01

    This paper develops a theoretical framework for the integration of information systems (IS) after a merger or an acquisition. The framework integrates three perspectives: a structuralist, an individualist, and an interactive process perspective to analyze and understand such integrations. The framework is applied to a longitudinal case study of a manufacturing company that grew through an acquisition. The management decided to integrate the production control IS via tailoring a new system that blends together features of existing IS. The application of the framework in the case study confirms several known impediments to IS integrations. It also identifies a number of new inhibitors, as well as known and new facilitators, that can bring post-merger IS integration to success. Our findings provide relevant insights for researching and managing post-merger IS integrations. They emphasize

  17. The Extraction of Post-Earthquake Building Damage Information Based on Convolutional Neural Network

    Science.gov (United States)

    Chen, M.; Wang, X.; Dou, A.; Wu, X.

    2018-04-01

    The seismic damage information of buildings extracted from remote sensing (RS) imagery is meaningful for supporting relief and the effective reduction of losses caused by earthquakes. Both traditional pixel-based and object-oriented methods have shortcomings in extracting information about objects: pixel-based methods cannot make full use of the contextual information of objects, while object-oriented methods face the problems that image segmentation is often not ideal and the choice of feature space is difficult. In this paper, a new strategy is proposed which combines a Convolutional Neural Network (CNN) with image segmentation to extract building damage information from remote sensing imagery. The key idea of this method comprises two steps: first, use the CNN to predict the probability of each pixel; then integrate the probabilities within each segmentation spot. The method is tested by extracting collapsed and uncollapsed buildings from an aerial image acquired over Longtoushan Town after the Ms 6.5 Ludian County, Yunnan Province earthquake. The results show that the proposed method is effective in extracting building damage information after an earthquake.
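
    The second step of the strategy, integrating per-pixel CNN probabilities within each segmentation spot, amounts to averaging the probability map over each segment and thresholding the mean. A compact NumPy sketch, with random stand-ins for the CNN output and the segmentation, follows.

    ```python
    import numpy as np

    def label_segments(prob_map, segments, threshold=0.5):
        """Average per-pixel damage probabilities within each segment
        and label the whole segment by the thresholded mean."""
        labels = {}
        for seg_id in np.unique(segments):
            mean_p = prob_map[segments == seg_id].mean()
            labels[int(seg_id)] = ("collapsed" if mean_p > threshold
                                   else "uncollapsed")
        return labels

    # Stand-ins: a CNN probability map and a 4-segment partition.
    rng = np.random.default_rng(7)
    prob = rng.uniform(0, 1, size=(32, 32))
    segs = np.repeat(np.arange(4), 256).reshape(32, 32)
    print(label_segments(prob, segs))
    ```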

  18. Information theoretic approach to tactile encoding and discrimination

    OpenAIRE

    Saal, Hannes

    2011-01-01

    The human sense of touch integrates feedback from a multitude of touch receptors, but how this information is represented in the neural responses such that it can be extracted quickly and reliably is still largely an open question. At the same time, dexterous robots equipped with touch sensors are becoming more common, necessitating better methods for representing sequentially updated information and new control strategies that aid in extracting relevant features for object man...

  19. Extracting information from two-dimensional electrophoresis gels by partial least squares regression

    DEFF Research Database (Denmark)

    Jessen, Flemming; Lametsch, R.; Bendixen, E.

    2002-01-01

    Two-dimensional gel electrophoresis (2-DE) produces large amounts of data, and extraction of relevant information from these data demands a cautious and time-consuming process of spot pattern matching between gels. The classical approach of data analysis is to detect protein markers that appear or disappear depending on the experimental conditions. Such biomarkers are found by comparing the relative volumes of individual spots in the individual gels. Multivariate statistical analysis and modelling of 2-DE data for comparison and classification is an alternative approach utilising the combination of all proteins/spots in the gels. In the present study it is demonstrated how information can be extracted by multivariate data analysis. The strategy is based on partial least squares regression followed by variable selection to find proteins that individually or in combination with other proteins vary...
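
    A minimal sketch of that strategy, under assumptions: spot volumes form the X matrix (one row per gel), y is the response to be predicted, and variable selection is approximated here by ranking spots on absolute PLS regression coefficients (the study's actual selection procedure may differ).

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(20, 300))       # 20 gels x 300 spot volumes (synthetic)
        y = X[:, 5] - 0.5 * X[:, 42] + rng.normal(scale=0.1, size=20)

        pls = PLSRegression(n_components=2)
        pls.fit(X, y)

        # Rank spots by coefficient magnitude as a crude variable selection.
        importance = np.abs(pls.coef_).ravel()
        print("candidate marker spots:", np.argsort(importance)[::-1][:5])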

  20. Analysis of Factors Affect to Organizational Performance In Using Accounting Information Systems Through Users Satisfaction and Integration Information Systems

    Directory of Open Access Journals (Sweden)

    Anton Arisman

    2017-09-01

    Full Text Available The aim of this research is to investigate the factors affecting organizational performance in using accounting information systems through user satisfaction and information systems integration. The research respondents were 447 companies listed on the Indonesian Stock Exchange. The data were gathered through a consensus method, and in total there are 176 responses with complete data. A Structural Equation Model (SEM) is used in analyzing the data, and system theory is utilized in this research. The results show that knowledge management systems and management control systems have a significant influence on user satisfaction and information systems integration. Information systems integration and user satisfaction have a positive significant effect on organizational performance.

  1. On the effects of multimodal information integration in multitasking.

    Science.gov (United States)

    Stock, Ann-Kathrin; Gohil, Krutika; Huster, René J; Beste, Christian

    2017-07-07

    There have recently been considerable advances in our understanding of the neuronal mechanisms underlying multitasking, but the role of multimodal integration for this faculty has remained rather unclear. We examined this issue by comparing different modality combinations in a multitasking (stop-change) paradigm. In-depth neurophysiological analyses of event-related potentials (ERPs) were conducted to complement the obtained behavioral data. Specifically, we applied signal decomposition using second order blind identification (SOBI) to the multi-subject ERP data and source localization. We found that both general multimodal information integration and modality-specific aspects (potentially related to task difficulty) modulate behavioral performance and associated neurophysiological correlates. Simultaneous multimodal input generally increased early attentional processing of visual stimuli (i.e. P1 and N1 amplitudes) as well as measures of cognitive effort and conflict (i.e. central P3 amplitudes). Yet, tactile-visual input caused larger impairments in multitasking than audio-visual input. General aspects of multimodal information integration modulated the activity in the premotor cortex (BA 6) as well as different visual association areas concerned with the integration of visual information with input from other modalities (BA 19, BA 21, BA 37). On top of this, differences in the specific combination of modalities also affected performance and measures of conflict/effort originating in prefrontal regions (BA 6).

  2. Towards integrated biorefinery from dried distillers grains: Selective extraction of pentoses using dilute acid hydrolysis

    International Nuclear Information System (INIS)

    Fonseca, Dania A.; Lupitskyy, Robert; Timmons, David; Gupta, Mayank; Satyavolu, Jagannadh

    2014-01-01

    The abundant availability and high level of hemicellulose content make dried distillers grains (DDG) an attractive feedstock for production of pentoses (C5) and conversion of C5 to bioproducts. One target of this work was to produce a C5 extract (hydrolyzate) with high yield and purity and a low concentration of C5 degradation products. A high selectivity towards pentoses was achieved using dilute acid hydrolysis of DDG in a percolation reactor with liquid recirculation. Pretreatment of the starting material using screening and ultrasonication resulted in a fractional increase of the pentose yield by 42%. A 94% yield of pentoses on the DDG (280.9 g kg^-1) was obtained. Selective extraction of individual pentoses has been achieved by using a 2-stage hydrolysis process, resulting in arabinose-rich (arabinose 81.5%) and xylose-rich (xylose 85.2%) streams. A broader impact of this work is towards an Integrated Bio-Refinery based on DDG – for production of biofuels, biochemical intermediates, and other bioproducts. - Highlights: • A process for selective extraction of pentoses from DDG was presented as a part of an integrated biorefinery approach. • The selectivity for pentoses was high using dilute acid hydrolysis in a percolation reactor with liquid recirculation. • Pretreatment of DDG using screening and ultrasonication resulted in a fractional increase of the pentose yield by 42%. • A 94% yield in pentoses (280.9 g kg^-1 of DDG) was obtained. • A 2-stage hydrolysis process, developed to extract individual pentoses, resulted in arabinose- and xylose-rich streams

  3. Chaotic spectra: How to extract dynamic information

    International Nuclear Information System (INIS)

    Taylor, H.S.; Gomez Llorente, J.M.; Zakrzewski, J.; Kulander, K.C.

    1988-10-01

    Nonlinear dynamics is applied to chaotic unassignable atomic and molecular spectra with the aim of extracting detailed information about regular dynamic motions that exist over short intervals of time. It is shown how this motion can be extracted from high resolution spectra by doing low resolution studies or by Fourier transforming limited regions of the spectrum. These motions mimic those of periodic orbits (PO) and are inserts into the dominant chaotic motion. Considering these inserts and the PO as a dynamically decoupled region of space, resonant scattering theory and stabilization methods enable us to compute ladders of resonant states which interact with the chaotic quasi-continuum computed in principle from basis sets placed off the PO. The interaction of the resonances with the quasicontinuum explains the low resolution spectra seen in such experiments. It also allows one to associate low resolution features with a particular PO. The motion on the PO thereby supplies the molecular movements whose quantization causes the low resolution spectra. Characteristic properties of the periodic orbit based resonances are discussed. The method is illustrated on the photoabsorption spectrum of the hydrogen atom in a strong magnetic field and on the photodissociation spectrum of H3+. Other molecular systems which are currently under investigation using this formalism are also mentioned. 53 refs., 10 figs., 2 tabs

  4. Automated Extraction of Substance Use Information from Clinical Texts.

    Science.gov (United States)

    Wang, Yan; Chen, Elizabeth S; Pakhomov, Serguei; Arsoniadis, Elliot; Carter, Elizabeth W; Lindemann, Elizabeth; Sarkar, Indra Neil; Melton, Genevieve B

    2015-01-01

    Within clinical discourse, social history (SH) includes important information about substance use (alcohol, drug, and nicotine use) as key risk factors for disease, disability, and mortality. In this study, we developed and evaluated a natural language processing (NLP) system for automated detection of substance use statements and extraction of substance use attributes (e.g., temporal and status) based on Stanford Typed Dependencies. The developed NLP system leveraged linguistic resources and domain knowledge from a multi-site social history study, PropBank, and the MiPACQ corpus. The system attained F-scores of 89.8, 84.6, and 89.4, respectively, for alcohol, drug, and nicotine use statement detection, as well as average F-scores of 82.1, 90.3, 80.8, 88.7, 96.6, and 74.5, respectively, for extraction of attributes. Our results suggest that NLP systems can achieve good performance when augmented with linguistic resources and domain knowledge when applied to a wide breadth of substance use free text clinical notes.
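
    The record's approach (detect substance mentions, then read attributes off the parse) can be caricatured with spaCy standing in for Stanford Typed Dependencies. This is a toy sketch, not the evaluated system; the keyword dictionary, the negation/cessation cues, and the example sentence are assumptions, and it requires the en_core_web_sm model to be installed.

        import spacy  # assumes: pip install spacy && python -m spacy download en_core_web_sm

        nlp = spacy.load("en_core_web_sm")
        SUBSTANCES = {"alcohol": "alcohol", "tobacco": "nicotine", "cocaine": "drug"}

        def extract_substance_use(text):
            doc = nlp(text)
            findings = []
            for tok in doc:
                if tok.lemma_.lower() in SUBSTANCES:
                    # Walk up the dependency tree for negation/cessation cues.
                    negated = any(c.dep_ == "neg" for a in tok.ancestors for c in a.children)
                    ceased = any(a.lemma_ in ("deny", "quit", "stop") for a in tok.ancestors)
                    temporal = [e.text for e in doc.ents if e.label_ in ("DATE", "TIME")]
                    findings.append({"substance": SUBSTANCES[tok.lemma_.lower()],
                                     "status": "negated/ceased" if (negated or ceased) else "current",
                                     "temporal": temporal})
            return findings

        print(extract_substance_use("He quit smoking tobacco in 2005 and denies alcohol use."))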

  5. FEMA's Integrated Emergency Management Information System (IEMIS)

    International Nuclear Information System (INIS)

    Jaske, R.T.; Meitzler, W.

    1987-01-01

    FEMA is implementing a computerized system for use in optimizing planning and for supporting exercises of these plans. Called the Integrated Emergency Management Information System (IEMIS), it consists of a base geographic information system upon which analytical models are superimposed in order to load data and report results analytically. At present, it supports FEMA's work in offsite preparedness around nuclear power stations, but it is being developed to deal with a full range of natural and technological accident hazards for which emergency evacuation or population movement is required.

  6. The integral and extrinsic bioactive proteins in the aqueous extracted soybean oil bodies.

    Science.gov (United States)

    Zhao, Luping; Chen, Yeming; Cao, Yanyun; Kong, Xiangzhen; Hua, Yufei

    2013-10-09

    Soybean oil bodies (OBs), naturally pre-emulsified soybean oil, have been examined by many researchers owing to their great potential utilization in food, cosmetics, pharmaceutical, and other applications requiring stable oil-in-water emulsions. This study is the first to confirm that lectin, Gly m Bd 28K (Bd 28K, a soybean allergenic protein), Kunitz trypsin inhibitor (KTI), and Bowman-Birk inhibitor (BBI) are not contained in extracted soybean OBs, even under neutral pH aqueous extraction. It was clarified that the well-known Gly m Bd 30K (Bd 30K), another soybean allergenic protein, is strongly bound to soybean OBs through a disulfide bond with the 24 kDa oleosin. One steroleosin isoform (41 kDa) and two caleosin isoforms (27 kDa, 29 kDa), the integral bioactive proteins, were confirmed for the first time in soybean OBs, and a considerable amount of calcium, necessary for the biological activities of caleosin, was strongly bound to the OBs. Unexpectedly, it was found that the 24 kDa and 18 kDa oleosins could be hydrolyzed by an unknown soybean endoprotease in the extracted soybean OBs, which might give some hints for improving the enzyme-assisted aqueous extraction processing of soybean free oil.

  7. Network and Ensemble Enabled Entity Extraction in Informal Text (NEEEEIT) final report

    Energy Technology Data Exchange (ETDEWEB)

    Kegelmeyer, Philip W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shead, Timothy M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dunlavy, Daniel M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-09-01

    This SAND report summarizes the activities and outcomes of the Network and Ensemble Enabled Entity Extraction in Informal Text (NEEEEIT) LDRD project, which addressed improving the accuracy of conditional random fields for named entity recognition through the use of ensemble methods.
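
    The report's combination of conditional random fields with ensemble methods can be sketched as bagging with per-token majority voting, using the sklearn-crfsuite package. The features, the toy data, and the choice of bagging are illustrative assumptions; the project's actual ensemble construction is not reproduced here.

        import random
        import sklearn_crfsuite  # assumes: pip install sklearn-crfsuite

        def feats(sent, i):
            return {"lower": sent[i].lower(), "istitle": sent[i].istitle(),
                    "prev": sent[i - 1].lower() if i else "<s>",
                    "next": sent[i + 1].lower() if i < len(sent) - 1 else "</s>"}

        def featurize(sents):
            return [[feats(s, i) for i in range(len(s))] for s in sents]

        sents = [["John", "lives", "in", "Boston"], ["Mary", "visited", "Paris"]]
        tags = [["B-PER", "O", "O", "B-LOC"], ["B-PER", "O", "B-LOC"]]

        random.seed(1)
        models = []
        for _ in range(5):                                        # bagged ensemble of CRFs
            idx = [random.randrange(len(sents)) for _ in sents]   # bootstrap resample
            crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
            crf.fit(featurize([sents[i] for i in idx]), [tags[i] for i in idx])
            models.append(crf)

        def ensemble_tag(sent):
            votes = [m.predict(featurize([sent]))[0] for m in models]
            # Per-token majority vote across ensemble members.
            return [max(set(col), key=col.count) for col in zip(*votes)]

        print(ensemble_tag(["John", "visited", "Boston"]))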

  8. Integrating knowledge based functionality in commercial hospital information systems.

    Science.gov (United States)

    Müller, M L; Ganslandt, T; Eich, H P; Lang, K; Ohmann, C; Prokosch, H U

    2000-01-01

    Successful integration of knowledge-based functions in the electronic patient record depends on direct and context-sensitive accessibility and availability to clinicians and must suit their workflow. In this paper we describe an exemplary integration of an existing standalone scoring system for acute abdominal pain into two different commercial hospital information systems using Java/CORBA technology.

  9. Three-dimensional information extraction from GaoFen-1 satellite images for landslide monitoring

    Science.gov (United States)

    Wang, Shixin; Yang, Baolin; Zhou, Yi; Wang, Futao; Zhang, Rui; Zhao, Qing

    2018-05-01

    To more efficiently use GaoFen-1 (GF-1) satellite images for landslide emergency monitoring, a Digital Surface Model (DSM) can be generated from GF-1 across-track stereo image pairs to build a terrain dataset. This study proposes a landslide 3D information extraction method based on the terrain changes of slope objects. The slope objects are merged from segmented image objects that have similar aspects, and the terrain changes are calculated from the post-disaster Digital Elevation Model (DEM) from GF-1 and the pre-disaster DEM from GDEM V2. A high mountain landslide that occurred in Wenchuan County, Sichuan Province is used to conduct a 3D information extraction test. The extracted total area of the landslide is 22.58 ha; the displaced earth volume is 652,100 m3; and the average sliding direction is 263.83°. Their accuracies are 0.89, 0.87 and 0.95, respectively. Thus, the proposed method expands the application of GF-1 satellite images to the field of landslide emergency monitoring.
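
    The terrain-change computation behind the area and volume figures can be sketched by differencing the pre- and post-event DEMs over a slope object. The cell size, the toy grids, and the use of the depletion zone for volume are assumptions for illustration; the extraction of sliding direction from aspect is omitted.

        import numpy as np

        CELL_AREA = 4.0  # m^2 per cell, e.g. a 2 m x 2 m grid (assumed)

        def landslide_3d_info(dem_pre, dem_post, slope_mask):
            """Area and displaced volume of one slope object from DEM differencing."""
            dh = (dem_post - dem_pre)[slope_mask]                  # elevation change (m)
            area_ha = float(slope_mask.sum() * CELL_AREA / 1e4)    # object area in hectares
            volume_m3 = float(-dh[dh < 0].sum() * CELL_AREA)       # material lost in depletion zone
            return {"area_ha": area_ha, "volume_m3": volume_m3, "mean_dh_m": float(dh.mean())}

        pre = np.zeros((3, 3))
        post = np.full((3, 3), -1.0)                               # uniform 1 m elevation drop
        print(landslide_3d_info(pre, post, np.ones((3, 3), bool)))
        # -> {'area_ha': 0.0036, 'volume_m3': 36.0, 'mean_dh_m': -1.0}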

  10. REVIEW PAPER ON THE DEEP WEB DATA EXTRACTION

    OpenAIRE

    Patil, V. S.; Sitafale, Sneha; Kale, Priyanka; Bhujbal, Poonam; Dandge, Mohini

    2018-01-01

    Deep web data extraction is the process of extracting a set of data records and the items that they contain from a query result page. Such structured data can be later integrated into results from other data sources and given to the user in a single, cohesive view. Domain identification is used to identify the query interfaces related to the domain from the forms obtained in the search process. The surface web contains a large amount of unfiltered information, whereas the deep web includes hi...

  11. Lithium NLP: A System for Rich Information Extraction from Noisy User Generated Text on Social Media

    OpenAIRE

    Bhargava, Preeti; Spasojevic, Nemanja; Hu, Guoning

    2017-01-01

    In this paper, we describe the Lithium Natural Language Processing (NLP) system - a resource-constrained, high-throughput and language-agnostic system for information extraction from noisy user generated text on social media. Lithium NLP extracts a rich set of information including entities, topics, hashtags and sentiment from text. We discuss several real world applications of the system currently incorporated in Lithium products. We also compare our system with existing commercial and acad...

  12. System and method for extracting physiological information from remotely detected electromagnetic radiation

    NARCIS (Netherlands)

    2016-01-01

    The present invention relates to a device and a method for extracting physiological information indicative of at least one health symptom from remotely detected electromagnetic radiation. The device comprises an interface (20) for receiving a data stream comprising remotely detected image data

  13. System and method for extracting physiological information from remotely detected electromagnetic radiation

    NARCIS (Netherlands)

    2015-01-01

    The present invention relates to a device and a method for extracting physiological information indicative of at least one health symptom from remotely detected electromagnetic radiation. The device comprises an interface (20) for receiving a data stream comprising remotely detected image data

  14. ABOUT APPROACHES OF CREATION OF INTEGRATED INFORMATION SYSTEM PDM-ERP

    Directory of Open Access Journals (Sweden)

    V. G. Mikhailov

    2016-01-01

    Full Text Available The paper considers problems that have accumulated in the field of PDM system development and their integration with ERP. An analysis of the reasons for the low efficiency of existing PDM systems is carried out: the insufficiency of the primary information entered into the PDM unit, the structure of the database, the entry of designations into a single field, and the use of referential management of product composition, which lowers functionality and creates problems with ERP integration. It is shown that enterprises with a full production cycle need a unified integrated information system built on unified databases, one that uses the part-BOM-unit card, instead of a file, as the primary document. For this, a general-purpose database structure different from existing ones is necessary, into which any information can be entered. The implementation of a new CDRP system is proposed, which unites PDM and ERP functionality and provides for the enterprise's basic needs.

  15. Unifying Kohlberg with Information Integration: The Moral Algebra of Recompense and of Kohlbergian Moral Informers

    Directory of Open Access Journals (Sweden)

    Wilfried Hommers

    2010-01-01

    Full Text Available In order to unify two major theories of moral judgment, a novel task is employed which combines elements of Kohlberg's stage theory and of the theory of information integration. In contrast to the format of Kohlberg's moral judgment interview, a nonverbal and quantitative response which makes low demands on verbal facility was used. Moral informers differing in value, i.e. high and low, are presented. The differences in effect of those two pieces of information should be substantial for a person at that specific moral stage, but small for a person at a different stage. Hence, these differences may diagnose the person's moral stage in the simplest possible way, as the two levels of each of the thoughts were about typical content of the four Kohlbergian preconventional and conventional stages. The novel task additionally allowed measuring the influence of another, non-Kohlbergian moral concept, recompense. After a training phase, pairs of those thoughts were presented to allow for the study of integration and individual differences. German and Korean children, 8, 10, and 12 years of age, judged deserved punishment. The patterns of means, correlations, and factor loadings showed that elements of both theories can be unified, but also produced unexpected results. Additive integration of each of the two pairs of moral informers appeared, either with two Kohlbergian moral informers or with one Kohlbergian moral informer in combination with information about recompense. Cultural independence as well as dependence, developmental changes between 8 and 10 years, and an outstanding moral impact of recompense in size and distinctiveness were also observed.
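
    In information-integration notation, the additivity reported here corresponds to the standard weighted-sum rule (a textbook formulation from integration theory, not an equation quoted from the article):

        $$ J = w_1 s_1 + w_2 s_2 $$

    where $s_1$ and $s_2$ are the subjective scale values of the two moral informers (for example, a Kohlbergian thought and recompense), $w_1$ and $w_2$ their weights, and $J$ the deserved-punishment judgment; parallel curves in the factorial plot are the usual diagnostic of this additive rule.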

  16. How Does Alkali Aid Protein Extraction in Green Tea Leaf Residue: A Basis for Integrated Biorefinery of Leaves

    Science.gov (United States)

    Zhang, Chen; Sanders, Johan P. M.; Xiao, Ting T.; Bruins, Marieke E.

    2015-01-01

    Leaf protein can be obtained cost-efficiently by alkaline extraction, but overuse of chemicals and low quality of (denatured) protein limits its application. The research objective was to investigate how alkali aids protein extraction of green tea leaf residue, and use these results for further improvements in alkaline protein biorefinery. Protein extraction yield was studied for correlation to morphology of leaf tissue structure, protein solubility and hydrolysis degree, and yields of non-protein components obtained at various conditions. Alkaline protein extraction was not facilitated by increased solubility or hydrolysis of protein, but positively correlated to leaf tissue disruption. HG pectin, RGII pectin, and organic acids were extracted before protein extraction, which was followed by the extraction of cellulose and hemi-cellulose. RGI pectin and lignin were both linear to protein yield. The yields of these two components were 80% and 25% respectively when 95% protein was extracted, which indicated that RGI pectin is more likely to be the key limitation to leaf protein extraction. An integrated biorefinery was designed based on these results. PMID:26200774

  17. How Does Alkali Aid Protein Extraction in Green Tea Leaf Residue: A Basis for Integrated Biorefinery of Leaves.

    Directory of Open Access Journals (Sweden)

    Chen Zhang

    Full Text Available Leaf protein can be obtained cost-efficiently by alkaline extraction, but overuse of chemicals and low quality of (denatured) protein limits its application. The research objective was to investigate how alkali aids protein extraction of green tea leaf residue, and use these results for further improvements in alkaline protein biorefinery. Protein extraction yield was studied for correlation to morphology of leaf tissue structure, protein solubility and hydrolysis degree, and yields of non-protein components obtained at various conditions. Alkaline protein extraction was not facilitated by increased solubility or hydrolysis of protein, but positively correlated to leaf tissue disruption. HG pectin, RGII pectin, and organic acids were extracted before protein extraction, which was followed by the extraction of cellulose and hemi-cellulose. RGI pectin and lignin were both linear to protein yield. The yields of these two components were 80% and 25% respectively when 95% protein was extracted, which indicated that RGI pectin is more likely to be the key limitation to leaf protein extraction. An integrated biorefinery was designed based on these results.

  18. Design of Integrated Database on Mobile Information System: A Study of Yogyakarta Smart City App

    Science.gov (United States)

    Nurnawati, E. K.; Ermawati, E.

    2018-02-01

    An integration database is a database which acts as the data store for multiple applications and thus integrates data across these applications (in contrast to an application database). An integration database needs a schema that takes all its client applications into account. The benefit of such a schema is that sharing data among applications does not require an extra layer of integration services on the applications. Any changes to data made in a single application are made available to all applications at the time of database commit, thus keeping the applications' data use better synchronized. This study aims to design and build an integrated database that can be used by various applications on a mobile device platform based on the smart city system. The built database can be used by various applications, whether used together or separately. The design and development of the database emphasize flexibility, security, and completeness of the attributes that can be used together by the various applications to be built. The method used in this study is to choose an appropriate logical database structure (pattern of data) and to build the relational database model (database design). The resulting design is tested with some prototype apps, and system performance is analyzed with test data. The integrated database can be utilized by both the admin and the user in an integral and comprehensive platform. This system can help admins, managers, and operators in managing the application easily and efficiently. This Android-based app is built on a dynamic client-server model where data is extracted from an external MySQL database. So if there is a change of data in the database, then the data in the Android applications will also change. This Android app assists users in searching for information related to Yogyakarta (as a smart city), especially on culture, government, hotels, and transportation.

  19. DEXTER: Disease-Expression Relation Extraction from Text.

    Science.gov (United States)

    Gupta, Samir; Dingerdissen, Hayley; Ross, Karen E; Hu, Yu; Wu, Cathy H; Mazumder, Raja; Vijay-Shanker, K

    2018-01-01

    Gene expression levels affect biological processes and play a key role in many diseases. Characterizing expression profiles is useful for clinical research, and diagnostics and prognostics of diseases. There are currently several high-quality databases that capture gene expression information, obtained mostly from large-scale studies, such as microarray and next-generation sequencing technologies, in the context of disease. The scientific literature is another rich source of information on gene expression-disease relationships that not only have been captured from large-scale studies but have also been observed in thousands of small-scale studies. Expression information obtained from literature through manual curation can extend expression databases. While many of the existing databases include information from literature, they are limited by the time-consuming nature of manual curation and have difficulty keeping up with the explosion of publications in the biomedical field. In this work, we describe an automated text-mining tool, Disease-Expression Relation Extraction from Text (DEXTER), to extract information from literature on gene and microRNA expression in the context of disease. One of the motivations in developing DEXTER was to extend the BioXpress database, a cancer-focused gene expression database that includes data derived from large-scale experiments and manual curation of publications. The literature-based portion of BioXpress lags behind significantly compared to expression information obtained from large-scale studies and can benefit from our text-mined results. We have conducted two different evaluations to measure the accuracy of our text-mining tool and achieved average F-scores of 88.51 and 81.81% for the two evaluations, respectively. Also, to demonstrate the ability to extract rich expression information in different disease-related scenarios, we used DEXTER to extract differential expression information for 2024 genes in lung

  20. Extracting implicit information in English advertising texts with phonetic and lexical-morphological means

    Directory of Open Access Journals (Sweden)

    Traikovskaya Natalya Petrovna

    2015-12-01

    Full Text Available The article deals with the phonetic and lexical-morphological language means participating in the process of extracting implicit information from English-language advertising texts for men and women. The functioning of the phonetic means of the English language is not, by itself, a basis for the implication of information in advertising texts. Lexical and morphological means play the role of markers of relevant information, acting as activators of implicit information in the texts of advertising.

  1. Specification of an integrated information architecture for a mobile teleoperated robot for home telecare.

    Science.gov (United States)

    Iannuzzi, David; Grant, Andrew; Corriveau, Hélène; Boissy, Patrick; Michaud, Francois

    2016-12-01

    The objective of this study was to design an effectively integrated information architecture for a mobile teleoperated robot for remote assistance in the delivery of home health care. Three role classes were identified related to the deployment of a telerobot, namely, engineer, technology integrator, and health professional. Patients and natural caregivers were considered indirectly, this being a component of future field studies. Interviewing representatives of each class provided the functions, and the information content and flows for each function. Interview transcripts enabled the formulation of UML (Unified Modeling Language) diagrams for feedback from participants. The proposed information architecture was validated with a use-case scenario. The integrated information architecture incorporates progressive design, ergonomic integration, and the home care needs from medical specialist, nursing, physiotherapy, occupational therapy, and social worker care perspectives. The iterative process of building the integrated architecture promoted insight among participants. The use-case scenario evaluation showed the design's robustness. Complex innovation such as a telerobot must coherently mesh with health-care service delivery needs. The deployment of an integrated information architecture bridging development with specialist and home care applications is necessary for home care technology innovation. It enables the continuing evolution of robot and novel health information design within the same integrated architecture, while accounting for patients' ecological needs.

  2. Integrating Multiple On-line Knowledge Bases for Disease-Lab Test Relation Extraction.

    Science.gov (United States)

    Zhang, Yaoyun; Soysal, Ergin; Moon, Sungrim; Wang, Jingqi; Tao, Cui; Xu, Hua

    2015-01-01

    A computable knowledge base containing relations between diseases and lab tests would be a great resource for many biomedical informatics applications. This paper describes our initial step towards establishing a comprehensive knowledge base of disease-lab test relations utilizing three public on-line resources. LabTestsOnline, MedlinePlus and Wikipedia are integrated to create a freely available, computable disease-lab test knowledge base. Disease and lab test concepts are identified using MetaMap, and relations between diseases and lab tests are determined based on source-specific rules. Experimental results demonstrate a high precision for relation extraction, with Wikipedia achieving the highest precision of 87%. Combining the three sources reached a recall of 51.40% when compared with a subset of disease-lab test relations extracted from a reference book. Moreover, we found additional disease-lab test relations from the on-line resources, indicating that they are complementary to existing reference books for building a comprehensive disease-lab test relation knowledge base.

  3. Methods for Information Extraction from LIDAR Intensity Data and Multispectral LIDAR Technology

    Science.gov (United States)

    Scaioni, M.; Höfle, B.; Baungarten Kersting, A. P.; Barazzetti, L.; Previtali, M.; Wujanz, D.

    2018-04-01

    LiDAR is a consolidated technology for topographic mapping and 3D reconstruction, which is implemented in several platforms. On the other hand, the exploitation of the geometric information has been coupled with the use of laser intensity, which may provide additional data for multiple purposes. This option has been emphasized by the availability of sensors working on different wavelengths, able to provide additional information for the classification of surfaces and objects. Several applications of monochromatic and multi-spectral LiDAR data have already been developed in different fields: geosciences, agriculture, forestry, building and cultural heritage. The use of intensity data to extract measures of point cloud quality has also been developed. The paper gives an overview of the state-of-the-art of these techniques and presents the modern technologies for the acquisition of multispectral LiDAR data. In addition, the ISPRS WG III/5 on 'Information Extraction from LiDAR Intensity Data' has collected and made available a few open data sets to support scholars doing research in this field. This service is presented, and the data sets delivered so far are described.

  4. METHODS FOR INFORMATION EXTRACTION FROM LIDAR INTENSITY DATA AND MULTISPECTRAL LIDAR TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    M. Scaioni

    2018-04-01

    Full Text Available LiDAR is a consolidated technology for topographic mapping and 3D reconstruction, which is implemented in several platforms. On the other hand, the exploitation of the geometric information has been coupled with the use of laser intensity, which may provide additional data for multiple purposes. This option has been emphasized by the availability of sensors working on different wavelengths, able to provide additional information for the classification of surfaces and objects. Several applications of monochromatic and multi-spectral LiDAR data have already been developed in different fields: geosciences, agriculture, forestry, building and cultural heritage. The use of intensity data to extract measures of point cloud quality has also been developed. The paper gives an overview of the state-of-the-art of these techniques and presents the modern technologies for the acquisition of multispectral LiDAR data. In addition, the ISPRS WG III/5 on ‘Information Extraction from LiDAR Intensity Data’ has collected and made available a few open data sets to support scholars doing research in this field. This service is presented, and the data sets delivered so far are described.

  5. A Critical Review of the Integration of Geographic Information System and Building Information Modelling at the Data Level

    Directory of Open Access Journals (Sweden)

    Junxiang Zhu

    2018-02-01

    Full Text Available The benefits brought by the integration of Building Information Modelling (BIM) and Geographic Information Systems (GIS) are being proven by more and more research. The integration of the two systems is difficult for many reasons. Among them, data incompatibility is the most significant, as BIM and GIS data are created, managed, analyzed, stored, and visualized in different ways in terms of coordinate systems, scope of interest, and data structures. The objective of this paper is to review the relevant research papers to (1) identify the most relevant data models used in BIM/GIS integration and understand their advantages and disadvantages; (2) consider the possibility of other data models that are available for data-level integration; and (3) provide direction on the future of BIM/GIS data integration.

  6. Representation and Integration of Scientific Information

    Science.gov (United States)

    1998-01-01

    The objective of this Joint Research Interchange with NASA-Ames was to investigate how the Tsimmis technology could be used to represent and integrate scientific information. The main goal of the Tsimmis project is to allow a decision maker to find information of interest from such sources, fuse it, and process it (e.g., summarize it, visualize it, discover trends). Another important goal is the easy incorporation of new sources, as well as the ability to deal with sources whose structure or services evolve. During the Interchange we had research meetings approximately every month or two. The funds provided by NASA supported work that led to the following two papers: Fusion Queries over Internet Databases; Efficient Query Subscription Processing in a Multicast Environment.

  7. Integrating an Information Literacy Quiz into the Learning Management System

    Science.gov (United States)

    Lowe, M. Sara; Booth, Char; Tagge, Natalie; Stone, Sean

    2014-01-01

    The Claremont Colleges Library Instruction Services Department developed a quiz that could be integrated into the consortial learning management software to accompany a local online, open-source information literacy tutorial. The quiz is integrated into individual course pages, allowing students to receive a grade for completion and improving…

  8. Strengthening Rehabilitation in Health Systems Worldwide by Integrating Information on Functioning in National Health Information Systems.

    Science.gov (United States)

    Stucki, Gerold; Bickenbach, Jerome; Melvin, John

    2017-09-01

    A complete understanding of the experience of health requires information relevant not merely to the health indicators of mortality and morbidity but also to functioning-that is, information about what it means to live in a health state, "the lived experience of health." Not only is functioning information relevant to healthcare and the overall objectives of person-centered healthcare but to the successful operation of all components of health systems.In light of population aging and major epidemiological trends, the health strategy of rehabilitation, whose aim has always been to optimize functioning and minimize disability, will become a key health strategy. The increasing prominence of the rehabilitative strategy within the health system drives the argument for the integration of functioning information as an essential component in national health information systems.Rehabilitation professionals and researchers have long recognized in WHO's International Classification of Functioning, Disability and Health the best prospect for an internationally recognized, sufficiently complete and powerful information reference for the documentation of functioning information. This paper opens the discussion of the promise of integrating the ICF as an essential component in national health systems to secure access to functioning information for rehabilitation, across health systems and countries.

  9. Automated extraction of chemical structure information from digital raster images

    Directory of Open Access Journals (Sweden)

    Shedden Kerby A

    2009-02-01

    Full Text Available Abstract Background To search for chemical structures in research articles, diagrams or text representing molecules need to be translated to a standard chemical file format compatible with cheminformatic search engines. Nevertheless, chemical information contained in research articles is often referenced as analog diagrams of chemical structures embedded in digital raster images. To automate analog-to-digital conversion of chemical structure diagrams in scientific research articles, several software systems have been developed. But their algorithmic performance and utility in cheminformatic research have not been investigated. Results This paper aims to provide critical reviews for these systems and also report our recent development of ChemReader – a fully automated tool for extracting chemical structure diagrams in research articles and converting them into standard, searchable chemical file formats. Basic algorithms for recognizing lines and letters representing bonds and atoms in chemical structure diagrams can be independently run in sequence from a graphical user interface, and the algorithm parameters can be readily changed, to facilitate additional development specifically tailored to a chemical database annotation scheme. Compared with existing software programs such as OSRA, Kekule, and CLiDE, our results indicate that ChemReader outperforms other software systems on several sets of sample images from diverse sources in terms of the rate of correct outputs and the accuracy in extracting molecular substructure patterns. Conclusion The availability of ChemReader as a cheminformatic tool for extracting chemical structure information from digital raster images allows research and development groups to enrich their chemical structure databases by annotating the entries with published research articles. Based on its stable performance and high accuracy, ChemReader may be sufficiently accurate for annotating the chemical database with links

  10. Computer-based information management system for interventional radiology

    International Nuclear Information System (INIS)

    Forman, B.H.; Silverman, S.G.; Mueller, P.R.; Hahn, P.F.; Papanicolaou, N.; Tung, G.A.; Brink, J.A.; Ferrucci, J.T.

    1989-01-01

    The authors developed and implemented a computer-based information management system (CBIMS) for the integrated analysis of data from a variety of abdominal nonvascular interventional procedures. The CBIMS improved on their initial handwritten-card system (which listed only patient name, hospital number, and type of procedure) by capturing relevant patient data in an organized fashion and integrating the information for meaningful analysis. Advantages of the CBIMS include enhanced compilation of the monthly census, easy access to a patient's interventional history, and a flexible querying capability that allows easy extraction of subsets of information from the patient database

  11. Information Integration in Risky Choice: Identification and Stability

    OpenAIRE

    Stewart, Neil

    2011-01-01

    How is information integrated across the attributes of an option when making risky choices? In most descriptive models of decision under risk, information about risk and reward is combined multiplicatively (e.g., expected value; expected utility theory, Bernoulli, 1738/1954; subjective expected utility theory, Savage, 1954; Edwards, 1955; prospect theory, Kahneman and Tversky, 1979; rank-dependent utility, Quiggin, 1993; decision field theory, Busemeyer and To...
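
    The multiplicative risk-reward combination the abstract refers to can be made concrete with the standard textbook forms (not quoted from the article):

        $$ EU(X) = \sum_i p_i \, u(x_i), \qquad V(X) = \sum_i w(p_i) \, v(x_i) $$

    Expected utility multiplies each outcome's utility $u(x_i)$ by its probability $p_i$; prospect theory replaces the probability with a nonlinear decision weight $w(p_i)$ and $u$ with a reference-dependent value function $v$.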

  12. A State-of-the-Art Review on the Integration of Building Information Modeling (BIM and Geographic Information System (GIS

    Directory of Open Access Journals (Sweden)

    Xin Liu

    2017-02-01

    Full Text Available The integration of Building Information Modeling (BIM) and Geographic Information System (GIS) has been identified as a promising but challenging topic to transform information towards the generation of knowledge and intelligence. Achievement of integrating these two concepts and enabling technologies will have a significant impact on solving problems in the civil, building and infrastructure sectors. However, since GIS and BIM were originally developed for different purposes, numerous challenges are being encountered for the integration. To better understand these two different domains, this paper reviews the development and dissimilarities of GIS and BIM and the existing integration methods, and investigates their potential in various applications. This study shows that the integration methods are developed for various reasons and aim to solve different problems. The parameters influencing the choice can be summarized and named as "EEEF" criteria: effectiveness, extensibility, effort, and flexibility. Compared with other methods, semantic web technologies provide a promising and generalized integration solution. However, the biggest challenges of this method are the large efforts required at an early stage and the isolated development of ontologies within one particular domain. The isolation problem also applies to other methods. Therefore, openness is the key to the success of BIM and GIS integration.

  13. Moral Judgment as Information Processing: An Integrative Review

    Directory of Open Access Journals (Sweden)

    Steve eGuglielmo

    2015-10-01

    Full Text Available This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two fundamental questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework, critically evaluates them on empirical and theoretical grounds, outlines a general integrative model grounded in information processing, and offers conceptual and methodological suggestions for future research. The information processing perspective provides a useful theoretical framework for organizing extant and future work in the rapidly growing field of moral judgment.

  14. Integrated environmental monitoring and information system

    International Nuclear Information System (INIS)

    Klinda, J.; Lieskovska, Z.

    1998-01-01

    The concept of environmental monitoring within the territory of the Slovak Republic and the concept of the integrated environmental information system of the Slovak Republic were accepted and confirmed by Government Order No. 449/1992. The state monitoring system covering the whole territory of Slovakia is the most important and consists of 13 Partial Monitoring Systems (PMSs); a list of the PMSs is included. The listed PMSs are managed according to the concept of the Sectoral Information System (SIS) of the Ministry of the Environment of the Slovak Republic (MESR), which was established by the National Council Act No. 261/1995 Coll. on the SIS. The SIS consists of 18 subsystems, which are listed. Overviews of the PMS budgets, as well as of the environmental publications and periodicals of the MESR, are included

  15. Information integration for a sky survey by data warehousing

    Science.gov (United States)

    Luo, A.; Zhang, Y.; Zhao, Y.

    The virtualization service of the data system for the sky survey LAMOST is very important for astronomers. The service needs to integrate information from data collections, catalogs and references, and support simple federation of a set of distributed files and associated metadata. Data warehousing has been in existence for several years and has demonstrated superiority over traditional relational database management systems by providing novel indexing schemes that support efficient on-line analytical processing (OLAP) of large databases. Now relational database systems such as Oracle support the warehouse capability, including extensions to the SQL language to support OLAP operations, and a number of metadata management tools have been created. The information integration of LAMOST by applying data warehousing is to effectively provide data and knowledge on-line.

  16. PKDE4J: Entity and relation extraction for public knowledge discovery.

    Science.gov (United States)

    Song, Min; Kim, Won Chul; Lee, Dahee; Heo, Go Eun; Kang, Keun Young

    2015-10-01

    Due to an enormous number of scientific publications that cannot be handled manually, there is a rising interest in text-mining techniques for automated information extraction, especially in the biomedical field. Such techniques provide effective means of information search, knowledge discovery, and hypothesis generation. Most previous studies have primarily focused on the design and performance improvement of either named entity recognition or relation extraction. In this paper, we present PKDE4J, a comprehensive text-mining system that integrates dictionary-based entity extraction and rule-based relation extraction in a highly flexible and extensible framework. Starting with the Stanford CoreNLP, we developed the system to cope with multiple types of entities and relations. The system also has fairly good performance in terms of accuracy as well as the ability to configure text-processing components. We demonstrate its competitive performance by evaluating it on many corpora and found that it surpasses existing systems with average F-measures of 85% for entity extraction and 81% for relation extraction.
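
    The dictionary-plus-rules combination can be caricatured in a few lines; this is a toy illustration only (PKDE4J itself is a Java system built on Stanford CoreNLP), and the dictionary entries, trigger words, and example sentence are assumptions.

        import re
        from itertools import combinations

        ENTITY_DICT = {"aspirin": "Drug", "inflammation": "Disease", "tp53": "Gene"}
        RELATION_RULES = [(re.compile(r"\b(reduce|inhibit|treat)\w*\b", re.I), "TREATS")]

        def extract(sentence):
            """Dictionary-based entity extraction, then rule-based relation extraction."""
            tokens = re.findall(r"\w+", sentence)
            entities = [(t, ENTITY_DICT[t.lower()]) for t in tokens if t.lower() in ENTITY_DICT]
            relations = [(e1, rel, e2)
                         for (e1, _), (e2, _) in combinations(entities, 2)
                         for pattern, rel in RELATION_RULES if pattern.search(sentence)]
            return entities, relations

        print(extract("Aspirin reduces inflammation."))
        # ([('Aspirin', 'Drug'), ('inflammation', 'Disease')],
        #  [('Aspirin', 'TREATS', 'inflammation')])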

  17. Remote Sensing Information Sciences Research Group, Santa Barbara Information Sciences Research Group, year 3

    Science.gov (United States)

    Estes, J. E.; Smith, T.; Star, J. L.

    1986-01-01

    Research continues to focus on improving the type, quantity, and quality of information which can be derived from remotely sensed data. The focus is on remote sensing and applications for the Earth Observing System (Eos) and Space Station, including associated polar and co-orbiting platforms. The remote sensing research activities are being expanded, integrated, and extended into the areas of global science, georeferenced information systems, machine-assisted information extraction from image data, and artificial intelligence. The accomplishments in these areas are examined.

  18. The integration of weighted gene association networks based on information entropy.

    Science.gov (United States)

    Yang, Fan; Wu, Duzhi; Lin, Limei; Yang, Jian; Yang, Tinghong; Zhao, Jing

    2017-01-01

    Constructing genome-scale weighted gene association networks (WGAN) from multiple data sources is one of the research hot spots in systems biology. In this paper, we employ information entropy to describe the uncertainty degree of gene-gene links and propose a strategy for the data integration of weighted networks. We use this method to integrate four existing human weighted gene association networks and construct a much larger WGAN, which includes richer biological information while still keeping high functional relevance between linked gene pairs. The new WGAN shows satisfactory performance in disease gene prediction, which suggests the reliability of our integration strategy. Compared with existing integration methods, our method takes advantage of the inherent characteristics of the component networks and pays less attention to the biological background of the data. It can make full use of existing biological networks with low computational effort.
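
    One way to read "information entropy describes the uncertainty of a link" is to weight each source network by the certainty of its (normalized) link weight, as sketched below. The exact scheme in the paper may differ; the Bernoulli-entropy weighting and the sample numbers are assumptions.

        import numpy as np

        def bernoulli_entropy(p):
            """Shannon entropy (bits) of a link weight read as a Bernoulli probability."""
            p = np.clip(p, 1e-12, 1 - 1e-12)
            return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

        def integrate_link(weights):
            """Combine one gene-gene link's weights (scaled to [0, 1]) from several
            networks, letting less uncertain sources contribute more."""
            w = np.asarray(weights, dtype=float)
            certainty = 1.0 - bernoulli_entropy(w)   # 0 at w = 0.5, near 1 at w = 0 or 1
            if certainty.sum() == 0:
                return float(w.mean())
            return float((certainty * w).sum() / certainty.sum())

        # The maximally uncertain source (0.5) is effectively ignored.
        print(integrate_link([0.9, 0.5, 0.8]))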

  19. Effects of integrated designs of alarm and process information on diagnosis performance in digital nuclear power plants.

    Science.gov (United States)

    Wu, Xiaojun; She, Manrong; Li, Zhizhong; Song, Fei; Sang, Wei

    2017-12-01

    In the main control rooms of nuclear power plants (NPPs), operators frequently switch between alarm displays and system-information displays to incorporate information from different screens. In this study, we investigated two integrated designs of alarm and process information - integrating alarm information into process displays (denoted as Alarm2Process integration) and integrating process information into alarm displays (denoted as Process2Alarm integration). To analyse the effects of the two integration approaches and time pressure on the diagnosis performance, a laboratory experiment was conducted with ninety-six students. The results show that compared with the non-integrated case, Process2Alarm integration yields better diagnosis performance in terms of diagnosis accuracy, time required to generate correct hypothesis and completion time. In contrast, the Alarm2Process integration leads to higher levels of workload, with no improvement in diagnosis performance. The diagnosis performance of Process2Alarm integration was consistently better than that of Alarm2Process integration, regardless of the levels of time pressure. Practitioner Summary: To facilitate operator's synthesis of NPP information when performing diagnosis tasks, we proposed to integrate process information into alarm displays. The laboratory validation shows that the integration approach significantly improves the diagnosis performance for both low and high time-pressure levels.

  20. Using remote sensing to inform integrated coastal zone management

    CSIR Research Space (South Africa)

    Roberts, W

    2010-06-01

    Full Text Available Using remote sensing to inform integrated coastal zone management. GISSA Western Cape Regional Meeting, Wesley Roberts & Melanie Luck-Vogel, 2 June 2010, CSIR NRE Ecosystems Earth Observation Group. [Presentation slides; only the outline is recoverable: What is integrated coastal zone management?; a quick theory of change vector analysis (CVA), in which the per-pixel change magnitude over two band differences is M = sqrt((xa - xb)^2 + (ya - yb)^2), with direction quadrant 1 (+,+) indicating accretion and quadrant 3 (-,-) erosion; and CVA results and conclusions on change in an image time series...]
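
    The change vector analysis outlined in the slides reduces to two band differences per pixel; a minimal sketch follows (the band choice and any change threshold are application decisions not given in the record):

        import numpy as np

        def change_vector_analysis(b1_t1, b2_t1, b1_t2, b2_t2):
            """Per-pixel CVA for two bands at two dates: magnitude
            M = sqrt(dx^2 + dy^2) and direction angle; quadrant (+,+)
            suggests accretion and (-,-) erosion along the coastline."""
            dx = b1_t2 - b1_t1
            dy = b2_t2 - b2_t1
            magnitude = np.hypot(dx, dy)
            direction = np.degrees(np.arctan2(dy, dx)) % 360.0
            return magnitude, direction

        m, d = change_vector_analysis(np.array([0.2]), np.array([0.3]),
                                      np.array([0.5]), np.array([0.7]))
        print(m, d)  # magnitude 0.5, direction ~53.1 deg (quadrant 1: accretion)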

  1. Integrated Modeling Approach for the Development of Climate-Informed, Actionable Information

    Directory of Open Access Journals (Sweden)

    David R. Judi

    2018-06-01

    Full Text Available Flooding is a prevalent natural disaster with both short and long-term social, economic, and infrastructure impacts. Changes in intensity and frequency of precipitation (including rain, snow, and rain-on-snow events) create challenges for the planning and management of resilient infrastructure and communities. While there is general acknowledgment that new infrastructure design should account for future climate change, no clear methods or actionable information are available to community planners and designers to ensure resilient designs considering an uncertain climate future. This research demonstrates an approach for an integrated, multi-model, and multi-scale simulation to evaluate future flood impacts. This research used regional climate projections to drive high-resolution hydrology and flood models to evaluate social, economic, and infrastructure resilience for the Snohomish Watershed, WA, USA. Using the proposed integrated modeling approach, the peaks of precipitation and streamflows were found to shift from spring and summer to the earlier winter season. Moreover, clear non-stationarities in future flood risk were discovered under various climate scenarios. This research provides a clear approach for the incorporation of climate science in flood resilience analysis and also provides actionable information relative to the frequency and intensity of future precipitation events.

  2. Extract transformation loading from OLTP to OLAP data using pentaho data integration

    Science.gov (United States)

    Salaki, R. J.; Waworuntu, J.; Tangkawarow, I. R. H. T.

    2016-04-01

    The design of the data warehouse in this case is expected to solve the problem of evaluating learning results, as well as the relevance of the information received, to support decision-making by the leadership. Data warehouse design is very important and is intended to utilize the existing information resources. The GPA (Grade Point Average) data warehouse can be used for the evaluation, decision-making, and further planning processes of the PTIK study program. The diversity of data sources at PTIK makes the decision-making and evaluation process harder. Pentaho Data Integration is used to integrate the data at PTIK easily. The GPA data warehouse is designed with a multidimensional database modeling approach using dimension tables and fact tables.
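
    The extract-transform-load flow described here is built graphically in Pentaho Data Integration (Kettle); the sketch below mirrors the same three steps in plain Python, with sqlite3 standing in for the OLTP source and the OLAP target. The table and column names are invented for illustration.

        import sqlite3

        oltp = sqlite3.connect(":memory:")                 # transactional source
        oltp.executescript("""
            CREATE TABLE grades(student_id, course, semester, grade REAL);
            INSERT INTO grades VALUES (1,'DB','2015-1',3.5),(1,'NET','2015-1',3.0),
                                      (2,'DB','2015-1',2.5);
        """)

        olap = sqlite3.connect(":memory:")                 # analytical target
        olap.execute("CREATE TABLE fact_gpa(student_id, semester, gpa REAL)")

        # Extract from OLTP, Transform (aggregate grades to a GPA), Load the fact table.
        rows = oltp.execute("""SELECT student_id, semester, AVG(grade)
                               FROM grades GROUP BY student_id, semester""").fetchall()
        olap.executemany("INSERT INTO fact_gpa VALUES (?,?,?)", rows)

        print(olap.execute("SELECT * FROM fact_gpa").fetchall())
        # [(1, '2015-1', 3.25), (2, '2015-1', 2.5)]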

  3. Evaluating Information System Integration approaches for fixed asset management framework in Tanzania

    Directory of Open Access Journals (Sweden)

    Theophil Assey

    2017-10-01

    Full Text Available Information systems are developed based on different requirements and different technologies. Integration of these systems is of vital importance, as they cannot work in isolation; they need to share and exchange data with other information systems. The information systems handle data of different types and formats, and finding a way to make them communicate is important, as they need to exchange data during transactions, communication, and other interactions. In Tanzanian Local Government Authorities (LGAs), fixed asset data are not centralized; each Local Government Authority stores its own data in isolation, yet accountability requires the provision of centralized storage for easy data access and easier data integration with other information systems in order to enhance fixed asset accountability. The study was carried out through a review of the literature on existing information system integration approaches in order to identify and propose the best approach to be used in fixed asset management systems in LGAs in Tanzania. The different approaches used for systems integration, such as Service Oriented Architecture (SOA), Common Object Request Broker (CORBA), Common Object Model (COM) and eXtensible Markup Language (XML), were evaluated under the factors considered at the LGAs. XML was preferred over SOA, CORBA and COM because of challenges in governance, data security, availability of expertise for support, maintenance, implementation cost, performance, compliance with changing government policies, and service reliability. The proposed approach integrates data for all the Local Government Authorities at a centralized location, and middleware transforms the centralized data into XML so it can easily be used by other information systems.
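
    The middleware step (centralized records transformed into XML for consumption by other systems) can be sketched with the Python standard library; the element names and the sample fixed-asset records below are hypothetical.

        import xml.etree.ElementTree as ET

        # Hypothetical centralized fixed-asset records gathered from several LGAs.
        assets = [
            {"lga": "Dodoma", "asset_id": "A-001", "type": "vehicle", "value": "15000000"},
            {"lga": "Mwanza", "asset_id": "A-117", "type": "building", "value": "480000000"},
        ]

        root = ET.Element("FixedAssets")
        for rec in assets:
            node = ET.SubElement(root, "Asset", lga=rec["lga"], id=rec["asset_id"])
            ET.SubElement(node, "Type").text = rec["type"]
            ET.SubElement(node, "Value").text = rec["value"]   # currency assumed (e.g. TZS)

        print(ET.tostring(root, encoding="unicode"))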

  4. Information and image integration: project spectrum

    Science.gov (United States)

    Blaine, G. James; Jost, R. Gilbert; Martin, Lori; Weiss, David A.; Lehmann, Ron; Fritz, Kevin

    1998-07-01

    The BJC Health System (BJC) and the Washington University School of Medicine (WUSM) formed a technology alliance with industry collaborators to develop and implement an integrated, advanced clinical information system. The industry collaborators include IBM, Kodak, SBC and Motorola. The activity, called Project Spectrum, provides an integrated clinical repository for the multiple hospital facilities of the BJC. The BJC System consists of 12 acute care hospitals serving over one million patients in Missouri and Illinois. An interface engine manages transactions from each of the hospital information systems, lab systems and radiology information systems. Data is normalized to provide a consistent view for the primary care physician. Access to the clinical repository is supported by web-based server/browser technology which delivers patient data to the physician's desktop. An HL7 based messaging system coordinates the acquisition and management of radiological image data and sends image keys to the clinical data repository. Access to the clinical chart browser currently provides radiology reports, laboratory data, vital signs and transcribed medical reports. A chart metaphor provides tabs for the selection of the clinical record for review. Activation of the radiology tab facilitates a standardized view of radiology reports and provides an icon used to initiate retrieval of available radiology images. The selection of the image icon spawns an image browser plug-in and utilizes the image key from the clinical repository to access the image server for the requested image data. The Spectrum system is collecting clinical data from five hospital systems and imaging data from two hospitals. Domain specific radiology imaging systems support the acquisition and primary interpretation of radiology exams. The spectrum clinical workstations are deployed to over 200 sites utilizing local area networks and ISDN connectivity.

  5. Integrating Records Management (RM) and Information Technology (IT)

    Energy Technology Data Exchange (ETDEWEB)

    NUSBAUM,ANNA W.; CUSIMANO,LINDA J.

    2000-03-02

    Records Managers are continually exploring ways to integrate their services with those offered by Information Technology-related professions, to capitalize on the advantages of providing customers a total solution for managing their records and information. In this day and age, where technology abounds, records managers often fear that this integration will result in a loss of identity and of focus on their own mission - a fear that records management may become subordinated to the fast-paced technology fields. They need to remember that there is strength in numbers, and that it benefits RM, IT, and the customer when they can bring together the unique offerings each possesses to reach synergy for the benefit of the whole corporation. Records Managers need to continually strive to move 'outside the records management box', network, expand their knowledge, and influence the IT disciplines to incorporate the concept of 'management' into their customer solutions.

  6. Knowledge-Intensive Gathering and Integration of Statistical Information on European Fisheries

    NARCIS (Netherlands)

    Klinkert, M.; Treur, J.; Verwaart, T.; Loganantharaj, R.; Palm, G.; Ali, M.

    2000-01-01

    Gathering, maintenance, integration and presentation of statistics are major activities of the Dutch Agricultural Economics Research Institute LEI. In this paper we explore how knowledge and agent technology can be exploited to support the information gathering and integration process. In

  7. Automated concept-level information extraction to reduce the need for custom software and rules development.

    Science.gov (United States)

    D'Avolio, Leonard W; Nguyen, Thien M; Goryachev, Sergey; Fiore, Louis D

    2011-01-01

    Despite at least 40 years of promising empirical performance, very few clinical natural language processing (NLP) or information extraction systems currently contribute to medical science or care. The authors address this gap by reducing the need for custom software and rules development with a graphical user interface-driven, highly generalizable approach to concept-level retrieval. A 'learn by example' approach combines features derived from open-source NLP pipelines with open-source machine learning classifiers to automatically and iteratively evaluate top-performing configurations. The Fourth i2b2/VA Shared Task Challenge's concept extraction task provided the data sets and metrics used to evaluate performance. Top F-measure scores for each of the tasks were medical problems (0.83), treatments (0.82), and tests (0.83). Recall lagged precision in all experiments, while precision was near or above 0.90 in all tasks. With no customization for the tasks and less than 5 min of end-user time to configure and launch each experiment, the average F-measure was 0.83, one point behind the mean F-measure of the 22 entrants in the competition. Strong precision scores indicate the potential of applying the approach to more specific clinical information extraction tasks. There was not one best configuration, supporting an iterative approach to model creation. Acceptable levels of performance can be achieved using fully automated and generalizable approaches to concept-level information extraction. The described implementation and related documentation are available for download.
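
    A rough sketch of the 'learn by example' idea, not the authors' implementation: token-level features (stand-ins for the open-source NLP pipeline output) feed an off-the-shelf classifier that marks tokens belonging to a concept such as a medical problem. Feature names and training examples are illustrative:

        # Token-level concept tagging with generic features and a generic classifier.
        from sklearn.feature_extraction import DictVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        def token_features(tokens, i):
            # Simple lexical/contextual features per token.
            return {"word": tokens[i].lower(), "is_title": tokens[i].istitle(),
                    "prev": tokens[i - 1].lower() if i > 0 else "<s>",
                    "next": tokens[i + 1].lower() if i < len(tokens) - 1 else "</s>"}

        # Hand-labeled examples: 1 = token is part of a 'medical problem' concept.
        train = [("Patient denies chest pain today".split(), [0, 0, 1, 1, 0]),
                 ("History of diabetes mellitus".split(), [0, 0, 1, 1])]

        X = [token_features(toks, i) for toks, labels in train for i in range(len(toks))]
        y = [label for _, labels in train for label in labels]

        model = make_pipeline(DictVectorizer(), LogisticRegression())
        model.fit(X, y)

        test = "Patient reports chest pain".split()
        print(model.predict([token_features(test, i) for i in range(len(test))]))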

  8. Extraction of quantifiable information from complex systems

    CERN Document Server

    Dahmen, Wolfgang; Griebel, Michael; Hackbusch, Wolfgang; Ritter, Klaus; Schneider, Reinhold; Schwab, Christoph; Yserentant, Harry

    2014-01-01

    In April 2007, the Deutsche Forschungsgemeinschaft (DFG) approved the Priority Program 1324 “Mathematical Methods for Extracting Quantifiable Information from Complex Systems.” This volume presents a comprehensive overview of the most important results obtained over the course of the program. Mathematical models of complex systems provide the foundation for further technological developments in science, engineering and computational finance. Motivated by the trend toward steadily increasing computer power, ever more realistic models have been developed in recent years. These models have also become increasingly complex, and their numerical treatment poses serious challenges. Recent developments in mathematics suggest that, in the long run, much more powerful numerical solution strategies could be derived if the interconnections between the different fields of research were systematically exploited at a conceptual level. Accordingly, a deeper understanding of the mathematical foundations as w...

  9. The Extraction of Indoor Building Information from BIM to OGC IndoorGML

    Directory of Open Access Journals (Sweden)

    T.-A. Teo

    2017-07-01

    Full Text Available Indoor Spatial Data Infrastructure (indoor-SDI) is an important SDI for geospatial analysis and location-based services. A Building Information Model (BIM) has a high degree of detail in the geometric and semantic information of a building. This study proposed direct conversion schemes to extract indoor building information from BIM into OGC IndoorGML. The major steps of the research include (1) topological conversion from the building model into an indoor network model; and (2) generation of IndoorGML. The topological conversion is mainly a process of generating and mapping nodes and edges from IFC to IndoorGML. A node represents every space (e.g. IfcSpace) and object (e.g. IfcDoor) in the building, while an edge shows the relationships between nodes. According to the definition of IndoorGML, the topological model in the dual space is also represented as a set of nodes and edges. These definitions in IndoorGML are the same as in the indoor network. Therefore, we can extract the necessary data from the indoor network and easily convert them into IndoorGML based on the IndoorGML schema. The experiment utilized a real BIM model to examine the proposed method. The experimental results indicated that the 3D indoor model (i.e. the IndoorGML model) can be automatically derived from the IFC model by the proposed procedure. In addition, the geometry and attributes of building elements are completely and correctly converted from BIM to indoor-SDI.
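
    A small sketch of the topological conversion step, under our own assumptions (the space and door records are hypothetical stand-ins for parsed IFC entities): spaces and doors become nodes, and door-space adjacency becomes edges, mirroring IndoorGML's dual-space node/edge model:

        # Build the indoor network graph that would be serialized as IndoorGML.
        import networkx as nx

        spaces = ["Room101", "Room102", "Corridor1"]
        doors = {"Door1": ("Room101", "Corridor1"), "Door2": ("Room102", "Corridor1")}

        g = nx.Graph()
        g.add_nodes_from(spaces, kind="IfcSpace")
        for door, (a, b) in doors.items():
            g.add_node(door, kind="IfcDoor")
            g.add_edge(a, door)  # edges record which spaces each door connects
            g.add_edge(door, b)

        # Each node/edge maps to an IndoorGML state/transition in the dual space.
        print(nx.shortest_path(g, "Room101", "Room102"))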

  10. Advanced Recovery and Integrated Extraction System (ARIES) program plan. Rev. 1

    International Nuclear Information System (INIS)

    Nelson, T.O.; Massey, P.W.; Cremers, T.L.

    1996-01-01

    The Advanced Recovery and Integrated Extraction System (ARIES) demonstration combines various technologies, some of which were/are being developed under previous or other Department of Energy (DOE) funded programs. ARIES is an overall processing system for the dismantlement of nuclear weapon primaries. The program will demonstrate dismantlement of nuclear weapons and retrieval of the plutonium into a form that is compatible with long-term storage and that is inspectable in an unclassified form appropriate for the application of traditional international safeguards. The success of the ARIES demonstration would lead to the development of transportable modular or other facility-type systems for weapons dismantlement to be used at other DOE sites as well as in other countries.

  11. 45 CFR 61.14 - Confidentiality of Healthcare Integrity and Protection Data Bank information.

    Science.gov (United States)

    2010-10-01

    45 Public Welfare, Volume 1 (2010-10-01): Confidentiality of Healthcare Integrity and Protection Data Bank information. Section 61.14, Public Welfare, Department of Health and Human Services, General Administration, Healthcare Integrity and Protection Data Bank for Final Adverse Information on...

  12. Three-tiered integration of PACS and HIS toward next generation total hospital information system.

    Science.gov (United States)

    Kim, J H; Lee, D H; Choi, J W; Cho, H I; Kang, H S; Yeon, K M; Han, M C

    1998-01-01

    The Seoul National University Hospital (SNUH) started a project to innovate its hospital information facilities. This project includes the installation of a high-speed hospital network and the development of a new HIS, OCS (order communication system), RIS, and PACS. The project aims at implementing the first total hospital information system by seamlessly integrating these systems. To achieve this goal, we took a three-tiered systems integration approach: network-level, database-level, and workstation-level integration. There are three loops of networks in SNUH: a proprietary star network for the host-computer-based HIS, an Ethernet-based hospital LAN for OCS and RIS, and an ATM-based network for PACS. They are linked together at the backbone level to allow high-speed communication between these systems. We have developed special communication modules for each system that allow data interchange between different databases and computer platforms. We have also developed an integrated workstation in which both the OCS and PACS application programs run on a single computer in an integrated manner, allowing clinical users to access and display radiological images as well as textual clinical information within a single user environment. A study is in progress toward a total hospital information system in SNUH that seamlessly integrates the main hospital information resources such as HIS, OCS, and PACS. With the three-tiered systems integration approach, we could successfully integrate the systems from the network level to the user application level.

  13. ONTOGRABBING: Extracting Information from Texts Using Generative Ontologies

    DEFF Research Database (Denmark)

    Nilsson, Jørgen Fischer; Szymczak, Bartlomiej Antoni; Jensen, P.A.

    2009-01-01

    We describe principles for extracting information from texts using a so-called generative ontology in combination with syntactic analysis. Generative ontologies are introduced as semantic domains for natural language phrases. Generative ontologies extend ordinary finite ontologies with rules for producing recursively shaped terms representing the ontological content (ontological semantics) of NL noun phrases and other phrases. We focus here on achieving a robust, often only partial, ontology-driven parsing of and ascription of semantics to a sentence in the text corpus. The aim of the ontological analysis is primarily to identify paraphrases, thereby achieving a search functionality beyond mere keyword search with synsets. We further envisage use of the generative ontology as a phrase-based rather than word-based browser into text corpora.

  14. HL7 and DICOM based integration of radiology departments with healthcare enterprise information systems.

    Science.gov (United States)

    Blazona, Bojan; Koncar, Miroslav

    2007-12-01

    Integration based on open standards, in order to achieve communication and information interoperability, is one of the key aspects of modern health care information systems. However, this requirement represents one of the major challenges for Information and Communication Technology (ICT) solutions, as systems today use diverse technologies, proprietary protocols, and communication standards which are often not interoperable. Among the main producers of clinical information in healthcare settings are Radiology Information Systems (RIS), which communicate using the widely adopted DICOM (Digital Imaging and COmmunications in Medicine) standard but in very few cases can efficiently integrate information of interest with other systems. In this context we identified the HL7 standard as the world's leading medical ICT standard, envisioned to provide the umbrella for medical data semantic interoperability, which amongst other things represents the cornerstone for Croatia's National Integrated Healthcare Information System (IHCIS). The aim was to explore the ability to integrate and exchange RIS-originated data with Hospital Information Systems based on HL7's CDA (Clinical Document Architecture) standard. We explored the ability of the HL7 CDA specifications and methodology to address the need to integrate RIS data into HL7-based healthcare information systems. We introduced the use of the WADO service for interconnection to the IHCIS and, finally, CDA rendering in widely used Internet browsers. The outcome of our pilot work confirms our original assumption that the HL7 standard is able to adopt radiology data into integrated healthcare systems. Uniform DICOM-to-CDA translation scripts and business processes within the IHCIS are desirable and cost-effective with regard to the use of supporting IHCIS services aligned to SOA.

  15. The Effects of Inquiry-Based Integrated Information Literacy Instruction: Four-Year Trends

    Directory of Open Access Journals (Sweden)

    Lin Ching Chen

    2014-07-01

    Full Text Available The purpose of this study was to examine the effects of four years of integrated information literacy instruction, via a framework of inquiry-based learning, on elementary students' memory and comprehension. The moderating factor of students' academic achievement was another focus of this study. The subjects were 72 students who have participated in this study since they entered an elementary school in Chiayi district. This elementary school adopted the integrated information literacy instruction, designed by the researchers and elementary school teachers, and integrated it into various subject matters via a framework of inquiry-based learning, such as the Super 3 and Big6 models. A series of inquiry-based integrated information literacy instruction has been implemented since the second semester of the subjects' first grade. A total of seven inquiry learning projects was implemented from grade one through grade four. Fourteen instruments were used as pretests and posttests to assess students' factual recall and conceptual understanding of subject contents in different projects. The results showed that inquiry-based integrated information literacy instruction could help students memorize facts and comprehend concepts of subject contents. Regardless of academic achievement, if students devoted effort to the inquiry processes, their memory and comprehension of subject contents improved effectively. However, students of low academic achievement might need more time to become familiar with the inquiry-based learning strategy.

  16. MedEx: a medication information extraction system for clinical narratives

    Science.gov (United States)

    Stenner, Shane P; Doan, Son; Johnson, Kevin B; Waitman, Lemuel R; Denny, Joshua C

    2010-01-01

    Medication information is one of the most important types of clinical data in electronic medical records. It is critical for healthcare safety and quality, as well as for clinical research that uses electronic medical record data. However, medication data are often recorded in clinical notes as free-text. As such, they are not accessible to other computerized applications that rely on coded data. We describe a new natural language processing system (MedEx), which extracts medication information from clinical notes. MedEx was initially developed using discharge summaries. An evaluation using a data set of 50 discharge summaries showed it performed well on identifying not only drug names (F-measure 93.2%), but also signature information, such as strength, route, and frequency, with F-measures of 94.5%, 93.9%, and 96.0% respectively. We then applied MedEx unchanged to outpatient clinic visit notes. It performed similarly with F-measures over 90% on a set of 25 clinic visit notes. PMID:20064797
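
    A toy sketch in the spirit of MedEx, not the system itself (MedEx uses a far richer lexicon and grammar than this single pattern): one regular expression pulls drug name, strength, route, and frequency from free text:

        # Rule-based extraction of a medication signature from a clinical sentence.
        import re

        SIG = re.compile(r"(?P<drug>[A-Za-z]+)\s+(?P<strength>\d+\s?mg)\s+"
                         r"(?P<route>po|iv|im)\s+(?P<freq>bid|tid|qd|q\d+h)",
                         re.IGNORECASE)

        note = "Discharged on metoprolol 25 mg po bid and lisinopril 10 mg po qd."
        for match in SIG.finditer(note):
            print(match.groupdict())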

  17. Methods to extract information on the atomic and molecular states from scientific abstracts

    International Nuclear Information System (INIS)

    Sasaki, Akira; Ueshima, Yutaka; Yamagiwa, Mitsuru; Murata, Masaki; Kanamaru, Toshiyuki; Shirado, Tamotsu; Isahara, Hitoshi

    2005-01-01

    We propose a new application of information technology to recognize and extract expressions of atomic and molecular states from electronic forms of scientific abstracts. The present results will help scientists to understand atomic states as well as the physics discussed in the articles. Combined with internet search engines, it will make it possible to collect not only atomic and molecular data but broader scientific information over a wide range of research fields. (author)

  18. Integrated photooxidative extractive deep desulfurization using metal doped TiO2 and eutectic based ionic liquid

    Science.gov (United States)

    Zaid, Hayyiratul Fatimah Mohd; Kait, Chong Fai; Mutalib, Mohamed Ibrahim Abdul

    2016-11-01

    A series of metal-doped TiO2 photocatalysts, namely Fe/TiO2, Cu/TiO2 and Cu-Fe/TiO2, were synthesized and characterized to be used in integrated photooxidative extractive deep desulfurization of model oil (dodecane) and diesel fuel. The order of photocatalytic activity was Cu-Fe/TiO2, followed by Cu/TiO2 and then Fe/TiO2. Cu-Fe/TiO2 was an effective photocatalyst for sulfur conversion at ambient atmospheric pressure. Hydrogen peroxide was used as the source of oxidant and a eutectic-based ionic liquid as the extractant. Sulfur conversion in the model oil reached 100%. Sulfur was removed from the model oil by two extractions, with a removal of 97.06% in the first run and 2.94% in the second run.

  19. Integrated and convenient procedure for protein extraction from formalin-fixed, paraffin-embedded tissues for LC-MS/MS analysis.

    Science.gov (United States)

    Lai, Xianyin; Schneider, Bryan P

    2014-11-01

    Because fresh-frozen tissue samples associated with long-term clinical data, and samples of rare diseases, are often unobtainable at the present time, formalin-fixed paraffin-embedded (FFPE) tissue samples are considered a highly valuable resource for researchers. However, protein extraction from FFPE tissues faces the challenges of deparaffinization and cross-link reversal. Current procedures for protein extraction from FFPE tissue require separate steps and toxic solvents, making protein extraction inconvenient. To overcome these limitations, an integrated method was developed using nontoxic solvents in four types of FFPE tissues. The average amounts of protein from three replicates of bladder, kidney, liver, and lung FFPE tissues were 442.6, 728.9, 736.4, and 694.7 μg, with CVs of 7.5, 5.8, 2.4, and 4.5%, respectively. Proteomic analysis showed that 348, 417, 607, and 304 unique proteins were identified and quantified, without specification of isoform, by at least two peptides from bladder, kidney, liver, and lung FFPE tissue samples, respectively. The analysis of individual protein CVs demonstrated that 97-99% of the proteins were quantified with a CV ≤ 30%, verifying the reproducibility of the integrated protein extraction method. In summary, the developed method is high-yield, reproducible, convenient, simple, low-cost, nonvolatile, nonflammable, and nontoxic. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
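
    For reference, the reported CVs follow the usual definition, CV (%) = SD / mean x 100, computed over the three replicates; a minimal example with hypothetical replicate yields:

        # Coefficient of variation across replicate protein yields (made-up values).
        import statistics

        yields_ug = [414.0, 442.6, 471.2]  # hypothetical bladder-tissue replicates
        mean = statistics.mean(yields_ug)
        cv = statistics.stdev(yields_ug) / mean * 100  # CV (%) = SD / mean * 100
        print(f"mean = {mean:.1f} ug, CV = {cv:.1f}%")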

  20. Efficacy of integrating information literacy education into a women's health course on information literacy for RN-BSN students.

    Science.gov (United States)

    Ku, Ya-Lie; Sheu, Sheila; Kuo, Shih-Ming

    2007-03-01

    Information literacy, essential to evidence-based nursing, can promote nurses' capability for life-long learning. Nursing education should strive to employ information literacy education in nursing curricula to improve information literacy abilities among nursing students. This study explored the effectiveness of information literacy education by comparing information literacy skills among a group of RN-BSN (Registered Nurse to Bachelor of Science in Nursing) students who received information literacy education with a group that did not. This quasi-experimental study was conducted during a women's health issues course taught between March and June 2004. Content was presented to the 32 RN-BSN students enrolled in this course, which also taught skills in searching and screening, integrating, analyzing, applying, and presenting information. At the beginning and end of the program, 75 RN-BSN students self-evaluated, on a 10-point Likert scale, their attained skills in searching and screening, integrating, analyzing, applying, and presenting information. Results identified no significant differences between the experimental (n = 32) and control groups (n = 43) in terms of age, marital status, job title, work unit, years of work experience, and information literacy skills as measured at the beginning of the semester. At the end of the semester during which the content was taught, the information literacy of the experimental group in all categories, with the exception of information presentation, was significantly improved as compared to that of the control group. Results were especially significant for the integrating, analyzing, and applying skill categories. It is hoped that in the future nursing students will apply enhanced information literacy to address and resolve patients' health problems in clinical settings.

  1. Integrating information for better environmental decisions.

    Energy Technology Data Exchange (ETDEWEB)

    MacDonell, M.; Morgan, K.; Newland, L.; Environmental Assessment; Texas Christian Univ.

    2002-01-01

    As more is learned about the complex nature and extent of environmental impacts from progressive human disturbance, scientists, policy analysts, decision makers, educators, and communicators are increasingly joining forces to develop strategies for preserving and protecting the environment. The Eco-Informa Foundation is an educational scientific organization dedicated to promoting the collaborative development and sharing of scientific information. The Foundation participated in a recent international conference on environmental informatics through a special symposium on integrating information for better environmental decisions. Presentations focused on four general themes: (1) remote sensing and data interpretation, including through new knowledge management tools; (2) risk assessment and communication, including for radioactively contaminated facilities, introduced biological hazards, and food safety; (3) community involvement in cleanup projects; and (4) environmental education. The general context for related issues, methods and applications, and results and recommendations from those discussions are highlighted here.

  2. Integration of Information Literacy into the Curriculum: Constructive Alignment from Theory into Practice

    Directory of Open Access Journals (Sweden)

    Claes Dahlqvist

    2016-12-01

    Full Text Available Librarian-teacher cooperation is essential for the integration of information literacy into course syllabi. Therefore, a common theoretical and methodological platform is needed. As librarians at Kristianstad University we have had the opportunity to develop such a platform when teaching information literacy in a basic course for teachers in higher education pedagogy. Information literacy is taught in context with academic writing, distance learning and teaching, and development of course syllabi. Constructive Alignment in Theory: We used constructive alignment in designing our part of the course. John Biggs' ideas tell us that assessment tasks (ATs) should be aligned to what is intended to be learned. Intended learning outcomes (ILOs) specify teaching/learning activities (TLAs) based on the content of learning. TLAs should be designed in ways that enable students to construct knowledge from their own experience. The ILOs for the course are to have arguments for the role of information literacy in higher education and ideas of implementing them in TLAs. The content of learning is for example the concept of information literacy, theoretical perspectives and constructive alignment for integration in course syllabi. TLAs are written pre-lecture reflections on the concept of information literacy, used as a starting point for the three-hour seminar. Learning reflections are written afterwards. The AT is to revise a syllabus (preferably using constructive alignment) for a course the teacher is responsible for, where information literacy must be integrated with the other parts and topics of the course. Constructive Alignment in Practice: Using constructive alignment has taught us that this model serves well as the foundation of the theoretical and methodological platform for librarian-teacher cooperation when integrating information literacy in course syllabi. It contains all important aspects of the integration of information literacy in course

  3. Analysis Methods for Extracting Knowledge from Large-Scale WiFi Monitoring to Inform Building Facility Planning

    DEFF Research Database (Denmark)

    Ruiz-Ruiz, Antonio; Blunck, Henrik; Prentow, Thor Siiger

    2014-01-01

    The optimization of logistics in large building complexes with many resources, such as hospitals, requires realistic facility management and planning. Current planning practices rely foremost on manual observations or coarse, unverified assumptions and therefore do not properly scale or provide realistic data to inform facility planning. In this paper, we propose analysis methods to extract knowledge from large sets of network-collected WiFi traces to better inform facility management and planning in large building complexes. The analysis methods build on a rich set of temporal and spatial features. Spatio-temporal visualization tools built on top of these methods enable planners to inspect and explore the extracted information to inform facility-planning activities. To evaluate the methods, we present results for a large hospital complex covering more than 10 hectares. The evaluation is based on Wi...
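
    One example of such an analysis method, sketched under our own assumptions (the trace records and area names are hypothetical): hourly occupancy per area, estimated as unique devices per access point per hour:

        # Derive room-level occupancy estimates from WiFi association records.
        import pandas as pd

        traces = pd.DataFrame({
            "ts": pd.to_datetime(["2014-03-01 08:05", "2014-03-01 08:40",
                                  "2014-03-01 09:10", "2014-03-01 09:20"]),
            "device": ["d1", "d2", "d1", "d3"],
            "ap": ["ward-A", "ward-A", "ward-B", "ward-A"],
        })

        # Unique devices per access point per hour approximate area occupancy.
        occupancy = (traces.groupby(["ap", traces["ts"].dt.floor("h")])["device"]
                           .nunique().rename("devices"))
        print(occupancy)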

  4. Social Information Is Integrated into Value and Confidence Judgments According to Its Reliability.

    Science.gov (United States)

    De Martino, Benedetto; Bobadilla-Suarez, Sebastian; Nouguchi, Takao; Sharot, Tali; Love, Bradley C

    2017-06-21

    How much we like something, whether it be a bottle of wine or a new film, is affected by the opinions of others. However, the social information that we receive can be contradictory and vary in its reliability. Here, we tested whether the brain incorporates these statistics when judging value and confidence. Participants provided value judgments about consumer goods in the presence of online reviews. We found that participants updated their initial value and confidence judgments in a Bayesian fashion, taking into account both the uncertainty of their initial beliefs and the reliability of the social information. Activity in dorsomedial prefrontal cortex tracked the degree of belief update. Analogous to how lower-level perceptual information is integrated, we found that the human brain integrates social information according to its reliability when judging value and confidence. SIGNIFICANCE STATEMENT The field of perceptual decision making has shown that the sensory system integrates different sources of information according to their respective reliability, as predicted by a Bayesian inference scheme. In this work, we hypothesized that a similar coding scheme is implemented by the human brain to process social signals and guide complex, value-based decisions. We provide experimental evidence that the human prefrontal cortex's activity is consistent with a Bayesian computation that integrates social information that differs in reliability and that this integration affects the neural representation of value and confidence. Copyright © 2017 De Martino et al.
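
    The reliability-weighted integration described here corresponds to the standard precision-weighted combination of Gaussian information sources; a minimal sketch (ours, with made-up numbers):

        # Combine a prior value belief with social evidence weighted by reliability.
        def bayes_update(prior_mu, prior_var, social_mu, social_var):
            # Precision (1/variance) weights each source of information.
            w_prior, w_social = 1 / prior_var, 1 / social_var
            post_var = 1 / (w_prior + w_social)
            post_mu = post_var * (w_prior * prior_mu + w_social * social_mu)
            return post_mu, post_var

        # Reliable reviews (low variance) pull the value estimate more strongly.
        print(bayes_update(prior_mu=5.0, prior_var=4.0, social_mu=8.0, social_var=1.0))
        print(bayes_update(prior_mu=5.0, prior_var=4.0, social_mu=8.0, social_var=16.0))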

  5. Gstruct: a system for extracting schemas from GML documents

    Science.gov (United States)

    Chen, Hui; Zhu, Fubao; Guan, Jihong; Zhou, Shuigeng

    2008-10-01

    Geography Markup Language (GML) has become the de facto standard for geographic information representation on the internet. A GML schema provides a way to define the structure, content, and semantics of GML documents. It contains useful structural information about GML documents and plays an important role in storing, querying, and analyzing GML data. However, a GML schema is not mandatory, and it is common for a GML document to contain no schema. In this paper, we present Gstruct, a tool for GML schema extraction. Gstruct finds the features in the input GML documents, identifies geometry datatypes as well as simple datatypes, then integrates all these features and eliminates improper components to output the optimal schema. Experiments demonstrate that Gstruct is effective in extracting semantically meaningful schemas from GML documents.
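
    A simplified sketch of the feature-discovery step (ours, not Gstruct's algorithm): walk sample documents, record each element's children, and guess leaf datatypes from text content:

        # Infer a crude schema (element hierarchy plus leaf types) from sample XML.
        import xml.etree.ElementTree as ET
        from collections import defaultdict

        doc = """<City><Road><name>A1</name><lanes>4</lanes></Road>
                 <Road><name>B2</name><lanes>2</lanes></Road></City>"""

        children = defaultdict(set)
        leaf_type = {}

        def walk(elem):
            for child in elem:
                children[elem.tag].add(child.tag)
                walk(child)
            if len(elem) == 0:  # leaf: guess a simple datatype from its text
                text = (elem.text or "").strip()
                leaf_type[elem.tag] = "xs:integer" if text.isdigit() else "xs:string"

        walk(ET.fromstring(doc))
        print(dict(children))  # {'City': {'Road'}, 'Road': {'name', 'lanes'}}
        print(leaf_type)       # {'name': 'xs:string', 'lanes': 'xs:integer'}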

  6. Moral judgment as information processing: an integrative review.

    Science.gov (United States)

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment. PMID:26579022

  8. Research on a road centerline extraction algorithm for high-resolution remote sensing images

    Science.gov (United States)

    Zhang, Yushan; Xu, Tingfa

    2017-09-01

    Satellite remote sensing technology has become one of the most effective methods for land surface monitoring in recent years, due to advantages such as short revisit period, large coverage and rich information. Road extraction is accordingly an important application of high-resolution remote sensing images. An intelligent, automatic road extraction algorithm with high precision has great significance for transportation, road network updating and urban planning. Fuzzy c-means (FCM) clustering segmentation algorithms have been used for road extraction, but the traditional algorithms do not consider spatial information. An improved fuzzy c-means clustering algorithm combined with spatial information (SFCM) is proposed in this paper, which proves effective for noisy image segmentation. First, the image is segmented using the SFCM. Second, the segmentation result is processed by mathematical morphology to remove joint regions. Third, the road centerlines are extracted by morphological thinning and burr trimming. The average integrity of the centerline extraction algorithm is 97.98%, the average accuracy is 95.36% and the average quality is 93.59%. Experimental results show that the proposed method is effective for road centerline extraction.
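
    A compact sketch of the pipeline under our own assumptions: plain FCM whose memberships are smoothed over a local window (a simple stand-in for the paper's SFCM), followed by morphological thinning of the road class on a synthetic image:

        # Spatially smoothed fuzzy c-means segmentation, then centerline thinning.
        import numpy as np
        from scipy.ndimage import uniform_filter
        from skimage.morphology import thin

        rng = np.random.default_rng(0)
        img = rng.normal(0.0, 0.08, (60, 60))
        img[28:32, :] += 1.0  # a bright horizontal "road" band

        c, m = 2, 2.0
        centers = np.array([0.0, 1.0])
        for _ in range(20):
            d = np.abs(img[..., None] - centers) + 1e-9  # pixel-center distances
            u = 1.0 / (d ** 2 * np.sum(d ** -2, axis=-1, keepdims=True))  # memberships (m=2)
            u = np.stack([uniform_filter(u[..., k], size=5) for k in range(c)], axis=-1)
            u /= u.sum(axis=-1, keepdims=True)  # spatial smoothing step
            centers = (u ** m * img[..., None]).sum((0, 1)) / (u ** m).sum((0, 1))

        road = u[..., np.argmax(centers)] > 0.5  # the brighter cluster is the road
        centerline = thin(road)                  # morphological thinning
        print(road.sum(), centerline.sum())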

  9. An Integrated Information Retrieval Support System for Campus Network

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper presents a new integrated information retrieval support system (IIRSS) which can help Web search engines retrieve cross-lingual information from heterogeneous resources stored in multiple databases on an Intranet. The IIRSS, with a three-layer architecture, can cooperate with other application servers running on the Intranet. By using intelligent agents to collect information and create indexes on the fly, an access control strategy to confine a user to browsing the documents accessible to him/her through a single portal, and a new cross-lingual translation tool to help the search engine retrieve documents, the new system provides controllable information access with different authorizations, personalized services, and real-time information retrieval.

  10. Integrated Engineering Information Technology, FY93 accommplishments

    Energy Technology Data Exchange (ETDEWEB)

    Harris, R.N.; Miller, D.K.; Neugebauer, G.L.; Orona, J.R.; Partridge, R.A.; Herman, J.D.

    1994-03-01

    The Integrated Engineering Information Technology (IEIT) project is providing a comprehensive, easy-to-use computer network solution for communicating with coworkers both inside and outside Sandia National Laboratories. IEIT capabilities include computer networking, electronic mail, mechanical design, and data management. These network-based tools have one fundamental purpose: to help create a concurrent engineering environment that will enable Sandia organizations to excel in today's increasingly competitive business environment.

  11. Development of an Information Database for the Integrated Airline Management System (IAMS)

    Directory of Open Access Journals (Sweden)

    Bogdane Ruta

    2017-08-01

    Full Text Available Under present conditions, the activity of any enterprise is represented as a combination of operational processes. Each of them corresponds to a relevant airline management system. By combining two or more management systems, it is possible to obtain an integrated management system. For the effective functioning of the integrated management system, an appropriate information system should be developed. This article proposes a model of such an information system.

  12. Domain-independent information extraction in unstructured text

    Energy Technology Data Exchange (ETDEWEB)

    Irwin, N.H. [Sandia National Labs., Albuquerque, NM (United States). Software Surety Dept.

    1996-09-01

    Extracting information from unstructured text has become an important research area in recent years due to the large amount of text now electronically available. This status report describes the findings and work done during the second year of a two-year Laboratory Directed Research and Development project. Building on the first year's work of identifying important entities, this report details techniques used to group words into semantic categories and to output templates containing selective document content. Using word profiles and category clustering derived during a training run, the time-consuming knowledge-building task can be avoided. Though the output still lacks completeness when compared to systems with domain-specific knowledge bases, the results do look promising. The two approaches are compatible and could complement each other within the same system. Domain-independent approaches retain appeal, as a system that adapts and learns will soon outpace a system with any amount of a priori knowledge.
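
    A toy sketch of grouping words into semantic categories from word profiles (ours; the report's actual profiles and clustering method are not reproduced here): each word is represented by the documents it occurs in, then clustered:

        # Cluster words by their document co-occurrence profiles.
        import numpy as np
        from sklearn.cluster import KMeans

        docs = ["acid leak in tank", "acid spill in tank",
                "budget meeting on friday", "quarterly budget review meeting"]
        vocab = sorted({w for d in docs for w in d.split()})
        idx = {w: i for i, w in enumerate(vocab)}

        # Word profile: a word-document incidence matrix.
        profiles = np.zeros((len(vocab), len(docs)))
        for j, d in enumerate(docs):
            for w in d.split():
                profiles[idx[w], j] = 1

        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(profiles)
        for k in range(2):
            print(k, [w for w, lab in zip(vocab, labels) if lab == k])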

  13. ACCOUNTING INFORMATION INTEGRATION TROUGH AN ENTERPRISE PORTAL

    Directory of Open Access Journals (Sweden)

    Gianina RIZESCU

    2014-06-01

    Full Text Available If companies lack integrated enterprise software applications, or simply do not use them on a large scale, accounting departments face many difficulties, concerning both inflexibility in achieving good results and a limited ability to communicate those results. Thus, accounting departments are often limited to generating the predefined reports provided by a software application; the most they can do is export these reports into Microsoft Excel. Another cause of delays in obtaining and publishing accounting information is the lack of data from other departments and their corresponding software applications. That is why, in many enterprises, accounting data becomes irrelevant for its users. The main goal of this article is to show how accounting can benefit from an integrated software solution, namely an enterprise portal.

  14. An integrated biohydrogen refinery: synergy of photofermentation, extractive fermentation and hydrothermal hydrolysis of food wastes.

    Science.gov (United States)

    Redwood, Mark D; Orozco, Rafael L; Majewski, Artur J; Macaskie, Lynne E

    2012-09-01

    An Integrated Biohydrogen Refinery (IBHR) and experimental net energy analysis are reported. The IBHR converts biomass to electricity using hydrothermal hydrolysis, extractive biohydrogen fermentation and photobiological hydrogen fermentation for electricity generation in a fuel cell. An extractive fermentation, developed previously, is applied to waste-derived substrates following hydrothermal pre-treatment, achieving 83-99% biowaste destruction. The selective separation of organic acids from waste-fed fermentations provided suitable substrate for photofermentative hydrogen production, which enhanced the gross energy generation up to 11-fold. Therefore, electrodialysis provides the key link in an IBHR for 'waste to energy'. The IBHR compares favourably to 'renewables' (photovoltaics, on-shore wind, crop-derived biofuels) and also emerging biotechnological options (microbial electrolysis) and anaerobic digestion. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Behavior Selection of Mobile Robot Based on Integration of Multimodal Information

    Science.gov (United States)

    Chen, Bin; Kaneko, Masahide

    Recently, biologically inspired robots have been developed to acquire the capacity for directing visual attention to salient stimuli generated from the audiovisual environment. To realize this behavior, a general method is to calculate saliency maps that represent how much the external information attracts the robot's visual attention, where both the audiovisual information and the robot's motion status should be involved. In this paper, we present a visual attention model in which three modalities - audio information, visual information and the robot's motor status - are considered, whereas previous research has not considered all of them. First, we introduce a 2-D density map, on which the value denotes how much the robot pays attention to each spatial location. We model the attention density using a Bayesian network in which the robot's motion statuses are involved. Second, the information from both the audio and visual modalities is integrated with the attention density map in integrate-and-fire neurons. The robot can direct its attention to the locations where the integrate-and-fire neurons are fired. Finally, the visual attention model is applied to make the robot select visual information from the environment and react to the selected content. Experimental results show that it is possible for robots to acquire the visual information related to their behaviors by using the attention model considering motion statuses. The robot can select its behaviors to adapt to the dynamic environment as well as switch to another task according to the recognition results of visual attention.
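
    A minimal sketch of the fusion step under our own assumptions (saliency values, leak, and threshold are made up): audio and visual saliency charge leaky integrate-and-fire neurons over a one-dimensional location map, and attention goes where a neuron fires:

        # Leaky integrate-and-fire fusion of audio and visual saliency.
        import numpy as np

        n, leak, threshold = 5, 0.9, 1.0
        v = np.zeros(n)  # membrane potential per location
        audio = np.array([0.0, 0.1, 0.4, 0.1, 0.0])
        visual = np.array([0.1, 0.1, 0.3, 0.2, 0.0])

        for t in range(10):
            v = leak * v + audio + visual  # integrate multimodal saliency
            fired = v >= threshold
            if fired.any():
                print(f"t={t}: attend location {int(np.argmax(v))}")
                v[fired] = 0.0  # reset neurons that fired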

  16. IHE, Solution for integration of information systems and PACS

    Directory of Open Access Journals (Sweden)

    Milad Janghorban Lariche

    2014-10-01

    Full Text Available PACS is used as a way to store images; it matches well the workflow in the radiology department and can spread to other parts of the hospital. Integration with other PACS and other hospital systems, like the radiology information system (RIS), hospital information system (HIS), and electronic patient records, has largely been done, but there are still problems. PACS also provides good conditions for setting up teleradiology. The next step for PACS is for hospitals and health care organizations to share images in an integrated electronic patient record. Among the different ways of sharing images between hospitals, the IHE (Integrating the Healthcare Enterprise) standard defines the cross-enterprise document sharing profile (XDS), which allows sharing of images from various hospitals even if their PACS are of different brands and from different vendors. Application of XDS is useful for sharing images between health care organizations without duplicating them in a central archive. Images need to be indexed in a central registry. In the XDS profile, IHE defines a mechanism for publishing and indexing images in the central document registry. IHE also defines mechanisms to be used by each hospital to retrieve images, regardless of which hospital PACS stores them.

  17. Impact of informal institutions on the development integration processes

    Directory of Open Access Journals (Sweden)

    Sidorova Alexandra, M.

    2015-06-01

    Full Text Available The paper deals with the impact of informal institutions on the choice of direction of integration processes and on the development of integration processes in the countries of the Customs Union and Ukraine. The degree of scientific development of the phenomenon in different economic schools is determined in this article. Economic mentality is a basic informal institution which determines the degree of effectiveness of integration processes. The paper examines the nature and characteristics of economic mentality and its effects on the economic activities of people. The ethnometrical method allows one to quantify economic mentality, which enables deeper understanding and analysis of the formation and functioning of the political and economic system, especially business and management, and of establishing contacts with other cultures. The modern Belarusian economic mentality was measured based on Hofstede's international methodology and compared with the economic mentality of Russia, Ukraine and Kazakhstan. With the help of cluster analysis, the congruence of the economic mentality of the Customs Union countries and Ukraine was determined. The economic mentality of these countries was also compared with that of other countries in order to identify the main types of economic culture.

  18. Methodology of the design of an integrated telecommunications and computer network in a control information system for artillery battalion fire support

    Directory of Open Access Journals (Sweden)

    Slobodan M. Miletić

    2012-04-01

    Full Text Available A Command Information System (CIS), in a broader sense, can be defined as a set of hardware and software solutions by which one achieves real-time integration of organizational structures, doctrine, technical and technological systems and facilities, and information flows and processes, for efficient and rational decision-making and functioning. The time distribution and quality of information directly affect the decision-making process and the criteria for evaluating the effectiveness of the system, in which the most important role is played by an integrated telecommunications and computer network (ITCN), dimensioned to the spatial distribution of tactical combat units and connecting all the elements into a communications unit. The aim is to establish a design methodology for the ITCN: to conduct analysis and extract all the elements necessary for modeling, map them to elements of the network infrastructure, and then analyze them from the perspective of telecommunications standards and the parameters of the layers of the OSI network model. A relevant way to verify the designed ITCN model is to develop a simulation model with which adequate results can be obtained. Conclusions on compliance with tactical combat and tactical communication requirements are drawn on the basis of these results.

  19. An Integrated Hydrologic Model and Remote Sensing Synthesis Approach to Study Groundwater Extraction During a Historic Drought in the California Central Valley

    Science.gov (United States)

    Thatch, L. M.; Maxwell, R. M.; Gilbert, J. M.

    2017-12-01

    Over the past century, groundwater levels in California's San Joaquin Valley have dropped more than 30 meters in some areas due to excessive groundwater extraction to irrigate agricultural lands and feed a growing population. Between 2012 and 2016 California experienced the worst drought in its recorded history, further exacerbating this groundwater depletion. Due to the lack of groundwater regulation, the exact quantities of extracted groundwater in California are unknown and hard to quantify. We use a synthesis of integrated hydrologic model simulations and remote sensing products to quantify the impact of drought and groundwater pumping on the Central Valley water tables. The ParFlow-CLM model was used to evaluate groundwater depletion in the San Joaquin River basin under multiple groundwater extraction scenarios simulated from pre-drought through recent drought years. Extraction scenarios included pre-development conditions, with no groundwater pumping; historical conditions, based on decreasing groundwater level measurements; and estimated groundwater extraction rates, calculated from the deficit between the predicted crop water demand (based on county land use surveys) and available surface water supplies. Results were compared to NASA's Gravity Recovery and Climate Experiment (GRACE) data products to constrain water table decline from groundwater extraction during severe drought. This approach untangles the various factors leading to groundwater depletion within the San Joaquin Valley, both during drought and in years of normal recharge, to help evaluate which areas are most susceptible to groundwater overdraft, as well as the spatially and temporally variable sustainable yield. Recent efforts to improve water management and ensure reliable water supplies are highlighted by California's Sustainable Groundwater Management Act (SGMA), which mandates Groundwater Sustainability Agencies to determine the maximum quantity of groundwater that can be withdrawn through
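
    A schematic sketch of the deficit-based extraction estimate (ours, with hypothetical numbers): groundwater pumping is taken as crop water demand minus available surface water supply:

        # Estimate monthly groundwater extraction as the unmet crop water demand.
        import numpy as np

        months = ["May", "Jun", "Jul", "Aug"]
        crop_demand = np.array([1.8, 2.4, 2.9, 2.6])     # hypothetical, km^3/month
        surface_supply = np.array([1.5, 1.2, 0.8, 0.7])  # hypothetical, km^3/month

        pumping = np.clip(crop_demand - surface_supply, 0, None)  # deficit met by wells
        for mon, p in zip(months, pumping):
            print(f"{mon}: estimated extraction {p:.1f} km^3")
        print(f"seasonal total: {pumping.sum():.1f} km^3")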

  20. Integrating Web 2.0-Based Informal Learning with Workplace Training

    Science.gov (United States)

    Zhao, Fang; Kemp, Linzi J.

    2012-01-01

    Informal learning takes place in the workplace through connection and collaboration mediated by Web 2.0 applications. However, little research has yet been published that explores informal learning and how to integrate it with workplace training. We aim to address this research gap by developing a conceptual Web 2.0-based workplace learning and…

  1. Optimization of the German integrated information and measurement system (IMIS)

    International Nuclear Information System (INIS)

    Wirth, E.; Weiss, W.

    2002-01-01

    The Chernobyl accident led to widespread contamination of the environment in most European countries. In Germany, as in all other countries, it took some time to evaluate the radiological situation - time which is extremely valuable in the early phases of an accident, when decisions on countermeasures like sheltering, iodine prophylaxis or evacuation have to be taken. For better emergency preparedness, the Integrated Information and Measurement System (IMIS) has been developed and established in Germany. In case of widespread contamination of the environment, the system will provide decision makers with all the information necessary to evaluate the radiological situation and to decide on countermeasures. Presently, this system is being upgraded through the adoption of the European decision support system RODOS and the improvement of national information exchange. For this purpose, the web-based information system ELAN has been developed. The national systems have to be integrated into the European and international communication systems. In this presentation the IMIS system is briefly described, and the new features and modules of the system are discussed in greater detail.

  2. Integration of radiology and hospital information systems (RIS, HIS) with PACS

    International Nuclear Information System (INIS)

    Mosser, H.; Urban, M.; Hruby, W.; Duerr, M.; Rueger, W.

    1992-01-01

    PACS development has now reached a stage where it can clearly be stated that the technology for storage, networking and display in a fully digital environment is available. This is reflected by an already large and rapidly increasing number of PACS installations in the USA, Western Europe and Japan. Such installations consist of a great variety of information systems, more or less interconnected, like PACS, HIS, RIS and other departmental systems, differing in both hardware and software. Various data - even if they concern only one person - are stored in different systems distributed across the hospital. The integration of all digital systems into a functional unit is determined by the radiologist's need for quick access to all relevant information, regardless of where it is stored. The interconnection and functional integration of all digital systems in the hospital determine the clinical benefits of PACS. This paper describes the radiologist's requirements concerning this integration and presents some realistic solutions, such as the Siemens ISI (Information System Interface) and a mobile viewing station for the wards (visitBox). (author). 9 refs., 4 figs

  3. Entropy in Postmerger and Acquisition Integration from an Information Technology Perspective

    Science.gov (United States)

    Williams, Gloria S.

    2012-01-01

    Mergers and acquisitions have historically experienced failure rates from 50% to more than 80%. Successful integration of information technology (IT) systems can be the difference between postmerger success or failure. The purpose of this phenomenological study was to explore the entropy phenomenon during postmerger IT integration. To that end, a…

  4. Systematically extracting metal- and solvent-related occupational information from free-text responses to lifetime occupational history questionnaires.

    Science.gov (United States)

    Friesen, Melissa C; Locke, Sarah J; Tornow, Carina; Chen, Yu-Cheng; Koh, Dong-Hee; Stewart, Patricia A; Purdue, Mark; Colt, Joanne S

    2014-06-01

    Lifetime occupational history (OH) questionnaires often use open-ended questions to capture detailed information about study participants' jobs. Exposure assessors use this information, along with responses to job- and industry-specific questionnaires, to assign exposure estimates on a job-by-job basis. An alternative approach is to use information from the OH responses and the job- and industry-specific questionnaires to develop programmable decision rules for assigning exposures. As a first step in this process, we developed a systematic approach to extract the free-text OH responses and convert them into standardized variables that represented exposure scenarios. Our study population comprised 2408 subjects, reporting 11991 jobs, from a case-control study of renal cell carcinoma. Each subject completed a lifetime OH questionnaire that included verbatim responses, for each job, to open-ended questions including job title, main tasks and activities (task), tools and equipment used (tools), and chemicals and materials handled (chemicals). Based on a review of the literature, we identified exposure scenarios (occupations, industries, tasks/tools/chemicals) expected to involve possible exposure to chlorinated solvents, trichloroethylene (TCE) in particular, lead, and cadmium. We then used a SAS macro to review the information reported by study participants to identify jobs associated with each exposure scenario; this was done using previously coded standardized occupation and industry classification codes, and a priori lists of associated key words and phrases related to possibly exposed tasks, tools, and chemicals. Exposure variables representing the occupation, industry, and task/tool/chemicals exposure scenarios were added to the work history records of the study respondents. Our identification of possibly TCE-exposed scenarios in the OH responses was compared to an expert's independently assigned probability ratings to evaluate whether we missed identifying
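
    A minimal sketch of the rule step under our own assumptions (the keyword list and job records are illustrative only): jobs whose free-text responses mention a priori keywords are flagged as a possibly TCE-exposed scenario:

        # Flag jobs whose occupational-history text matches exposure keywords.
        TCE_KEYWORDS = {"degreas", "trichloroethylene", "tce", "solvent"}

        jobs = [
            {"job": 1, "title": "machinist", "task": "degreasing metal parts",
             "chemicals": "solvents"},
            {"job": 2, "title": "teacher", "task": "classroom instruction",
             "chemicals": ""},
        ]

        for job in jobs:
            text = " ".join(str(v).lower() for v in job.values())
            hits = sorted(k for k in TCE_KEYWORDS if k in text)
            job["possibly_tce_exposed"] = bool(hits)
            print(job["job"], job["possibly_tce_exposed"], hits)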

  5. The Feature Extraction Based on Texture Image Information for Emotion Sensing in Speech

    Directory of Open Access Journals (Sweden)

    Kun-Ching Wang

    2014-09-01

    Full Text Available In this paper, we present a novel texture image feature for Emotion Sensing in Speech (ESS). This idea is based on the fact that texture images carry emotion-related information. The feature extraction is derived from the time-frequency representation of spectrogram images. First, we transform the spectrogram into a recognizable image. Next, we use a cubic curve to enhance the image contrast. Then, the texture image information (TII) derived from the spectrogram image can be extracted by using Laws' masks to characterize emotional state. In order to evaluate the effectiveness of the proposed emotion recognition in different languages, we use two open emotional databases, the Berlin Emotional Speech Database (EMO-DB) and the eNTERFACE corpus, and one self-recorded database (KHUSC-EmoDB) to evaluate performance across corpora. The results of the proposed ESS system are presented using a support vector machine (SVM) as the classifier. Experimental results show that the proposed TII-based feature extraction, inspired by visual perception, can provide significant classification for ESS systems. The two-dimensional (2-D) TII feature can provide discrimination between different emotions beyond the conveyance of pitch and formant tracks. In addition, de-noising in 2-D images can be completed more easily than de-noising in 1-D speech.
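
    A small sketch of the TII computation (ours, with a stand-in image): 2-D Laws' masks built from the classic L5/E5/S5 vectors are convolved with the spectrogram image, and mean absolute energy per mask forms the feature vector:

        # Laws' texture energy features from a (stand-in) spectrogram image.
        import numpy as np
        from scipy.signal import convolve2d

        L5 = np.array([1, 4, 6, 4, 1])    # level
        E5 = np.array([-1, -2, 0, 2, 1])  # edge
        S5 = np.array([-1, 0, 2, 0, -1])  # spot

        rng = np.random.default_rng(0)
        spectrogram = rng.random((64, 64))  # stand-in for a contrast-enhanced spectrogram

        features = {}
        for rname, row in [("L5", L5), ("E5", E5), ("S5", S5)]:
            for cname, col in [("L5", L5), ("E5", E5), ("S5", S5)]:
                mask = np.outer(row, col)  # 5x5 Laws' mask
                energy = np.abs(convolve2d(spectrogram, mask, mode="valid"))
                features[rname + cname] = energy.mean()
        print(features)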

  6. TechIP: A Methodology for Emerging Information Technology Insertion & Integration

    National Research Council Canada - National Science Library

    Patel, Has

    2004-01-01

...) processing and software agents. To implement these requirements, the system designers are required to insert, integrate and manage proven advances in Emerging Information Technology (EIT) into the...

  7. Waste Information Management System with Integrated Transportation Forecast Data

    International Nuclear Information System (INIS)

    Upadhyay, H.; Quintero, W.; Shoffner, P.; Lagos, L.

    2009-01-01

    The Waste Information Management System with Integrated Transportation Forecast Data was developed to support the Department of Energy (DOE) mandated accelerated cleanup program. The schedule compression required close coordination and a comprehensive review and prioritization of the barriers that impeded treatment and disposition of the waste streams at each site. Many issues related to site waste treatment and disposal were potential critical path issues under the accelerated schedules. In order to facilitate accelerated cleanup initiatives, waste managers at DOE field sites and at DOE Headquarters in Washington, D.C., needed timely waste forecast and transportation information regarding the volumes and types of waste that would be generated by the DOE sites over the next 40 years. Each local DOE site has historically collected, organized, and displayed site waste forecast information in separate and unique systems. However, waste and shipment information from all sites needed a common application to allow interested parties to understand and view the complete complex-wide picture. The Waste Information Management System with Integrated Transportation Forecast Data allows identification of total forecasted waste volumes, material classes, disposition sites, choke points, technological or regulatory barriers to treatment and disposal, along with forecasted waste transportation information by rail, truck and inter-modal shipments. The Applied Research Center (ARC) at Florida International University (FIU) in Miami, Florida, has deployed the web-based forecast and transportation system and is responsible for updating the waste forecast and transportation data on a regular basis to ensure the long-term viability and value of this system. (authors)

  8. Evolution of natural history information in the 21st century – developing an integrated framework for biological and geographical data

    Science.gov (United States)

    Reusser, Deborah A.; Lee, Henry

    2011-01-01

    Threats to marine and estuarine species operate over many spatial scales, from nutrient enrichment at the watershed/estuarine scale to invasive species and climate change at regional and global scales. To help address research questions across these scales, we provide here a standardized framework for a biogeographical information system containing queriable biological data that allows extraction of information on multiple species, across a variety of spatial scales based on species distributions, natural history attributes and habitat requirements. As scientists shift from research on localized impacts on individual species to regional and global scale threats, macroecological approaches of studying multiple species over broad geographical areas are becoming increasingly important. The standardized framework described here for capturing and integrating biological and geographical data is a critical first step towards addressing these macroecological questions and we urge organizations capturing biogeoinformatics data to consider adopting this framework.

  9. Uncertainty analysis of an integrated energy system based on information theory

    International Nuclear Information System (INIS)

    Fu, Xueqian; Sun, Hongbin; Guo, Qinglai; Pan, Zhaoguang; Xiong, Wen; Wang, Li

    2017-01-01

    Currently, a custom-designed configuration of different renewable technologies named the integrated energy system (IES) has become popular due to its high efficiency, benefiting from complementary multi-energy technologies. This paper proposes an information entropy approach to quantify uncertainty in an integrated energy system based on a stochastic model that drives a power system model derived from an actual network on Barry Island. Due to the complexity of co-behaviours between generators, a copula-based approach is utilized to articulate the dependency structure of the generator outputs with regard to such factors as weather conditions. Correlation coefficients and mutual information, which are effective for assessing the dependence relationships, are applied to judge whether the stochastic IES model is correct. The calculated information values can be used to analyse the impacts of the coupling of power and heat on power flows and heat flows, and this approach will be helpful for improving the operation of IES. - Highlights: • The paper explores uncertainty of an integrated energy system. • The dependent weather model is verified from the perspective of correlativity. • The IES model considers the dependence between power and heat. • The information theory helps analyse the complexity of IES operation. • The application of the model is studied using an operational system on Barry Island.

  10. Open critical area model and extraction algorithm based on the net flow-axis

    International Nuclear Information System (INIS)

    Wang Le; Wang Jun-Ping; Gao Yan-Hong; Xu Dan; Li Bo-Bo; Liu Shi-Gang

    2013-01-01

In the integrated circuit manufacturing process, critical area extraction is a bottleneck for layout optimization and integrated circuit yield estimation. In this paper, we study the problem that missing-material defects may result in open-circuit faults. Combining mathematical morphology theory, we present a new computation model and a novel extraction algorithm for the open critical area based on the net flow-axis. Firstly, we find the net flow-axis for different nets. Then, the net flow-edges based on the net flow-axis are obtained. Finally, we can extract the open critical area by mathematical morphology. Compared with existing methods, the nets need not be divided into horizontal and vertical nets, and the experimental results show that our model and algorithm can accurately extract the size of the open critical area and obtain the location information of the open-circuit critical area. (interdisciplinary physics and related areas of science and technology)
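    For contrast with the paper's analytical net flow-axis construction, the quantity being computed can be stated in a naive raster form: the open critical area is the set of defect centers at which a disk of missing material disconnects a net. A brute-force sketch of that definition (illustrative only, and far slower than the paper's algorithm):

```python
import numpy as np
from scipy import ndimage

def disk(r):
    yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
    return (xx ** 2 + yy ** 2) <= r ** 2

def open_critical_area(net, r):
    """Count defect centers at which a missing-material disk of radius r
    changes the number of connected components of the net (an open)."""
    d = disk(r)
    h, w = net.shape
    n0 = ndimage.label(net)[1]          # components of the intact net
    critical = 0
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            damaged = net.copy()
            damaged[y0:y1, x0:x1] &= ~d[y0 - (y - r):y1 - (y - r),
                                        x0 - (x - r):x1 - (x - r)]
            if ndimage.label(damaged)[1] != n0:
                critical += 1
    return critical

net = np.zeros((20, 40), dtype=bool)
net[9:12, :] = True                     # one horizontal wire, 3 pixels wide
print(open_critical_area(net, r=3))     # pixel count of the critical region
```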

  11. 76 FR 17145 - Agency Information Collection Activities: Business Transformation-Automated Integrated Operating...

    Science.gov (United States)

    2011-03-28

    ... Collection Activities: Business Transformation--Automated Integrated Operating Environment (IOE), New... through efforts like USCIS' Business Transformation initiative. The IOE will be implemented by USCIS and... information collection. (2) Title of the Form/Collection: Business Transformation-- Automated Integrated...

  12. Mass extraction container closure integrity physical testing method development for parenteral container closure systems.

    Science.gov (United States)

    Yoon, Seung-Yil; Sagi, Hemi; Goldhammer, Craig; Li, Lei

    2012-01-01

    Container closure integrity (CCI) is a critical factor to ensure that product sterility is maintained over its entire shelf life. Assuring the CCI during container closure (C/C) system qualification, routine manufacturing and stability is important. FDA guidance also encourages industry to develop a CCI physical testing method in lieu of sterility testing in a stability program. A mass extraction system has been developed to check CCI for a variety of container closure systems such as vials, syringes, and cartridges. Various types of defects (e.g., glass micropipette, laser drill, wire) were created and used to demonstrate a detection limit. Leakage, detected as mass flow in this study, changes as a function of defect length and diameter. Therefore, the morphology of defects has been examined in detail with fluid theories. This study demonstrated that a mass extraction system was able to distinguish between intact samples and samples with 2 μm defects reliably when the defect was exposed to air, water, placebo, or drug product (3 mg/mL concentration) solution. Also, it has been verified that the method was robust, and capable of determining the acceptance limit using 3σ for syringes and 6σ for vials. Sterile products must maintain their sterility over their entire shelf life. Container closure systems such as those found in syringes and vials provide a seal between rubber and glass containers. This seal must be ensured to maintain product sterility. A mass extraction system has been developed to check container closure integrity for a variety of container closure systems such as vials, syringes, and cartridges. In order to demonstrate the method's capability, various types of defects (e.g., glass micropipette, laser drill, wire) were created in syringes and vials and were tested. This study demonstrated that a mass extraction system was able to distinguish between intact samples and samples with 2 μm defects reliably when the defect was exposed to air, water

  13. An integration of Emergency Department Information and Ambulance Systems.

    Science.gov (United States)

    Al-Harbi, Nada; El-Masri, Samir; Saddik, Basema

    2012-01-01

    In this paper we propose an Emergency Department Information System that will be integrated with the ambulance system to improve the communication, enhance the quality of provided emergency services and facilitate information sharing. The proposed system utilizes new advanced technologies such as mobile web services that overcome the problems of interoperability between different systems, HL7 and GPS. The system is unique in that it allows ambulance officers to locate the nearest specialized hospital and allows access to the patient's electronic health record as well as providing the hospital with required information to prepare for the incoming patient.

  14. Project Integration Architecture: A Practical Demonstration of Information Propagation

    Science.gov (United States)

    Jones, William Henry

    2005-01-01

    One of the goals of the Project Integration Architecture (PIA) effort is to provide the ability to propagate information between disparate applications. With this ability, applications may then be formed into an application graph constituting a super-application. Such a super-application would then provide all of the analysis appropriate to a given technical system. This paper reports on a small demonstration of this concept in which a Computer Aided Design (CAD) application was connected to an inlet analysis code and geometry information automatically propagated from one to the other. The majority of the work reported involved not the technology of information propagation, but rather the conversion of propagated information into a form usable by the receiving application.

  15. Semantic integration of information about orthologs and diseases: the OGO system.

    Science.gov (United States)

    Miñarro-Gimenez, Jose Antonio; Egaña Aranguren, Mikel; Martínez Béjar, Rodrigo; Fernández-Breis, Jesualdo Tomás; Madrid, Marisa

    2011-12-01

Semantic Web technologies like RDF and OWL are currently applied in life sciences to improve knowledge management by integrating disparate information. Many of the systems that perform such a task, however, only offer a SPARQL query interface, which is difficult to use for life scientists. We present the OGO system, which consists of a knowledge base that integrates information on orthologous sequences and genetic diseases, providing an easy-to-use, ontology-constraint-driven query interface. Such an interface allows the users to define SPARQL queries through a graphical process, therefore not requiring SPARQL expertise. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. 45 CFR 61.12 - Requesting information from the Healthcare Integrity and Protection Data Bank.

    Science.gov (United States)

    2010-10-01

45 CFR 61.12 (2010-10-01), Public Welfare: Requesting information from the Healthcare Integrity and Protection Data Bank. Department of Health and Human Services, General Administration, Healthcare Integrity and Protection Data Bank for Final Adverse Information...

  17. From Specific Information Extraction to Inferences: A Hierarchical Framework of Graph Comprehension

    Science.gov (United States)

    2004-09-01

The skill to interpret the information displayed in graphs is so important that the National Council of Teachers of Mathematics has created guidelines to ensure that students learn these skills (NCTM: Standards for Mathematics, 2003). These guidelines are based primarily on the extraction of...

  18. Immediate integration of prosodic information from speech and visual information from pictures in the absence of focused attention: a mismatch negativity study.

    Science.gov (United States)

    Li, X; Yang, Y; Ren, G

    2009-06-16

Language is often perceived together with visual information. Recent experimental evidence indicates that, during spoken language comprehension, the brain can immediately integrate visual information with semantic or syntactic information from speech. Here we used the mismatch negativity to further investigate whether prosodic information from speech can be immediately integrated into a visual scene context or not, and especially the time course and automaticity of this integration process. Sixteen Chinese native speakers participated in the study. The materials included Chinese spoken sentences and picture pairs. In the audiovisual situation, relative to the concomitant pictures, the spoken sentence was appropriately accented in the standard stimuli, but inappropriately accented in the two kinds of deviant stimuli. In the purely auditory situation, the speech sentences were presented without pictures. It was found that the deviants evoked mismatch responses in both audiovisual and purely auditory situations; the mismatch negativity in the purely auditory situation peaked at the same time as, but was weaker than, that evoked by the same deviant speech sounds in the audiovisual situation. This pattern of results suggests immediate integration of prosodic information from speech and visual information from pictures in the absence of focused attention.

  19. Integrated Information Centers within Academic Environments: Introduction and Overview.

    Science.gov (United States)

    Lunin, Luis F., Ed.; D'Elia, George, Ed.

    1991-01-01

    Introduces eight articles on the Integrated Information Center (IIC) Project, which investigated significant behavioral, technological, organizational, financial, and legal factors involved in the management of IICs. Four articles address design and management issues of general interest, and four focus on specific design considerations and a…

  20. Two-dimensional parasitic capacitance extraction for integrated circuit with dual discrete geometric methods

    International Nuclear Information System (INIS)

    Ren Dan; Ren Zhuoxiang; Qu Hui; Xu Xiaoyu

    2015-01-01

    Capacitance extraction is one of the key issues in integrated circuits and also a typical electrostatic problem. The dual discrete geometric method (DGM) is investigated to provide relative solutions in two-dimensional unstructured mesh space. The energy complementary characteristic and quick field energy computation thereof based on it are emphasized. Contrastive analysis between the dual finite element methods and the dual DGMs are presented both from theoretical derivation and through case studies. The DGM, taking the scalar potential as unknown on dual interlocked meshes, with simple form and good accuracy, is expected to be one of the mainstreaming methods in associated areas. (paper)
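    The energy route from a computed potential to a two-conductor capacitance, which underlies such field-based extraction methods, is standard electrostatics (not specific to this paper):

$$ W \;=\; \frac{1}{2}\int_{\Omega} \varepsilon \,\lvert \nabla \varphi \rvert^{2}\, \mathrm{d}\Omega, \qquad C \;=\; \frac{2W}{U^{2}}, $$

    where $\varphi$ is the scalar potential taken as the unknown on the meshes and $U$ is the potential difference applied between the conductors. In dual formulations of this kind, the two complementary energy estimates are commonly used to bracket the true capacitance, which appears to be the sense of the "energy complementary characteristic" mentioned above.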

  1. DESIGN OF INFORMATION MANAGEMENT SYSTEM OF VERTICALLY INTEGRATED AGRICULTURAL HOLDINGS

    Directory of Open Access Journals (Sweden)

    Александр Витальевич ШМАТКО

    2015-05-01

Full Text Available The paper deals with an approach to the design and development of information systems for the management and optimization of the organizational structure of vertically integrated agricultural holdings. A review of the problems of building and improving the organizational structure of a vertically integrated agricultural holding is given. A method of constructing a discrete model of the holding's management structure, which minimizes the costs associated with recruiting applicants, is proposed.

  2. Integrated care: an Information Model for Patient Safety and Vigilance Reporting Systems.

    Science.gov (United States)

    Rodrigues, Jean-Marie; Schulz, Stefan; Souvignet, Julien

    2015-01-01

Quality management information systems for safety as a whole or for specific vigilances share the same information types but are not interoperable. An international initiative is trying to develop an integrated information model for patient safety and vigilance reporting to support a global approach to health care quality.

  3. Benefits and problems in implementation for integrated medical information system

    International Nuclear Information System (INIS)

    Park, Chang Seo; Kim, Kee Deog; Park, Hyok; Jeong, Ho Gul

    2005-01-01

Once the decision has been made to adopt an integrated medical information system (IMIS), there are a number of issues to overcome. Users need to be aware of the impact the change will make on end users and be prepared to address issues that arise before they become problems. The purpose of this study is to investigate the benefits and unexpected problems encountered in the implementation of IMIS and to determine a useful framework for IMIS. The Yonsei University Dental Hospital is steadily constructing an IMIS. The vendor's PACS software, Piview STAR, supports transactions between workstations that conform to the Integrating the Healthcare Enterprise (IHE) framework, with security functions. It is necessary to develop an excellent framework that serves the patient, the health care provider and information system vendors in an expert, efficient, and cost-effective manner. The problems encountered with IMIS implementation were high initial investments, delay of EMR enforcement, underdevelopment of digital radiographic appliances and software, and insufficient educational training for users. The clinical environment of a dental IMIS differs somewhat from the medical situation. The best way to overcome these differences is to establish a gold standard of dental IMIS integration, which estimates the cost payback. The IHE and its technical framework are good for the patient, the health care provider and all information systems vendors.

  4. Integrated management of information inside maintenance processes. From the building registry to BIM systems

    Directory of Open Access Journals (Sweden)

    Cinzia Talamo

    2014-10-01

Full Text Available The paper presents objectives, methods and results of two research projects dealing with the improvement of integrated information management within maintenance processes. Focusing on information needs regarding the last phases of the building process, the two projects draft approaches characterizing a path of progressive improvement of strategies for integration: from a building registry, unique for the whole construction process, to an integrated management of the building process with the support of BIM systems.

  5. Gulf of Mexico Integrated Science - Tampa Bay Study - Data Information Management System (DIMS)

    Science.gov (United States)

    Johnston, James

    2004-01-01

    The Tampa Bay Integrated Science Study is an effort by the U.S. Geological Survey (USGS) that combines the expertise of federal, state and local partners to address some of the most pressing ecological problems of the Tampa Bay estuary. This project serves as a template for the application of integrated research projects in other estuaries in the Gulf of Mexico. Efficient information and data distribution for the Tampa Bay Study has required the development of a Data Information Management System (DIMS). This information system is being used as an outreach management tool, providing information to scientists, decision makers and the public on the coastal resources of the Gulf of Mexico.

  6. Integration of genomic information with biological networks using Cytoscape.

    Science.gov (United States)

    Bauer-Mehren, Anna

    2013-01-01

    Cytoscape is an open-source software for visualizing, analyzing, and modeling biological networks. This chapter explains how to use Cytoscape to analyze the functional effect of sequence variations in the context of biological networks such as protein-protein interaction networks and signaling pathways. The chapter is divided into five parts: (1) obtaining information about the functional effect of sequence variation in a Cytoscape readable format, (2) loading and displaying different types of biological networks in Cytoscape, (3) integrating the genomic information (SNPs and mutations) with the biological networks, and (4) analyzing the effect of the genomic perturbation onto the network structure using Cytoscape built-in functions. Finally, we briefly outline how the integrated data can help in building mathematical network models for analyzing the effect of the sequence variation onto the dynamics of the biological system. Each part is illustrated by step-by-step instructions on an example use case and visualized by many screenshots and figures.

  7. Stakeholder engagement: a key component of integrating genomic information into electronic health records.

    Science.gov (United States)

    Hartzler, Andrea; McCarty, Catherine A; Rasmussen, Luke V; Williams, Marc S; Brilliant, Murray; Bowton, Erica A; Clayton, Ellen Wright; Faucett, William A; Ferryman, Kadija; Field, Julie R; Fullerton, Stephanie M; Horowitz, Carol R; Koenig, Barbara A; McCormick, Jennifer B; Ralston, James D; Sanderson, Saskia C; Smith, Maureen E; Trinidad, Susan Brown

    2013-10-01

    Integrating genomic information into clinical care and the electronic health record can facilitate personalized medicine through genetically guided clinical decision support. Stakeholder involvement is critical to the success of these implementation efforts. Prior work on implementation of clinical information systems provides broad guidance to inform effective engagement strategies. We add to this evidence-based recommendations that are specific to issues at the intersection of genomics and the electronic health record. We describe stakeholder engagement strategies employed by the Electronic Medical Records and Genomics Network, a national consortium of US research institutions funded by the National Human Genome Research Institute to develop, disseminate, and apply approaches that combine genomic and electronic health record data. Through select examples drawn from sites of the Electronic Medical Records and Genomics Network, we illustrate a continuum of engagement strategies to inform genomic integration into commercial and homegrown electronic health records across a range of health-care settings. We frame engagement as activities to consult, involve, and partner with key stakeholder groups throughout specific phases of health information technology implementation. Our aim is to provide insights into engagement strategies to guide genomic integration based on our unique network experiences and lessons learned within the broader context of implementation research in biomedical informatics. On the basis of our collective experience, we describe key stakeholder practices, challenges, and considerations for successful genomic integration to support personalized medicine.

  8. Implementation of integrated heterogeneous electronic electrocardiography data into Maharaj Nakorn Chiang Mai Hospital Information System.

    Science.gov (United States)

    Khumrin, Piyapong; Chumpoo, Pitupoom

    2016-03-01

    Electrocardiography is one of the most important non-invasive diagnostic tools for diagnosing coronary heart disease. The electrocardiography information system in Maharaj Nakorn Chiang Mai Hospital required a massive manual labor effort. In this article, we propose an approach toward the integration of heterogeneous electrocardiography data and the implementation of an integrated electrocardiography information system into the existing Hospital Information System. The system integrates different electrocardiography formats into a consistent electrocardiography rendering by using Java software. The interface acts as middleware to seamlessly integrate different electrocardiography formats. Instead of using a common electrocardiography protocol, we applied a central format based on Java classes for mapping different electrocardiography formats which contains a specific parser for each electrocardiography format to acquire the same information. Our observations showed that the new system improved the effectiveness of data management, work flow, and data quality; increased the availability of information; and finally improved quality of care. © The Author(s) 2014.

  9. Information science team

    Science.gov (United States)

    Billingsley, F.

    1982-01-01

Concerns are expressed about the data handling aspects of system design and about enabling technology for data handling and data analysis. The status, contributing factors, critical issues, and recommendations for investigations are listed for data handling, rectification and registration, and information extraction. Potential support for individual P.I. research tasks, for systematic data system design, and for system operation is identified. The need for an airborne spectrometer-class instrument for fundamental research in high spectral and spatial resolution is indicated. Geographic information system formatting and labelling techniques, very large scale integration, and methods for providing multitype data sets must also be developed.

  10. Integrate offsites management with information systems

    Energy Technology Data Exchange (ETDEWEB)

    Valleur, M. (TECHNIP, Paris (France))

    1993-11-01

    Computerized offsites management systems in oil refineries offer a unique opportunity to integrate advanced technology into a coherent refinery information system that contributes to benefits-driven optimal operations: from long-term, multirefinery linear programming (LP) models to sequential control of transfer lineups in the tank farm. There are strong incentives to automate and optimize the offsites operations, and benefits can be quantified to justify properly sized projects. The paper discusses the following: business opportunities, oil movement and advanced technology, project scoping and sizing, review of functional requirements, transfer automation, blending optimal control, on-line analyzers, oil movement and scheduling, organizational issues, and investment and benefits analysis.

  11. Data reduction pipeline for the CHARIS integral-field spectrograph I: detector readout calibration and data cube extraction

    Science.gov (United States)

    Brandt, Timothy D.; Rizzo, Maxime; Groff, Tyler; Chilcote, Jeffrey; Greco, Johnny P.; Kasdin, N. Jeremy; Limbach, Mary Anne; Galvin, Michael; Loomis, Craig; Knapp, Gillian; McElwain, Michael W.; Jovanovic, Nemanja; Currie, Thayne; Mede, Kyle; Tamura, Motohide; Takato, Naruhisa; Hayashi, Masahiko

    2017-10-01

We present the data reduction pipeline for CHARIS, a high-contrast integral-field spectrograph for the Subaru Telescope. The pipeline constructs a ramp from the raw reads using the measured nonlinear pixel response and reconstructs the data cube using one of three extraction algorithms: aperture photometry, optimal extraction, or χ2 fitting. We measure and apply both a detector flatfield and a lenslet flatfield and reconstruct the wavelength- and position-dependent lenslet point-spread function (PSF) from images taken with a tunable laser. We use these measured PSFs to implement a χ2-based extraction of the data cube, with typical residuals of ~5% due to imperfect models of the undersampled lenslet PSFs. The full two-dimensional residual of the χ2 extraction allows us to model and remove correlated read noise, dramatically improving CHARIS's performance. The χ2 extraction produces a data cube that has been deconvolved with the line-spread function and never performs any interpolations of either the data or the individual lenslet spectra. The extracted data cube also includes uncertainties for each spatial and spectral measurement. CHARIS's software is parallelized, written in Python and Cython, and freely available on github with a separate documentation page. Astrometric and spectrophotometric calibrations of the data cubes and PSF subtraction will be treated in a forthcoming paper.
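    The χ2 extraction is, at its core, a per-lenslet weighted linear least-squares fit of monochromatic PSF templates to the detector pixels. A minimal sketch of that idea (array shapes and the noise model are invented here, not CHARIS's actual ones):

```python
import numpy as np

def chi2_extract(stamp, var, psflets):
    """Weighted least-squares ('chi-squared') extraction of one
    microspectrum: model the detector stamp around a lenslet as a linear
    combination of monochromatic lenslet PSFs, solve for the fluxes."""
    A = psflets.reshape(len(psflets), -1).T        # (pixels, wavelengths)
    w = 1.0 / np.sqrt(var.ravel())                 # inverse-sigma weights
    flux, *_ = np.linalg.lstsq(A * w[:, None], stamp.ravel() * w, rcond=None)
    chi2 = float(np.sum((stamp.ravel() - A @ flux) ** 2 / var.ravel()))
    return flux, chi2

rng = np.random.default_rng(0)
psflets = rng.random((22, 9, 9))                   # 22 wavelength channels
truth = rng.random(22)
stamp = np.tensordot(truth, psflets, axes=1) + rng.normal(0.0, 0.01, (9, 9))
flux, chi2 = chi2_extract(stamp, np.full((9, 9), 1e-4), psflets)
print(flux[:3], round(chi2, 1))                    # recovers 'truth' closely
```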

  12. Dutch virtual integration of healthcare information.

    Science.gov (United States)

    de Graaf, J C; Vlug, A E; van Boven, G J

    2007-01-01

As information technology creates opportunities for cooperation that cross the boundaries between healthcare institutions, it will become an integral part of the Dutch healthcare system. Along with many involved organizations in healthcare, the National IT Institute for Healthcare in the Netherlands (NICTIZ) is working on the realization of a national IT infrastructure for healthcare and a national electronic patient record (EPR). An underlying national architecture is designed to enable the Dutch EPR virtually, not in a national database, nor on a patient's smartcard. The required secure infrastructure provides generic functions for healthcare applications: patient identification, authentication and authorization of healthcare professionals. The first national applications in the EPR program using a national index of where patient data are stored are the electronic medication record and the electronic record for after-hours GP services. The rollout of the electronic medication record and the electronic record for after-hours GP services started in 2007. To guarantee progress of electronic data exchange in healthcare in the Netherlands, we have primarily opted for two healthcare applications: the electronic medication record and the electronic record for after-hours GP services. The use of a national switch-point containing the registry of where to find what information guarantees that the professional receives the most recent information and avoids the need for large databases of downloaded data. Proper authorization, authentication and tracing by the national switch-point also ensure a secure environment for the communication of delicate information.

  13. MiDas: automatic extraction of a common domain of discourse in sleep medicine for multi-center data integration.

    Science.gov (United States)

    Sahoo, Satya S; Ogbuji, Chimezie; Luo, Lingyun; Dong, Xiao; Cui, Licong; Redline, Susan S; Zhang, Guo-Qiang

    2011-01-01

Clinical studies often use data dictionaries with controlled sets of terms to facilitate data collection and to support limited interoperability and sharing at a local site. Multi-center retrospective clinical studies require that these data dictionaries, originating from individual participating centers, be harmonized in preparation for the integration of the corresponding clinical research data. Domain ontologies are often used to facilitate multi-center data integration by modeling terms from data dictionaries in a logic-based language, but interoperability among domain ontologies (using automated techniques) is an unresolved issue. Although many upper-level reference ontologies have been proposed to address this challenge, our experience in integrating multi-center sleep medicine data highlights the need for an upper-level ontology that models a common set of terms at multiple levels of abstraction, which is not covered by the existing upper-level ontologies. We introduce a methodology underpinned by a Minimal Domain of Discourse (MiDas) algorithm to automatically extract a minimal common domain of discourse (upper-domain ontology) from an existing domain ontology. Using the Multi-Modality, Multi-Resource Environment for Physiological and Clinical Research (Physio-MIMI) multi-center project in sleep medicine as a use case, we demonstrate the use of MiDas in extracting a minimal domain of discourse for sleep medicine from Physio-MIMI's Sleep Domain Ontology (SDO). We then extend the resulting domain of discourse with terms from the data dictionary of the Sleep Heart Health Study (SHHS) to validate MiDas. To illustrate the wider applicability of MiDas, we automatically extract the respective domains of discourse from 6 sample domain ontologies from the National Center for Biomedical Ontologies (NCBO) and the OBO Foundry.
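    As a toy illustration of carving an upper "domain of discourse" out of an is-a hierarchy (a stand-in for intuition only, not the MiDas algorithm itself; the class names are invented):

```python
import networkx as nx

def upper_domain(onto, seeds):
    """Keep each seed term plus every ancestor reachable from it in an
    is-a hierarchy whose edges point child -> parent, yielding the
    upper-level subset that covers the seed terms."""
    keep = set()
    for term in seeds:
        keep |= nx.descendants(onto, term) | {term}
    return onto.subgraph(keep)

onto = nx.DiGraph([
    ("apnea_event", "respiratory_event"), ("hypopnea_event", "respiratory_event"),
    ("respiratory_event", "sleep_event"), ("arousal", "sleep_event"),
])
print(sorted(upper_domain(onto, {"apnea_event", "hypopnea_event"}).nodes))
# ['apnea_event', 'hypopnea_event', 'respiratory_event', 'sleep_event']
```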

  14. Strict integrity control of biomedical images

    Science.gov (United States)

    Coatrieux, Gouenou; Maitre, Henri; Sankur, Bulent

    2001-08-01

    The control of the integrity and authentication of medical images is becoming ever more important within the Medical Information Systems (MIS). The intra- and interhospital exchange of images, such as in the PACS (Picture Archiving and Communication Systems), and the ease of copying, manipulation and distribution of images have brought forth the security aspects. In this paper we focus on the role of watermarking for MIS security and address the problem of integrity control of medical images. We discuss alternative schemes to extract verification signatures and compare their tamper detection performance.

  15. Electronic processing of informed consents in a global pharmaceutical company environment.

    Science.gov (United States)

    Vishnyakova, Dina; Gobeill, Julien; Oezdemir-Zaech, Fatma; Kreim, Olivier; Vachon, Therese; Clade, Thierry; Haenning, Xavier; Mikhailov, Dmitri; Ruch, Patrick

    2014-01-01

We present an electronic capture tool to process informed consents, which must be recorded when running a clinical trial. This tool aims at extracting information expressing the duration of the consent given by the patient to authorize the exploitation of biomarker-related information collected during clinical trials. The system integrates a language detection module (LDM) to route a document to the appropriate information extraction module (IEM). The IEM is based on language-specific sets of linguistic rules for the identification of relevant textual facts. The achieved accuracy of both the LDM and the IEM is 99%. The architecture of the system is described in detail.
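    A minimal sketch of the LDM-to-IEM routing idea (the rule patterns, covered languages, and use of the langdetect package are illustrative; the production system relies on much richer language-specific linguistic rules):

```python
import re
from langdetect import detect  # assumed available; any language identifier works

# Illustrative language-specific extraction rules.
RULES = {
    "en": re.compile(r"(?:consent|samples?).{0,40}?(\d+)\s*years?", re.I | re.S),
    "de": re.compile(r"(\d+)\s*Jahre", re.I),
}

def consent_duration(text):
    """LDM -> IEM routing: detect the document language, then apply that
    language's rule set to extract the stated consent duration (years)."""
    lang = detect(text)
    rule = RULES.get(lang)
    m = rule.search(text) if rule else None
    return lang, (int(m.group(1)) if m else None)

print(consent_duration("I consent to the storage of my samples for 15 years."))
# ('en', 15)
```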

  16. Integration of Solid-phase Extraction with Electrothermal Atomic Absorption Spectrometry for Determination of Trace Elements

    OpenAIRE

    NUKATSUKA, Isoshi; OHZEKI, Kunio

    2006-01-01

An enrichment step in sample treatment is essential in trace analysis to improve the sensitivity and to eliminate the sample matrix. Solid-phase extraction (SPE) is one of the most widely used enrichment techniques. Electrothermal atomic absorption spectrometry (ETAAS) is a well-established determination technique for trace elements. The integration of SPE with ETAAS leads to further improvement of sensitivity, automation of the measurement and economy in the sample size, amounts o...

  17. Multi-fields' coordination information integrated platform for nuclear power plant operation preparation

    International Nuclear Information System (INIS)

    Yuan Chang; Li Yong; Ye Zhiqiang

    2011-01-01

To achieve coordination of work and information sharing across multiple fields, the Enterprise Architecture (EA) method is applied to design the business architecture, functional flows and application architecture of a nuclear power plant operation preparation information integrated platform, which realizes information sharing and coordination among the fields involved. (authors)

  18. Dietary integration with natural extract in rabbit: effects on growth performances and meat quality.

    Directory of Open Access Journals (Sweden)

    Sara Chiapparini

    2018-06-01

Full Text Available In many European countries rabbit meat is consumed for its nutritional characteristics (Dalle Zotte, 2014; Hernández and Gondret, 2006). Since the ban on the use of antibiotics as growth promoters, natural substances with antioxidant, anti-inflammatory, antimicrobial and antiviral properties have been studied as alternatives. The aim was to evaluate the effect of a dietary supplementation with a natural extract mixture in growing rabbits on growth performance, carcass characteristics and Longissimus lumborum (LL) muscle parameters. The trial was performed at the Research Institute for Animal Production (Nitra, Slovak Republic) and lasted 42 days. At 35 days of age, 144 New Zealand White rabbits were randomly selected and divided into 3 experimental groups (4 rabbits/cage). The first group was fed a basal diet; the second (T1) and the third (T2) received 0.3% and 0.6%, respectively, of a natural extract mixture containing polyphenols from plants and seaweeds. Dietary integration with the natural extract improved (P<0.05) growth performance (ADG, FI and FC) in the T1 group. The fatty acid composition of LL muscle was positively affected (P=0.037) by natural extract supplementation, with an increase of n-3 FA in the T2 group compared with the other treatments. Cholesterol content tended to be lower in the T2 group (P=0.082) than in the T1 and C groups (24.8 mg/100g T2 vs 34.6 mg/100g T1 vs 33.2 mg/100g C). Sensory analysis revealed that only the aroma was affected (P<0.05) by dietary treatments. Overall these results highlight that dietary supplementation with a natural extract mixture containing polyphenols from plants and seaweeds enhances growth performance and carcass weight, improving the nutritional parameters of LL muscle.

  19. Information security architecture an integrated approach to security in the organization

    CERN Document Server

    Killmeyer, Jan

    2000-01-01

    An information security architecture is made up of several components. Each component in the architecture focuses on establishing acceptable levels of control. These controls are then applied to the operating environment of an organization. Functionally, information security architecture combines technical, practical, and cost-effective solutions to provide an adequate and appropriate level of security.Information Security Architecture: An Integrated Approach to Security in the Organization details the five key components of an information security architecture. It provides C-level executives

20. Ensuring the integrity of information resources based on methods of two-symbol structural data encoding

    Directory of Open Access Journals (Sweden)

    О.К. Юдін

    2009-01-01

Full Text Available Methods are developed for estimating the noise immunity and correcting distortions of structural code constructions during data communication in information and communication systems and networks, taking into account the need to ensure the integrity of the information resource.

  1. Whatever the cost? Information integration in memory-based inferences depends on cognitive effort.

    Science.gov (United States)

    Hilbig, Benjamin E; Michalkiewicz, Martha; Castela, Marta; Pohl, Rüdiger F; Erdfelder, Edgar

    2015-05-01

    One of the most prominent models of probabilistic inferences from memory is the simple recognition heuristic (RH). The RH theory assumes that judgments are based on recognition in isolation, such that other information is ignored. However, some prior research has shown that available knowledge is not generally ignored. In line with the notion of adaptive strategy selection--and, thus, a trade-off between accuracy and effort--we hypothesized that information integration crucially depends on how easily accessible information beyond recognition is, how much confidence decision makers have in this information, and how (cognitively) costly it is to acquire it. In three experiments, we thus manipulated (a) the availability of information beyond recognition, (b) the subjective usefulness of this information, and (c) the cognitive costs associated with acquiring this information. In line with the predictions, we found that RH use decreased substantially, the more easily and confidently information beyond recognition could be integrated, and increased substantially with increasing cognitive costs.

  2. Extracting the Beat: An Experience-dependent Complex Integration of Multisensory Information Involving Multiple Levels of the Nervous System

    Directory of Open Access Journals (Sweden)

    Laurel J. Trainor

    2009-04-01

Full Text Available In a series of studies we have shown that movement (or vestibular stimulation) that is synchronized to every second or every third beat of a metrically ambiguous rhythm pattern biases people to perceive the meter as a march or as a waltz, respectively. Riggle (this volume) claims that we postulate an "innate", "specialized brain unit" for beat perception that is "directly" influenced by vestibular input. In fact, to the contrary, we argue that experience likely plays a large role in the development of rhythmic auditory-movement interactions, and that rhythmic processing in the brain is widely distributed and includes subcortical and cortical areas involved in sound processing and movement. Further, we argue that vestibular and auditory information are integrated at various subcortical and cortical levels along with input from other sensory modalities, and it is not clear which levels are most important for rhythm processing or, indeed, what a "direct" influence of vestibular input would mean. Finally, we argue that vestibular input to sound location mechanisms may be involved, but likely cannot explain the influence of vestibular input on the perception of auditory rhythm. This remains an empirical question for future research.

  3. Integrated information system for analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Galperin, A.

    1994-01-01

Performing complicated engineering analyses of a nuclear power plant requires storage and manipulation of a large amount of information, both data and knowledge. This information is characterized by its multidisciplinary nature, complexity, and diversity. The problems caused by inefficient and lengthy manual operations involving the data flow management within the framework of the safety-related analysis of a power plant can be solved by applying computer-aided engineering principles. These principles are the basis of the design of an integrated information storage system (IRIS). The basic idea is to create a computerized environment, which includes both database and functional capabilities. Consideration and analysis of the data types and required data manipulation capabilities, as well as operational requirements, resulted in the choice of an object-oriented database management system (OODBMS) as a development platform for solving the software engineering problems. Several advantages of OODBMSs over conventional relational database systems were found to be of crucial importance, especially providing the necessary flexibility for different data types and extensibility potential. A detailed design of a data model is produced for the plant technical data and for the storage of analysis results. The overall system architecture was designed to assure the feasibility of integrating database capabilities with procedures and functions written in conventional algorithmic programming languages

  4. Towards integrating control and information theories from information-theoretic measures to control performance limitations

    CERN Document Server

    Fang, Song; Ishii, Hideaki

    2017-01-01

    This book investigates the performance limitation issues in networked feedback systems. The fact that networked feedback systems consist of control and communication devices and systems calls for the integration of control theory and information theory. The primary contributions of this book lie in two aspects: the newly-proposed information-theoretic measures and the newly-discovered control performance limitations. We first propose a number of information notions to facilitate the analysis. Using those notions, classes of performance limitations of networked feedback systems, as well as state estimation systems, are then investigated. In general, the book presents a unique, cohesive treatment of performance limitation issues of networked feedback systems via an information-theoretic approach. This book is believed to be the first to treat the aforementioned subjects systematically and in a unified manner, offering a unique perspective differing from existing books.

  5. AERIS: An Integrated Domain Information System for Aerospace Science and Technology

    Science.gov (United States)

    Hatua, Sudip Ranjan; Madalli, Devika P.

    2011-01-01

    Purpose: The purpose of this paper is to discuss the methodology in building an integrated domain information system with illustrations that provide proof of concept. Design/methodology/approach: The present work studies the usual search engine approach to information and its pitfalls. A methodology was adopted for construction of a domain-based…

  6. An Integrated Information System for Supporting Quality Management Tasks

    Science.gov (United States)

    Beyer, N.; Helmreich, W.

    2004-08-01

In a competitive environment, well defined processes become the strategic advantage of a company. Hence, targeted Quality Management ensures efficiency, transparency and, ultimately, customer satisfaction. In the particular context of a Space Test Centre, a number of specific Quality Management standards have to be applied. According to the revision of ISO 9001 during 2000, and due to the adaptation of ECSS-Q20-07, process orientation and data analysis are key tasks for ensuring and evaluating the efficiency of a company's processes. In line with these requirements, an integrated management system for accessing the necessary information to support Quality Management and other processes has been established. Some of its test-related features are presented here. Easy access to the integrated management system from any work place at IABG's Space Test Centre is ensured by means of an intranet portal. It comprises a full set of quality-related process descriptions, information on test facilities, emergency procedures, and other relevant information. The portal's web interface provides direct access to a couple of external applications. Moreover, easy updating of all information and low cost maintenance are features of this integrated information system. The timely and transparent management of non-conformances is covered by a dedicated NCR database which incorporates full documentation capability, electronic signature and e-mail notification of concerned staff. A search interface allows for queries across all documented non-conformances. Furthermore, print versions can be generated at any stage in the process, e.g. for distribution to customers. Feedback on customer satisfaction is sought through a web-based questionnaire. The process is initiated by the responsible test manager through submission of an e-mail that contains a hyperlink to a secure website, asking the customer to complete the brief online form, which is directly fed to a database

  7. An information integration system for structured documents, Web, and databases

    OpenAIRE

    Morishima, Atsuyuki

    1998-01-01

    Rapid advance in computer network technology has changed the style of computer utilization. Distributed computing resources over world-wide computer networks are available from our local computers. They include powerful computers and a variety of information sources. This change is raising more advanced requirements. Integration of distributed information sources is one of such requirements. In addition to conventional databases, structured documents have been widely used, and have increasing...

  8. Heat and power demands in babassu palm oil extraction industry in Brazil

    International Nuclear Information System (INIS)

    Teixeira, Marcos A.

    2005-01-01

The objective of this paper is to analyze the energy use profile of the babassu (Orbignya ssp., Palmae) oil extraction industry in Brazil in order to establish the basis for a cogeneration study of this important part of the economy of Brazil's Northeast region, which is still ignored by biomass energy studies. The work used information from new equipment suppliers, analyzed against field information from operating units. The data were used to establish a basis for the thermal and mechanical energy consumption of the two main basic unit profiles in the sector: a simple one performing only oil extraction, and another more vertically integrated with other secondary by-products. For the oil-extraction-only unit profile, the minimum pressure for the steam process was estimated at 1.4 MPa, the electric demand at 5.79 kW/ton of processed kernel and the heat consumption at 2071 MJ/ton of processed kernel (829 kg steam/ton of processed kernel). For the vertically integrated unit profile, the following values were found: minimum steam process pressure 1.4 MPa, electric demand 6.22 kW/ton of processed kernel and heat consumption 21,503 MJ/ton of processed kernel (7600 kg steam/ton of processed kernel)
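    As a plausibility check on these figures (our arithmetic, not the paper's): the heat consumption per kilogram of steam for the oil-extraction-only profile is

$$ \frac{2071\ \mathrm{MJ/t}}{829\ \mathrm{kg\ steam/t}} \;\approx\; 2.50\ \mathrm{MJ/kg}, $$

    which is consistent with raising feedwater at roughly 70 °C ($h \approx 0.29$ MJ/kg) to saturated steam at 1.4 MPa ($h_g \approx 2.79$ MJ/kg); the feedwater temperature here is our assumption, not a figure from the paper.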

  9. International seminar on integrated information systems. Book of extended synopses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-04-01

    The information available to the IAEA under comprehensive safeguards agreement with an Additional protocol is intended to provide for as complete a picture as practicable of a State's current or planned nuclear programme. The central components of the strengthened safeguards system are: increased IAEA access to and evaluation of information about States' nuclear and nuclear-related activities and increased physical access to relevant locations for verification of the exclusively peaceful content of a States' nuclear programme. Strengthening measures implemented under the existing legal authority of the Agency have contributed to increased information and physical access. Thus the role of integrated information systems for safeguards relevant data acquisition became more significant.

  10. International seminar on integrated information systems. Book of extended synopses

    International Nuclear Information System (INIS)

    2000-04-01

    The information available to the IAEA under comprehensive safeguards agreement with an Additional protocol is intended to provide for as complete a picture as practicable of a State's current or planned nuclear programme. The central components of the strengthened safeguards system are: increased IAEA access to and evaluation of information about States' nuclear and nuclear-related activities and increased physical access to relevant locations for verification of the exclusively peaceful content of a States' nuclear programme. Strengthening measures implemented under the existing legal authority of the Agency have contributed to increased information and physical access. Thus the role of integrated information systems for safeguards relevant data acquisition became more significant

  11. Identifying influential factors on integrated marketing planning using information technology

    Directory of Open Access Journals (Sweden)

    Karim Hamdi

    2014-07-01

Full Text Available This paper presents an empirical investigation to identify important factors influencing integrated marketing planning using information technology. The proposed study designs a questionnaire for measuring integrated marketing planning, which consists of three categories: structural factors, behavioral factors and background factors. There are 40 questions associated with the proposed study, in Likert scale. Cronbach alphas have been calculated for structural factors, behavioral factors and background factors as 0.89, 0.86 and 0.83, respectively. Using statistical tests, the study has confirmed the effects of the three factors on integrated marketing. In addition, the Friedman test revealed that structural factors were the most important, followed by background factors and behavioral factors.

  12. 5W1H Information Extraction with CNN-Bidirectional LSTM

    Science.gov (United States)

    Nurdin, A.; Maulidevi, N. U.

    2018-03-01

In this work, information about who did what, when, where, why, and how in Indonesian news articles was extracted by combining a Convolutional Neural Network and a Bidirectional Long Short-Term Memory. A Convolutional Neural Network can learn semantically meaningful representations of sentences. A Bidirectional LSTM can analyze the relations among words in the sequence. We also use the word2vec word embedding for word representation. By combining these algorithms, we obtained an F-measure of 0.808. Our experiments show that CNN-BLSTM outperforms other shallow methods, namely IBk, C4.5, and Naïve Bayes, with F-measures of 0.655, 0.645, and 0.595, respectively.
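    One plausible reading of the architecture treats 5W1H extraction as per-token sequence labeling. A minimal Keras sketch under that assumption (vocabulary size, sequence length, tag set, and all hyperparameters are ours, not the paper's):

```python
import tensorflow as tf
from tensorflow.keras import layers

VOCAB, EMB, MAXLEN, N_TAGS = 20000, 100, 60, 13   # hypothetical sizes

model = tf.keras.Sequential([
    layers.Input(shape=(MAXLEN,), dtype="int32"),  # word indices per sentence
    layers.Embedding(VOCAB, EMB),                  # word2vec-style embeddings
    layers.Conv1D(128, 3, padding="same", activation="relu"),  # local n-gram features
    layers.Bidirectional(layers.LSTM(100, return_sequences=True)),  # left+right context
    layers.Dense(N_TAGS, activation="softmax"),    # per-token 5W1H role tag
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```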

13. The fruit extract of Berberis crataegina DC. exerts potent antioxidant activity and protects DNA integrity.

    Science.gov (United States)

    Charehsaz, Mohammad; Sipahi, Hande; Celep, Engin; Üstündağ, Aylin; Cemiloğlu Ülker, Özge; Duydu, Yalçın; Aydın, Ahmet; Yesilada, Erdem

    2015-04-17

    Dried fruits of Berberis crataegina (Berberidaceae) have been frequently consumed as food garniture in Turkish cuisine, while its fruit paste has been used to increase stamina and in particular to prevent from cardiovascular dysfunctions in Northeastern Black Sea region of Turkey. This study investigated this folkloric information in order to explain the claimed healing effects as well as to evaluate possible risks. Total phenolic, flavonoid and proanthocyanidin contents and antioxidant capacity of the methanolic fruit extract were evaluated through several in vitro assays. The cytotoxic and genotoxic effects of B. crataegina fruit extract were also assessed in both cervical cancer cell line (HeLa) and human peripheral blood lymphocytes. The extract showed protective effects against ferric-induced oxidative stress and had a relatively good antioxidant activity. It also ameliorated the H2O2 mediated DNA damage in lymphocytes, suggesting the protective effect against oxidative DNA damage. The methanolic extract of B. crataegina fruits may be a potential antioxidant nutrient and also may exert a protective role against lipid peroxidation as well as oxidative DNA damage.

  14. Respiratory Information Extraction from Electrocardiogram Signals

    KAUST Repository

    Amin, Gamal El Din Fathy

    2010-12-01

    The Electrocardiogram (ECG) is a tool measuring the electrical activity of the heart, and it is extensively used for diagnosis and monitoring of heart diseases. The ECG signal reflects not only the heart activity but also many other physiological processes. The respiratory activity is a prominent process that affects the ECG signal due to the close proximity of the heart and the lungs. In this thesis, several methods for the extraction of respiratory process information from the ECG signal are presented. These methods allow an estimation of the lung volume and the lung pressure from the ECG signal. The potential benefit of this is to eliminate the corresponding sensors used to measure the respiration activity. A reduction of the number of sensors connected to patients will increase patients’ comfort and reduce the costs associated with healthcare. As a further result, the efficiency of diagnosing respirational disorders will increase since the respiration activity can be monitored with a common, widely available method. The developed methods can also improve the detection of respirational disorders that occur while patients are sleeping. Such disorders are commonly diagnosed in sleeping laboratories where the patients are connected to a number of different sensors. Any reduction of these sensors will result in a more natural sleeping environment for the patients and hence a higher sensitivity of the diagnosis.
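    One classic approach of this kind, shown here for illustration (not necessarily the method developed in the thesis), derives a respiratory signal from the breathing-induced modulation of R-peak amplitudes:

```python
import numpy as np
from scipy.signal import find_peaks

def edr_from_r_amplitudes(ecg, fs):
    """ECG-derived respiration (EDR): locate R peaks, then use the
    sequence of their amplitudes as a surrogate breathing signal."""
    peaks, _ = find_peaks(ecg, distance=int(0.4 * fs), prominence=0.5)
    return peaks / fs, ecg[peaks]        # peak times (s) and EDR samples

# Synthetic demo: a 72-bpm spike train amplitude-modulated at 15 breaths/min.
fs = 250.0
t = np.arange(0, 30, 1 / fs)
resp = 0.2 * np.sin(2 * np.pi * 0.25 * t)
ecg = (1 + resp) * (np.sin(2 * np.pi * 1.2 * t) ** 63)  # crude sharp R peaks
times, edr = edr_from_r_amplitudes(ecg, fs)              # edr tracks 'resp'
```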

  15. Integrating Environmental and Information Systems Management: An Enterprise Architecture Approach

    Science.gov (United States)

    Noran, Ovidiu

    Environmental responsibility is fast becoming an important aspect of strategic management as the reality of climate change settles in and relevant regulations are expected to tighten significantly in the near future. Many businesses react to this challenge by implementing environmental reporting and management systems. However, the environmental initiative is often not properly integrated in the overall business strategy and its information system (IS) and as a result the management does not have timely access to (appropriately aggregated) environmental information. This chapter argues for the benefit of integrating the environmental management (EM) project into the ongoing enterprise architecture (EA) initiative present in all successful companies. This is done by demonstrating how a reference architecture framework and a meta-methodology using EA artefacts can be used to co-design the EM system, the organisation and its IS in order to achieve a much needed synergy.

  16. Integrating Information Services in an Academic Setting: The Organizational and Technical Challenge.

    Science.gov (United States)

    Branin, Joseph J.; And Others

    1993-01-01

    Describes a project to integrate the support and delivery of information services to faculty and staff at the University of Minnesota from the planning phase to implementation of a new organizational entity. Topics addressed include technical and organizational integration, control and delivery of services, and networking and organizational fit.…

  17. Unified Information Access in Product Creation with an Integrated Control Desk

    Science.gov (United States)

    Wrasse, Kevin; Diener, Holger; Hayka, Haygazun; Stark, Rainer

    2017-06-01

    Customers' demand for individualized products leads to a large variety of different products in small series and single-unit production, and high pressure for flexibility in product creation is one result of this trend. To counteract this pressure, the steadily increasing volume of information generated by Industry 4.0 must be made available at the workplace. Additionally, a better exchange of information between product development, production planning and production is necessary. Improving individual systems, such as CAD, PDM, ERP and MES, can only achieve this to a limited extent; since these are mostly systems from different manufacturers, the necessary deeper integration of information is only feasible for SMEs to a limited extent. The presented control desk supports more flexible product creation as well as information exchange. It captures information from different IT systems in the production process and presents it in an integrated, task-oriented form aligned with the user's mental model, e.g. production information combined with the 3D model of product parts, or product development information shown on the 3D model of the production. The solution is a digital 3D model of the manufacturing environment, enriched with billboards for a quick information overview and web-service windows for access to detailed MES and PDM information. This reduces the level of abstraction and allows users to react to changed requirements in the short term, making informed decisions. Interaction with the control desk utilizes the touch capabilities of mobile and fixed systems such as smartphones, tablets and multitouch tables.

  18. Information extraction and knowledge graph construction from geoscience literature

    Science.gov (United States)

    Wang, Chengbin; Ma, Xiaogang; Chen, Jianguo; Chen, Jingwen

    2018-03-01

    Geoscience literature published online is an important part of open data, and brings both challenges and opportunities for data analysis. Compared with studies of numerical geoscience data, there are limited works on information extraction and knowledge discovery from textual geoscience data. This paper presents a workflow and a few empirical case studies for that topic, with a focus on documents written in Chinese. First, we set up a hybrid corpus combining the generic and geology terms from geology dictionaries to train Chinese word segmentation rules of the Conditional Random Fields model. Second, we used the word segmentation rules to parse documents into individual words, and removed the stop-words from the segmentation results to get a corpus constituted of content-words. Third, we used a statistical method to analyze the semantic links between content-words, and we selected the chord and bigram graphs to visualize the content-words and their links as nodes and edges in a knowledge graph, respectively. The resulting graph presents a clear overview of key information in an unstructured document. This study proves the usefulness of the designed workflow, and shows the potential of leveraging natural language processing and knowledge graph technologies for geoscience.
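
    The bigram step of such a workflow can be sketched as follows; segmentation and stop-word removal are assumed to have already produced lists of content words, and the toy sentences are hypothetical.

      # Sketch: count adjacent content-word pairs and keep the salient ones as
      # weighted edges of a knowledge graph.
      from collections import Counter
      import networkx as nx

      def bigram_graph(sentences, min_count=2):
          counts = Counter()
          for words in sentences:
              counts.update(zip(words, words[1:]))      # adjacent pairs
          g = nx.Graph()
          for (w1, w2), c in counts.items():
              if c >= min_count:                        # statistical salience
                  g.add_edge(w1, w2, weight=c)
          return g

      g = bigram_graph([["granite", "intrusion", "gold", "mineralization"],
                        ["gold", "mineralization", "fault", "zone"]], min_count=1)
      print(g.edges(data=True))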

  19. Multi-sensor integration for autonomous robots in nuclear power plants

    International Nuclear Information System (INIS)

    Mann, R.C.; Jones, J.P.; Beckerman, M.; Glover, C.W.; Farkas, L.; Bilbro, G.L.; Snyder, W.

    1989-01-01

    As part of a concerted R&D program in advanced robotics for hazardous environments, scientists and engineers at the Oak Ridge National Laboratory (ORNL) are performing research in the areas of systems integration, range-sensor-based 3-D world modeling, and multi-sensor integration. This program features a unique teaming arrangement that involves the universities of Florida, Michigan, Tennessee, and Texas; Odetics Corporation; and ORNL. This paper summarizes work directed at integrating information extracted from data collected with range sensors and CCD cameras on-board a mobile robot, in order to produce reliable descriptions of the robot's environment. Specifically, the paper describes the integration of two-dimensional vision and sonar range information, and an approach to integrate registered luminance and laser range images. All operations are carried out on-board the mobile robot using a 16-processor hypercube computer. 14 refs., 4 figs

  20. Integrated project management information systems: the French nuclear industry experience

    International Nuclear Information System (INIS)

    Jacquin, J.-C.; Caupin, G.-M.

    1990-01-01

    The article discusses the desirability of integrated project management systems within the French nuclear power industry. Change in demand for nuclear generation facilities over the last two decades has necessitated a change of policy concerning organization, cost and planning within the industry. Large corporate systems can benefit from integrating equipment and bulk materials tracking. Project management for the nuclear industry will, in future, need to incorporate computer aided design tools and project management information systems data bases as well as equipment and planning data. (UK)

  1. Integrated project management information systems: the French nuclear industry experience

    Energy Technology Data Exchange (ETDEWEB)

    Jacquin, J.-C.; Caupin, G.-M.

    1990-03-01

    The article discusses the desirability of integrated project management systems within the French nuclear power industry. Change in demand for nuclear generation facilities over the last two decades has necessitated a change of policy concerning organization, cost and planning within the industry. Large corporate systems can benefit from integrating equipment and bulk materials tracking. Project management for the nuclear industry will, in future, need to incorporate computer aided design tools and project management information systems data bases as well as equipment and planning data. (UK).

  2. The Integration of the Information and Communication Functions, and the Marketing of the Resulting Products.

    Science.gov (United States)

    Harris, Susan C.

    1985-01-01

    Discusses the theoretical basis for integration of information functions and communication functions, the relevance of this integration in the scientific information cycle, and its positive effect on commodity research networks. The application of this theory is described using three commodity programs of the Centro Internacional de Agricultura…

  3. VaProS: a database-integration approach for protein/genome information retrieval

    KAUST Repository

    Gojobori, Takashi; Ikeo, Kazuho; Katayama, Yukie; Kawabata, Takeshi; Kinjo, Akira R.; Kinoshita, Kengo; Kwon, Yeondae; Migita, Ohsuke; Mizutani, Hisashi; Muraoka, Masafumi; Nagata, Koji; Omori, Satoshi; Sugawara, Hideaki; Yamada, Daichi; Yura, Kei

    2016-01-01

    Life science research now heavily relies on all sorts of databases for genome sequences, transcription, protein three-dimensional (3D) structures, protein–protein interactions, phenotypes and so forth. The knowledge accumulated by all the omics research is so vast that a computer-aided search of data is now a prerequisite for starting a new study. In addition, a combinatory search throughout these databases has a chance to extract new ideas and new hypotheses that can be examined by wet-lab experiments. By virtually integrating the related databases on the Internet, we have built a new web application that helps life science researchers retrieve experts' knowledge stored in the databases and build new hypotheses about the research target. This web application, named VaProS, emphasizes the interconnection between the functional information of genome sequences and protein 3D structures, such as the structural effects of gene mutations. In this manuscript, we present the notion of VaProS, the databases and tools that can be accessed without any knowledge of database locations and data formats, and the power of search exemplified in the quest for the molecular mechanisms of lysosomal storage disease. VaProS can be freely accessed at http://p4d-info.nig.ac.jp/vapros/.

  4. VaProS: a database-integration approach for protein/genome information retrieval

    KAUST Repository

    Gojobori, Takashi

    2016-12-24

    Life science research now heavily relies on all sorts of databases for genome sequences, transcription, protein three-dimensional (3D) structures, protein–protein interactions, phenotypes and so forth. The knowledge accumulated by all the omics research is so vast that a computer-aided search of data is now a prerequisite for starting a new study. In addition, a combinatory search throughout these databases has a chance to extract new ideas and new hypotheses that can be examined by wet-lab experiments. By virtually integrating the related databases on the Internet, we have built a new web application that helps life science researchers retrieve experts' knowledge stored in the databases and build new hypotheses about the research target. This web application, named VaProS, emphasizes the interconnection between the functional information of genome sequences and protein 3D structures, such as the structural effects of gene mutations. In this manuscript, we present the notion of VaProS, the databases and tools that can be accessed without any knowledge of database locations and data formats, and the power of search exemplified in the quest for the molecular mechanisms of lysosomal storage disease. VaProS can be freely accessed at http://p4d-info.nig.ac.jp/vapros/.

  5. The integration of information and communication technology into community pharmacists practice in Barcelona.

    Science.gov (United States)

    Lupiáñez-Villanueva, Francisco; Hardey, Michael; Lluch, Maria

    2014-03-01

    The study aims to identify community pharmacists' (CPs) utilization of information and communication technology (ICT); to develop and characterize a typology of CPs' utilization of ICT and to identify factors that can enhance or inhibit the use of these technologies. An online survey of the 7649 members of the Pharmacist Association of Barcelona who had a registered email account in 2006 was carried out. Factor analysis, cluster analysis and binomial logit modelling were undertaken. Multivariate analysis of the CPs' responses to the survey (648) revealed two profiles of adoption of ICT. The first profile (40.75%) represents those CPs who place high emphasis on ICT within their practice. This group is therefore referred to as 'integrated CPs'. The second profile (59.25%) represents those CPs who make less use of ICT and so are consequently labelled 'non-integrated CPs'. Statistical modelling was used to identify variables that were important in predisposing CPs to integrate ICT with their work. From the analysis it is evident that responses to questions relating to 'recommend patients going on line for health information'; 'patients discuss or share their Internet health information findings'; 'emphasis on the Internet for communication and dissemination' and 'Pharmacists Professional Association information' play a positive and significant role in the probability of being an 'integrated CP'. The integration of ICT within CPs' practices cannot be adequately understood and appreciated without examining how CPs are making use of ICT within their own practice, their organizational context and the nature of the pharmacists-client relationship.

  6. Information Literacy for Multiple Disciplines: Toward a Campus-Wide Integration Model at Indiana University, Bloomington

    Directory of Open Access Journals (Sweden)

    Brian Winterman

    2011-11-01

    Each discipline has a set of shared values and thought processes that students must master in order to become participants in that discipline. Information literacy as defined by the ACRL is a set of standards and principles that can apply to all disciplines. In order to produce information-literate undergraduates in a given discipline, information literacy standards must be integrated with the values and processes of the discipline. In this study, librarians partnered with faculty in gender studies and molecular biology to integrate information literacy with courses in those areas. Student performance and attitudes improved as a result of the collaboration. This article discusses the collaboration process, the assessment methods and results, and the long-term importance of developing best practices for information literacy integration at the campus level through a disciplinary approach.

  7. An open, component-based information infrastructure for integrated health information networks.

    Science.gov (United States)

    Tsiknakis, Manolis; Katehakis, Dimitrios G; Orphanoudakis, Stelios C

    2002-12-18

    A fundamental requirement for achieving continuity of care is the seamless sharing of multimedia clinical information. Different technological approaches can be adopted for enabling the communication and sharing of health record segments. In the context of the emerging global information society, the creation of and access to the integrated electronic health record (I-EHR) of a citizen has been assigned high priority in many countries. This requirement is complementary to an overall requirement for the creation of a health information infrastructure (HII) to support the provision of a variety of health telematics and e-health services. In developing a regional or national HII, the components or building blocks that make up the overall information system ought to be defined and an appropriate component architecture specified. This paper discusses current international priorities and trends in developing the HII. It presents technological challenges and alternative approaches towards the creation of an I-EHR, being the aggregation of health data created during all interactions of an individual with the healthcare system. It also presents results from an ongoing Research and Development (R&D) effort towards the implementation of the HII in HYGEIAnet, the regional health information network of Crete, Greece, using a component-based software engineering approach. Critical design decisions and related trade-offs, involved in the process of component specification and development, are also discussed and the current state of development of an I-EHR service is presented. Finally, Human Computer Interaction (HCI) and security issues, which are important for the deployment and use of any I-EHR service, are considered.

  8. Extraction of neutron spectral information from Bonner-Sphere data

    CERN Document Server

    Haney, J H; Zaidins, C S

    1999-01-01

    We have extended a least-squares method of extracting neutron spectral information from Bonner-sphere data which was previously developed by Zaidins et al. (Med. Phys. 5 (1978) 42). A pulse-height analysis with background stripping is employed, which provides a more accurate count rate for each sphere. Newer response curves by Mares and Schraube (Nucl. Instr. and Meth. A 366 (1994) 461) were included for the moderating spheres and the bare detector which comprise the Bonner spectrometer system. Finally, the neutron energy spectrum of interest was divided, using the philosophy of fuzzy logic, into three trapezoidal regimes corresponding to slow, moderate, and fast neutrons. Spectral data were taken using a PuBe source in two different environments, and the analyzed data are presented for these cases as slow, moderate, and fast neutron fluences. (author)
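
    The least-squares core of such an unfolding can be sketched as a non-negative solve of the response equations; the response matrix and count rates below are placeholders, not the paper's data.

      # Sketch: solve R @ phi ~= c for non-negative group fluences phi, with
      # three groups mirroring the slow/moderate/fast split.
      import numpy as np
      from scipy.optimize import nnls

      R = np.array([[0.9, 0.3, 0.1],    # bare detector: mostly slow response
                    [0.4, 0.8, 0.3],    # small moderating sphere
                    [0.1, 0.5, 0.9]])   # large sphere: mostly fast response
      c = np.array([2.1, 2.9, 2.6])     # measured count rates (arbitrary units)

      phi, resid = nnls(R, c)           # non-negative least squares
      print("slow/moderate/fast fluences:", phi, "residual norm:", resid)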

  9. A scalable architecture for extracting, aligning, linking, and visualizing multi-Int data

    Science.gov (United States)

    Knoblock, Craig A.; Szekely, Pedro

    2015-05-01

    An analyst today has a tremendous amount of data available, but each of the various data sources typically exists in their own silos, so an analyst has limited ability to see an integrated view of the data and has little or no access to contextual information that could help in understanding the data. We have developed the Domain-Insight Graph (DIG) system, an innovative architecture for extracting, aligning, linking, and visualizing massive amounts of domain-specific content from unstructured sources. Under the DARPA Memex program we have already successfully applied this architecture to multiple application domains, including the enormous international problem of human trafficking, where we extracted, aligned and linked data from 50 million online Web pages. DIG builds on our Karma data integration toolkit, which makes it easy to rapidly integrate structured data from a variety of sources, including databases, spreadsheets, XML, JSON, and Web services. The ability to integrate Web services allows Karma to pull in live data from the various social media sites, such as Twitter, Instagram, and OpenStreetMaps. DIG then indexes the integrated data and provides an easy to use interface for query, visualization, and analysis.

  10. A Method of Road Extraction from High-resolution Remote Sensing Images Based on Shape Features

    Directory of Open Access Journals (Sweden)

    LEI Xiaoqi

    2016-02-01

    Road extraction from high-resolution remote sensing images is an important and difficult task. Since remote sensing images contain complicated information, methods that extract roads using spectral, texture and linear features have certain limitations. Moreover, many methods need human intervention to obtain road seeds (semi-automatic extraction), which makes them heavily human-dependent and inefficient. A road-extraction method is proposed that uses image segmentation based on the principle of local gray consistency and integrates shape features. First, the image is segmented, and both linear and curved roads are identified using several object shape features, rectifying methods that extract only linear roads. Second, road extraction is carried out by region growing: road seeds are selected automatically and the road network is extracted. Finally, the extracted roads are regularized by combining edge information. In the experiments, images with relatively uniform road gray levels but poorly illuminated road surfaces were chosen, and the results show that the proposed method is promising.
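
    A minimal sketch of region growing under local gray consistency is given below; the tolerance and the 4-connected neighborhood are assumptions, and seed selection (automatic in the paper) is left to the caller.

      # Sketch: grow a region from a seed pixel while neighbors stay within a
      # gray-level tolerance of the seed.
      import numpy as np
      from collections import deque

      def region_grow(img, seed, tol=10):
          h, w = img.shape
          mask = np.zeros((h, w), bool)
          mask[seed] = True
          q = deque([seed])
          while q:
              y, x = q.popleft()
              for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                  if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                          and abs(int(img[ny, nx]) - int(img[seed])) <= tol):
                      mask[ny, nx] = True          # locally gray-consistent
                      q.append((ny, nx))
          return mask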

  11. Information Extraction for Clinical Data Mining: A Mammography Case Study.

    Science.gov (United States)

    Nassif, Houssam; Woods, Ryan; Burnside, Elizabeth; Ayvaci, Mehmet; Shavlik, Jude; Page, David

    2009-01-01

    Breast cancer is the leading cause of cancer mortality in women between the ages of 15 and 54. During mammography screening, radiologists use a strict lexicon (BI-RADS) to describe and report their findings. Mammography records are then stored in a well-defined database format (NMD). Lately, researchers have applied data mining and machine learning techniques to these databases. They successfully built breast cancer classifiers that can help in early detection of malignancy. However, the validity of these models depends on the quality of the underlying databases. Unfortunately, most databases suffer from inconsistencies, missing data, inter-observer variability and inappropriate term usage. In addition, many databases are not compliant with the NMD format and/or solely consist of text reports. BI-RADS feature extraction from free text and consistency checks between recorded predictive variables and text reports are crucial to addressing this problem. We describe a general scheme for concept information retrieval from free text given a lexicon, and present a BI-RADS feature extraction algorithm for clinical data mining. It consists of a syntax analyzer, a concept finder and a negation detector. The syntax analyzer preprocesses the input into individual sentences. The concept finder uses a semantic grammar based on the BI-RADS lexicon and the experts' input. It parses sentences detecting BI-RADS concepts. Once a concept is located, a lexical scanner checks for negation. Our method can handle multiple latent concepts within the text, filtering out ultrasound concepts. On our dataset, our algorithm achieves 97.7% precision, 95.5% recall and an F1-score of 0.97. It outperforms manual feature extraction at the 5% statistical significance level.
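
    The concept-finder and negation-detector stages can be illustrated with a toy version; the lexicon entries and negation cues below are illustrative stand-ins for the full BI-RADS semantic grammar.

      # Sketch: find lexicon terms per sentence, then check the text before
      # the term for a negation cue.
      import re

      BIRADS_TERMS = {"mass", "calcification", "asymmetry"}
      NEGATION_CUES = {"no ", "without ", "absence of ", "negative for "}

      def extract_concepts(report):
          findings = []
          for sentence in re.split(r"[.;]\s*", report.lower()):
              for term in BIRADS_TERMS:
                  if term in sentence:
                      before = sentence.split(term)[0]
                      negated = any(cue in before for cue in NEGATION_CUES)
                      findings.append((term, "absent" if negated else "present"))
          return findings

      print(extract_concepts("There is no mass. Scattered calcifications are seen."))
      # -> [('mass', 'absent'), ('calcification', 'present')]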

  12. Extending Current Theories of Cross-Boundary Information Sharing and Integration: A Case Study of Taiwan e-Government

    Science.gov (United States)

    Yang, Tung-Mou

    2011-01-01

    Information sharing and integration has long been considered an important approach for increasing organizational efficiency and performance. With advancements in information and communication technologies, sharing and integrating information across organizations becomes more attractive and practical to organizations. However, achieving…

  13. Integration of Hospital Information and Clinical Decision Support Systems to Enable the Reuse of Electronic Health Record Data.

    Science.gov (United States)

    Kopanitsa, Georgy

    2017-05-18

    The efficiency and acceptance of clinical decision support systems (CDSS) can increase if they reuse medical data captured during health care delivery. High heterogeneity of the existing legacy data formats has become the main barrier to the reuse of data. Thus, we need to apply data modeling mechanisms that provide standardization, transformation, accumulation and querying of medical data to allow its reuse. In this paper, we focus on the interoperability issues of hospital information system (HIS) and CDSS data integration. Our study is based on the approach proposed by Marcos et al., where archetypes are used as a standardized mechanism for the interaction of a CDSS with an electronic health record (EHR). We built an integration tool to enable CDSSs to collect data from various institutions without needing modifications to the implementation. The approach implies the development of a conceptual level as a set of archetypes representing the concepts required by a CDSS. Treatment case data from the Regional Clinical Hospital in Tomsk, Russia were extracted, transformed and loaded into the archetype database of a clinical decision support system. Normalization of the test records was performed by defining transformation and aggregation rules between the EHR data and the archetypes. These mapping rules were used to automatically generate openEHR-compliant data. After the transformation, archetype data instances were loaded into the CDSS archetype-based data storage. The extraction stage showed acceptable performance, with a mean of 17.428 s per year of records (3436 case records). Transformation times were also acceptable, at 136.954 s per year (0.039 s per instance). The accuracy evaluation showed the correctness and applicability of the method for a wide range of HISs. These operations were performed without interrupting the HIS workflow, so that service provision to users was not disturbed. The project results have proven that
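
    The transformation step can be sketched as declarative mapping rules from legacy HIS fields to archetype-shaped paths; the field names and paths below are illustrative, not actual openEHR archetype definitions.

      # Sketch: apply (legacy field -> archetype path) rules with converters.
      LEGACY = {"sys_bp": "142", "dia_bp": "91", "bp_date": "2016-03-04"}

      RULES = {  # archetype path: (legacy field, converter)
          "blood_pressure/systolic/magnitude": ("sys_bp", float),
          "blood_pressure/diastolic/magnitude": ("dia_bp", float),
          "blood_pressure/time": ("bp_date", str),
      }

      def transform(record, rules):
          instance = {}
          for path, (field, convert) in rules.items():
              if field in record:
                  instance[path] = convert(record[field])  # normalize value
          return instance

      print(transform(LEGACY, RULES))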

  14. Management information system for cost-schedule integration control for nuclear power projects

    International Nuclear Information System (INIS)

    Liu Wei; Wang Yongqing; Tian Li

    2001-01-01

    Based on project management experience abroad and at home, a cost-schedule integration control model was developed to improve nuclear power project management. The model integrates cost data with scheduling data through unified coding to efficiently implement cost-schedule integration control online. The software system architecture and database are designed and implemented. System functions include dynamically estimating and forecasting cash flow, and scheduling and evaluating deviations from the cost-schedule plan. The research and development of the system should improve the architecture of computer-integrated management information systems for nuclear power projects in China
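
    The abstract does not spell out its deviation metrics; one standard way to evaluate deviation from a cost-schedule plan on integrated data is earned-value analysis, sketched below with hypothetical figures rather than the paper's model.

      # Sketch: earned-value variances on unified cost/schedule data.
      planned_value = 1_200_000.0  # budgeted cost of work scheduled to date
      earned_value = 1_050_000.0   # budgeted cost of work actually performed
      actual_cost = 1_180_000.0    # actual cost of work performed

      cost_variance = earned_value - actual_cost        # < 0: over budget
      schedule_variance = earned_value - planned_value  # < 0: behind schedule
      print(f"CV = {cost_variance:,.0f}, SV = {schedule_variance:,.0f}")
      print(f"CPI = {earned_value / actual_cost:.2f}, "
            f"SPI = {earned_value / planned_value:.2f}")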

  15. Economic Analysis of an Integrated Annatto Seeds-Sugarcane Biorefinery Using Supercritical CO2 Extraction as a First Step

    Directory of Open Access Journals (Sweden)

    Juliana Q. Albarelli

    2016-06-01

    Recently, supercritical fluid extraction (SFE) has been indicated for use as part of a biorefinery, rather than as a stand-alone technology, since besides extracting added-value compounds selectively it has been shown to have a positive effect on the downstream processing of biomass. To this extent, this work economically evaluates the encouraging experimental results regarding the use of SFE in annatto seed valorization. Additionally, other features are discussed, such as the benefits of enhancing the bioactive compound concentration through physical processes and of integrating the proposed annatto seed biorefinery with a hypothetical sugarcane biorefinery that produces its essential inputs, e.g., CO2, ethanol, heat and electricity. To this end, different configurations were first modeled and simulated using the commercial simulator Aspen Plus® to determine the mass and energy balances; each configuration was then economically assessed using MATLAB. SFE proved decisive to the economic feasibility of the proposed annatto seeds-sugarcane biorefinery concept. SFE pretreatment associated with a sequential fine-particle separation process enabled higher production of bixin-rich extract by a low-pressure solvent extraction method employing ethanol, while a tocotrienol-rich extract is obtained as a first product. Nevertheless, the economic evaluation showed that increasing tocotrienol-rich extract production has a more pronounced positive impact on the economic viability of the concept.

  16. Internationalisation of information services for publishers' open access policies: the DINI multilingual integration layer

    Science.gov (United States)

    Scholze, Frank

    2008-01-01

    It is essential for the strategy of open access self-archiving that scientific authors are given comprehensive information on publisher copyright policies. DINI, the German Initiative for Networked Information, has developed a German (and potentially multilingual) interface to the English SHERPA/RoMEO service to provide additional information on German publishers' open access policies. As a next step, this interface was enhanced to an integration layer combining different sources on publisher copyright policies. This integration layer can be used in many different contexts. Together with the SHERPA/RoMEO team, DINI aims to build an international support structure for open access information. PMID:18662383

  17. The Relationship between Hospital Financial Performance and Information Technology Integration Strategy Selection

    Science.gov (United States)

    Xie, Yue

    2012-01-01

    In light of the new healthcare regulations, hospitals are increasingly reevaluating their IT integration strategies to meet expanded healthcare information exchange requirements. Nevertheless, hospital executives do not have all the information they need to differentiate between the available strategies and recognize what may better fit their…

  18. The informed application of building-integrated wind power

    Energy Technology Data Exchange (ETDEWEB)

    Breshears, J.; Briscoe, C. [Zimmer Gunsal Frasca Architects, Portland, OR (United States)

    2009-07-01

    This paper reported on an exercise that was undertaken to integrate small-scale wind turbines into the design of an urban high-rise in Portland, Oregon. Wind behaviour in the urban environment is very complex, as the flow of wind over and around buildings often triggers multiple transitions of the air from laminar flow to turbulent. The study documented the process of moving beyond a simplistic approach to a truly informed application of building-integrated wind generation. The 4 key issues addressed in the study process were quantifying the geographical wind regime; predicting wind flow over the building; turbine selection; and pragmatics regarding the design of roof mounting to accommodate structural loads and mitigate vibration. The results suggested that the turbine array should produce in the range of only 1 per cent of the electrical load of the building. 13 refs., 11 figs.

  19. Overview of image processing tools to extract physical information from JET videos

    Science.gov (United States)

    Craciunescu, T.; Murari, A.; Gelfusa, M.; Tiseanu, I.; Zoita, V.; EFDA Contributors, JET

    2014-11-01

    In magnetic confinement nuclear fusion devices such as JET, the last few years have witnessed a significant increase in the use of digital imagery, not only for the surveying and control of experiments, but also for the physical interpretation of results. More than 25 cameras are routinely used for imaging on JET in the infrared (IR) and visible spectral regions. These cameras can produce up to tens of Gbytes per shot and their information content can be very different, depending on the experimental conditions. However, the relevant information about the underlying physical processes is generally of much reduced dimensionality compared to the recorded data. The extraction of this information, which allows full exploitation of these diagnostics, is a challenging task. The image analysis consists, in most cases, of inverse problems which are typically ill-posed mathematically. The typology of objects to be analysed is very wide, and usually the images are affected by noise, low levels of contrast, low grey-level in-depth resolution, reshaping of moving objects, etc. Moreover, the plasma events have time constants of ms or tens of ms, which imposes tough conditions for real-time applications. On JET, in the last few years new tools and methods have been developed for physical information retrieval. The methodology of optical flow has allowed, under certain assumptions, the derivation of information about the dynamics of video objects associated with different physical phenomena, such as instabilities, pellets and filaments. The approach has been extended in order to approximate the optical flow within the MPEG compressed domain, allowing the manipulation of the large JET video databases and, in specific cases, even real-time data processing. The fast visible camera may provide new information that is potentially useful for disruption prediction. A set of methods, based on the extraction of structural information from the visual scene, have been developed for the
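
    The optical-flow step can be illustrated with a generic dense-flow computation between two consecutive frames; the synthetic frames and parameter values below are assumptions, not the JET pipeline.

      # Sketch: Farneback dense optical flow between two grayscale frames.
      import cv2
      import numpy as np

      prev = np.zeros((64, 64), np.uint8)
      prev[20:30, 20:30] = 255              # a bright blob ("video object")
      curr = np.roll(prev, 3, axis=1)       # same blob shifted 3 px right

      flow = cv2.calcOpticalFlowFarneback(prev, curr, None, pyr_scale=0.5,
                                          levels=3, winsize=15, iterations=3,
                                          poly_n=5, poly_sigma=1.2, flags=0)
      speed = np.linalg.norm(flow, axis=2)  # per-pixel displacement magnitude
      print("max apparent motion (px/frame):", round(float(speed.max()), 1))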

  20. Overview of image processing tools to extract physical information from JET videos

    International Nuclear Information System (INIS)

    Craciunescu, T; Tiseanu, I; Zoita, V; Murari, A; Gelfusa, M

    2014-01-01

    In magnetic confinement nuclear fusion devices such as JET, the last few years have witnessed a significant increase in the use of digital imagery, not only for the surveying and control of experiments, but also for the physical interpretation of results. More than 25 cameras are routinely used for imaging on JET in the infrared (IR) and visible spectral regions. These cameras can produce up to tens of Gbytes per shot and their information content can be very different, depending on the experimental conditions. However, the relevant information about the underlying physical processes is generally of much reduced dimensionality compared to the recorded data. The extraction of this information, which allows full exploitation of these diagnostics, is a challenging task. The image analysis consists, in most cases, of inverse problems which are typically ill-posed mathematically. The typology of objects to be analysed is very wide, and usually the images are affected by noise, low levels of contrast, low grey-level in-depth resolution, reshaping of moving objects, etc. Moreover, the plasma events have time constants of ms or tens of ms, which imposes tough conditions for real-time applications. On JET, in the last few years new tools and methods have been developed for physical information retrieval. The methodology of optical flow has allowed, under certain assumptions, the derivation of information about the dynamics of video objects associated with different physical phenomena, such as instabilities, pellets and filaments. The approach has been extended in order to approximate the optical flow within the MPEG compressed domain, allowing the manipulation of the large JET video databases and, in specific cases, even real-time data processing. The fast visible camera may provide new information that is potentially useful for disruption prediction. A set of methods, based on the extraction of structural information from the visual scene, have been developed for the

  1. Extraction of prospecting information of uranium deposits based on high spatial resolution satellite data. Taking the Bashibulake region as an example

    International Nuclear Information System (INIS)

    Yang Xu; Liu Dechang; Zhang Jielin

    2008-01-01

    In this study, the significance and content of prospecting information for uranium deposits are expounded. QuickBird high spatial resolution satellite data are used to extract prospecting information for uranium deposits in the Bashibulake area in the north of the Tarim Basin. Using pertinent image processing methods, information on the ore-bearing bed, ore-controlling structure and mineralized alteration has been extracted. The results show high consistency with the field survey. The aim of this study is to explore the practicability of high spatial resolution satellite data for mineral prospecting, and to broaden approaches to prospecting in similar areas. (authors)

  2. The development of an integrated IT system at Albian Sands Energy

    Energy Technology Data Exchange (ETDEWEB)

    Michaud, L. H. [Albian Sands Energy Inc., Fort McMurray, AB (Canada)

    2003-01-01

    Factors considered in the selection, implementation and integration of computer applications in an oil sands surface mining and extraction operation are discussed. The company's objective in choosing the system was to optimize the use of information and to meet technical, business and information technology requirements. In a departure from typical practice where the system is selected by the information technology team, with minimal input from the technical and business units, in the case of Albian Sands Energy the company's technical and business people were closely involved in the selection process. Integration of the system was a primary consideration, including linking all applications through a data warehouse and electronic data management system. Details of the applications architecture, construction of the applications inventory, selection of the applications, identification of integration requirements, project management issues, and benefits of an integrated system are described. 6 refs., 2 tabs., 1 fig.

  3. Extracting Low-Frequency Information from Time Attenuation in Elastic Waveform Inversion

    Science.gov (United States)

    Guo, Xuebao; Liu, Hong; Shi, Ying; Wang, Weihong

    2017-03-01

    Low-frequency information is crucial for recovering background velocity, but its absence in field data makes inversion impractical without accurate initial models. Laplace-Fourier domain waveform inversion can recover a smooth model from real data without low-frequency information, and this model can be used as an ideal starting model for subsequent inversion. In general, it also starts with low frequencies and includes higher frequencies at later inversion stages; the difference is that its ultralow-frequency information comes from the Laplace-Fourier domain. Meanwhile, a direct implementation of the Laplace-transformed wavefield using frequency-domain inversion is also very convenient. However, because broad frequency bands are often used in pure time-domain waveform inversion, it is difficult to extract the wavefields dominated by low frequencies in this case. In this paper, low-frequency components are constructed by introducing time attenuation into the recorded residuals, and the rest of the method is identical to traditional time-domain inversion. Time windowing and frequency filtering are also applied to mitigate the ambiguity of the inverse problem. We can therefore start at low frequencies and move to higher frequencies. Experiments show that the proposed method can achieve a good inversion result starting from a linear initial model and records without low-frequency information.
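
    The time-attenuation step itself is simple: damping the residual with a decaying exponential emphasizes early arrivals and, in the Laplace sense, the low-frequency content. The sketch below uses assumed array shapes and an assumed damping constant.

      # Sketch: multiply time-domain residuals by exp(-sigma * t).
      import numpy as np

      def attenuate(residual, dt, sigma):
          # residual: (nt, ntraces); dt: sample interval in s; sigma: 1/s
          t = np.arange(residual.shape[0]) * dt
          return residual * np.exp(-sigma * t)[:, None]   # damp late times

      r = np.random.randn(2000, 48)          # hypothetical recorded residuals
      r_low = attenuate(r, dt=0.002, sigma=4.0)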

  4. Performance measurement integrated information framework in e-Manufacturing

    Science.gov (United States)

    Teran, Hilaida; Hernandez, Juan Carlos; Vizán, Antonio; Ríos, José

    2014-11-01

    The implementation of Internet technologies has led to e-Manufacturing technologies becoming more widely used and to the development of tools for compiling, transforming and synchronising manufacturing data through the Web. In this context, a potential area for development is the extension of virtual manufacturing to performance measurement (PM) processes, a critical area for decision making and implementing improvement actions in manufacturing. This paper proposes a PM information framework to integrate decision support systems in e-Manufacturing. Specifically, the proposed framework offers a homogeneous PM information exchange model that can be applied through decision support in e-Manufacturing environment. Its application improves the necessary interoperability in decision-making data processing tasks. It comprises three sub-systems: a data model, a PM information platform and PM-Web services architecture. A practical example of data exchange for measurement processes in the area of equipment maintenance is shown to demonstrate the utility of the model.

  5. Study on Integrating Accounting in the Information System of the Organization

    Directory of Open Access Journals (Sweden)

    Ciuhureanu Alina-Teodora

    2017-12-01

    In order to gain power in an ever-changing economy and in diversified markets, the organization must have an up-to-date information system that enables managers to get a detailed understanding of the organization's status and to obtain the resource needed to manage: information. Starting from these premises, the empirical research presents the components of the information system. One of the main contributions, however, is to synthesize the opinions of specialists and to create a logical scheme of the accounting information system. Moreover, through selective research, the article analyzes managers' interest in accounting information and its integration into the information system of the organization.

  6. How to integrate proxy data from two informants in life event assessment in psychological autopsy.

    Science.gov (United States)

    Zhang, Jie; Wang, Youqing; Fang, Le

    2018-04-27

    Life event assessment is an important part of the psychological autopsy method, and how to integrate proxy data from two informants is a major methodological issue that needs solving. In total, 416 living subjects and two informants for each subject were interviewed by psychological autopsy, and life events were assessed with Paykel's Interview for Recent Life Events. The validity of proxy data integrated using six information reconstruction methods was evaluated, with the living subjects' self-reports used as the gold standard. For all life events, the average Youden Index for proxy data from the type C reconstruction method (choosing the positive value from the two informants) was larger than those of the other five methods. For family-life-related events, proxy data from the type 1st reconstruction method were not significantly different from the living subjects' self-reports (P = 0.828). For all other life events, proxy data from the type C reconstruction method were not significantly different from the gold standard. Choosing the positive value is a relatively better method for integrating dichotomous (positive vs. negative) proxy data from two informants in life event assessment in psychological autopsy, except for family-life-related events. In that case, using the information provided by the 1st informant (mainly a family member) is recommended.
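
    The 'type C' rule and its evaluation reduce to simple operations on dichotomous data, sketched below with toy responses; the real study compares six reconstruction methods against self-reports.

      # Sketch: integrate two informants by choosing the positive value, then
      # score against self-reports with the Youden Index (sens + spec - 1).
      import numpy as np

      def integrate_positive(inf1, inf2):
          return np.maximum(inf1, inf2)       # positive if either reports it

      def youden_index(proxy, gold):
          sens = np.mean(proxy[gold == 1] == 1)
          spec = np.mean(proxy[gold == 0] == 0)
          return sens + spec - 1

      gold = np.array([1, 0, 1, 0, 1])        # subjects' own reports (toy data)
      proxy = integrate_positive(np.array([1, 0, 0, 0, 1]),
                                 np.array([1, 1, 1, 0, 0]))
      print(youden_index(proxy, gold))        # 0.5 on this toy data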

  7. A Risk Assessment System with Automatic Extraction of Event Types

    Science.gov (United States)

    Capet, Philippe; Delavallade, Thomas; Nakamura, Takuya; Sandor, Agnes; Tarsitano, Cedric; Voyatzi, Stavroula

    In this article we describe the joint effort of experts in linguistics, information extraction and risk assessment to integrate EventSpotter, an automatic event extraction engine, into ADAC, an automated early warning system. By detecting as early as possible weak signals of emerging risks ADAC provides a dynamic synthetic picture of situations involving risk. The ADAC system calculates risk on the basis of fuzzy logic rules operated on a template graph whose leaves are event types. EventSpotter is based on a general purpose natural language dependency parser, XIP, enhanced with domain-specific lexical resources (Lexicon-Grammar). Its role is to automatically feed the leaves with input data.

  8. Comparison of methods of extracting information for meta-analysis of observational studies in nutritional epidemiology

    Directory of Open Access Journals (Sweden)

    Jong-Myon Bae

    2016-01-01

    OBJECTIVES: A common method for conducting a quantitative systematic review (QSR) of observational studies in nutritional epidemiology is the “highest versus lowest intake” method (HLM), in which only the information concerning the effect size (ES) of the highest intake category of a food item, relative to its lowest category, is collected. In contrast, the interval collapsing method (ICM), a method suggested to enable maximum utilization of all available information, collects the ES information by collapsing all categories into a single category. This study aimed to compare the ES and summary effect size (SES) between the HLM and the ICM. METHODS: A QSR evaluating citrus fruit intake and the risk of pancreatic cancer, with the SES calculated using the HLM, was selected. The ES and SES were estimated by performing a meta-analysis using the fixed-effect model. The directionality and statistical significance of the ES and SES were used as criteria for determining the concordance between the HLM and ICM outcomes. RESULTS: No significant differences were observed in the directionality of the SES extracted using the HLM or the ICM. The application of the ICM, which uses a broader information base, yielded more consistent ES and SES, and narrower confidence intervals, than the HLM. CONCLUSIONS: The ICM is advantageous over the HLM owing to its higher statistical accuracy in extracting information for QSRs in nutritional epidemiology. The application of the ICM should hence be recommended for future studies.
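
    The fixed-effect pooling used here is standard inverse-variance weighting; the sketch below applies it to made-up per-study effect sizes, and the same machinery works whether each study contributes an HLM contrast or an ICM-collapsed category.

      # Sketch: fixed-effect meta-analysis via inverse-variance weights.
      import numpy as np

      log_rr = np.array([-0.22, -0.05, -0.31])  # per-study ln(relative risk)
      se = np.array([0.10, 0.08, 0.15])         # their standard errors

      w = 1.0 / se**2                           # inverse-variance weights
      ses = np.sum(w * log_rr) / np.sum(w)      # summary effect size
      ses_se = np.sqrt(1.0 / np.sum(w))
      lo, hi = ses - 1.96 * ses_se, ses + 1.96 * ses_se
      print(f"SES (ln RR) = {ses:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")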

  9. Unifying Kohlberg with Information Integration: The Moral Algebra of Recompense and of Kohlbergian Moral Informers

    Science.gov (United States)

    Hommers, Wilfried; Lee, Wha-Yong

    2010-01-01

    In order to unify two major theories of moral judgment, a novel task is employed which combines elements of Kohlberg's stage theory and of the theory of information integration. In contrast to the format of Kohlberg's moral judgment interview, a nonverbal and quantitative response which makes low demands on verbal facility was used. Moral…

  10. A COMPARATIVE ANALYSIS OF WEB INFORMATION EXTRACTION TECHNIQUES DEEP LEARNING vs. NAÏVE BAYES vs. BACK PROPAGATION NEURAL NETWORKS IN WEB DOCUMENT EXTRACTION

    Directory of Open Access Journals (Sweden)

    J. Sharmila

    2016-01-01

    Web mining research is becoming increasingly important because a large amount of information is now managed through the web, and web usage is growing in an uncontrolled way; a dedicated framework is required for handling such large volumes of information in the web space. Web mining is classified into three major divisions: web content mining, web usage mining and web structure mining. Tak-Lam Wong proposed a web content mining methodology based on Bayesian Networks (BN), learning to extract web data and discover attributes using a Bayesian approach. Motivated by that investigation, we propose a web content mining methodology based on a deep learning algorithm. Deep learning is preferred over BN here because BN does not fit the learning architecture planned for the proposed system. The main objective of this investigation is web document extraction using different classification algorithms and their analysis. The work extracts content from web URLs and compares three classification algorithms: a deep learning algorithm, a naive Bayes classifier and a back-propagation neural network (BPNN). Deep learning is a powerful set of techniques for learning in neural networks, applied in areas such as computer vision, speech recognition, natural language processing and biometrics; it is also a comparatively simple classification technique with low classification time. Naive Bayes classifiers are a family of basic probabilistic classifiers based on applying Bayes' theorem with strong independence assumptions between the features. The BPNN algorithm is then used for classification. The training and testing datasets contain many URLs, from which the content is extracted.
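
    The naive Bayes baseline is straightforward to reproduce on extracted document text; the categories and documents below are placeholders for content pulled from the URLs.

      # Sketch: bag-of-words features + multinomial naive Bayes classifier.
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.naive_bayes import MultinomialNB

      docs = ["cheap flights and hotel deals",
              "protein folding research paper",
              "buy discount tickets online",
              "genome sequencing methods survey"]
      labels = ["commerce", "science", "commerce", "science"]

      vec = CountVectorizer()
      X = vec.fit_transform(docs)              # term-count features
      clf = MultinomialNB().fit(X, labels)     # Bayes rule + independence
      print(clf.predict(vec.transform(["discount hotel booking site"])))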

  11. Integration of the enterprise electronic health record and anesthesia information management systems.

    Science.gov (United States)

    Springman, Scott R

    2011-09-01

    Fewer than 5% of anesthesia departments use an electronic medical record (EMR) that is anesthesia specific. Many anesthesia information management systems (AIMS) have been developed with a focus only on the unique needs of anesthesia providers, without being fully integrated into other electronic health record components of the entire enterprise medical system. To understand why anesthesia providers should embrace health information technology (HIT) on a health system-wide basis, this article reviews recent HIT history and reviews HIT concepts. The author explores current developments in efforts to expand enterprise HIT, and the pros and cons of full enterprise integration with an AIMS. Copyright © 2011 Elsevier Inc. All rights reserved.

  12. Zone analysis in biology articles as a basis for information extraction.

    Science.gov (United States)

    Mizuta, Yoko; Korhonen, Anna; Mullen, Tony; Collier, Nigel

    2006-06-01

    In the field of biomedicine, an overwhelming amount of experimental data has become available as a result of the high throughput of research in this domain. The amount of results reported has now grown beyond the limits of what can be managed by manual means. This makes it increasingly difficult for the researchers in this area to keep up with the latest developments. Information extraction (IE) in the biological domain aims to provide an effective automatic means to dynamically manage the information contained in archived journal articles and abstract collections and thus help researchers in their work. However, while considerable advances have been made in certain areas of IE, pinpointing and organizing factual information (such as experimental results) remains a challenge. In this paper we propose tackling this task by incorporating into IE information about rhetorical zones, i.e. classification of spans of text in terms of argumentation and intellectual attribution. As the first step towards this goal, we introduce a scheme for annotating biological texts for rhetorical zones and provide a qualitative and quantitative analysis of the data annotated according to this scheme. We also discuss our preliminary research on automatic zone analysis, and its incorporation into our IE framework.

  13. Contextual Sensing: Integrating Contextual Information with Human and Technical Geo-Sensor Information for Smart Cities.

    Science.gov (United States)

    Sagl, Günther; Resch, Bernd; Blaschke, Thomas

    2015-07-14

    In this article we critically discuss the challenge of integrating contextual information, in particular spatiotemporal contextual information, with human and technical sensor information, which we approach from a geospatial perspective. We start by highlighting the significance of context in general and spatiotemporal context in particular and introduce a smart city model of interactions between humans, the environment, and technology, with context at the common interface. We then focus on both the intentional and the unintentional sensing capabilities of today's technologies and discuss current technological trends that we consider have the ability to enrich human and technical geo-sensor information with contextual detail. The different types of sensors used to collect contextual information are analyzed and sorted into three groups on the basis of names considering frequently used related terms, and characteristic contextual parameters. These three groups, namely technical in situ sensors, technical remote sensors, and human sensors are analyzed and linked to three dimensions involved in sensing (data generation, geographic phenomena, and type of sensing). In contrast to other scientific publications, we found a large number of technologies and applications using in situ and mobile technical sensors within the context of smart cities, and surprisingly limited use of remote sensing approaches. In this article we further provide a critical discussion of possible impacts and influences of both technical and human sensing approaches on society, pointing out that a larger number of sensors, increased fusion of information, and the use of standardized data formats and interfaces will not necessarily result in any improvement in the quality of life of the citizens of a smart city. This article seeks to improve our understanding of technical and human geo-sensing capabilities, and to demonstrate that the use of such sensors can facilitate the integration of different

  14. Contextual Sensing: Integrating Contextual Information with Human and Technical Geo-Sensor Information for Smart Cities

    Science.gov (United States)

    Sagl, Günther; Resch, Bernd; Blaschke, Thomas

    2015-01-01

    In this article we critically discuss the challenge of integrating contextual information, in particular spatiotemporal contextual information, with human and technical sensor information, which we approach from a geospatial perspective. We start by highlighting the significance of context in general and spatiotemporal context in particular and introduce a smart city model of interactions between humans, the environment, and technology, with context at the common interface. We then focus on both the intentional and the unintentional sensing capabilities of today’s technologies and discuss current technological trends that we consider have the ability to enrich human and technical geo-sensor information with contextual detail. The different types of sensors used to collect contextual information are analyzed and sorted into three groups on the basis of names considering frequently used related terms, and characteristic contextual parameters. These three groups, namely technical in situ sensors, technical remote sensors, and human sensors are analyzed and linked to three dimensions involved in sensing (data generation, geographic phenomena, and type of sensing). In contrast to other scientific publications, we found a large number of technologies and applications using in situ and mobile technical sensors within the context of smart cities, and surprisingly limited use of remote sensing approaches. In this article we further provide a critical discussion of possible impacts and influences of both technical and human sensing approaches on society, pointing out that a larger number of sensors, increased fusion of information, and the use of standardized data formats and interfaces will not necessarily result in any improvement in the quality of life of the citizens of a smart city. This article seeks to improve our understanding of technical and human geo-sensing capabilities, and to demonstrate that the use of such sensors can facilitate the integration of different

  15. Nonlinear integral equations for the sausage model

    Science.gov (United States)

    Ahn, Changrim; Balog, Janos; Ravanini, Francesco

    2017-08-01

    The sausage model, first proposed by Fateev, Onofri, and Zamolodchikov, is a deformation of the O(3) sigma model preserving integrability. The target space is deformed from the sphere to a ‘sausage’ shape by a deformation parameter ν. The model is defined by a factorizable S-matrix, obtained by deforming that of the O(3) sigma model by a parameter λ. Clues to the deformed sigma model are provided by various pieces of UV and IR information obtained through thermodynamic Bethe ansatz (TBA) analysis based on the S-matrix. Application of the TBA to the sausage model is, however, limited to the case where 1/λ is an integer, for which the coupled integral equations can be truncated to a finite number. In this paper, we propose a finite set of nonlinear integral equations (NLIEs) applicable to generic values of λ. Our derivation is based on T-Q relations extracted from the truncated TBA equations. As a consistency check, we compute next-to-leading-order corrections to the vacuum energy and extract the S-matrix information in the IR limit. We also solve the NLIEs both analytically and numerically in the UV limit to obtain the effective central charge, and compare it with that of the zero-mode dynamics to obtain an exact relation between ν and λ. Dedicated to the memory of Petr Petrovich Kulish.

  16. Intensive care unit nurses' information needs and recommendations for integrated displays to improve nurses' situation awareness.

    Science.gov (United States)

    Koch, Sven H; Weir, Charlene; Haar, Maral; Staggers, Nancy; Agutter, Jim; Görges, Matthias; Westenskow, Dwayne

    2012-01-01

    Fatal errors can occur in intensive care units (ICUs). Researchers claim that information integration at the bedside may improve nurses' situation awareness (SA) of patients and decrease errors. However, it is unclear which information should be integrated and in what form. Our research uses the theory of SA to analyze the types of tasks and their associated information gaps. We aimed to provide recommendations for integrated, consolidated information displays to improve nurses' SA. Systematic observation methods were used to follow 19 ICU nurses for 38 hours in 3 clinical practice settings. Storyboard methods and concept mapping helped to categorize the observed tasks, the associated information needs, and the information gaps of the most frequent tasks by SA level. Consensus and discussion within the research team were used to propose recommendations for improving information displays at the bedside based on the information deficits. Nurses performed 46 different tasks at a rate of 23.4 tasks per hour. The information needed to perform the most common tasks was often inaccessible, difficult to see at a distance or located on multiple monitoring devices. Current devices at the ICU bedside do not adequately support a nurse's information-gathering activities. Medication management was the most frequent category of tasks. Information gaps were present at all levels of SA and across most of the tasks. Using a theoretical model to understand information gaps can aid in designing functional requirements. Integrated information that enhances nurses' situation awareness may decrease errors and improve patient safety in the future.

  17. Framework for integration of informal waste management sector with the formal sector in Pakistan.

    Science.gov (United States)

    Masood, Maryam; Barlow, Claire Y

    2013-10-01

    Historically, waste pickers around the globe have utilised urban solid waste as a principal source of livelihood. Formal waste management sectors usually perceive informal waste collection/recycling networks as backward, unhygienic and generally incompatible with modern waste management systems. It is proposed here that, through careful planning and administration, these seemingly troublesome informal networks can be integrated into formal waste management systems in developing countries, providing mutual benefits. A theoretical framework for integration, based on a case study in Lahore, Pakistan, is presented. The proposed solution suggests that the municipal authority should draw up and agree a formal work contract with the group of waste pickers already operating in the area. The proposed system is assessed using the integration radar framework to classify and analyse possible intervention points between the sectors. The integration of the informal waste workers with the formal waste management sector is not a one-dimensional or single-step process. An ideal solution might aim for a balanced focus on all four categories of intervention, although this may be influenced by local conditions. Not all the positive benefits will be immediately apparent, but it is expected that, as the acceptance of such projects increases over time, the informal recycling economy will financially supplement the formal system in many ways.

  18. Air extraction in gas turbines burning coal-derived gas

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Tah-teh; Agrawal, A.K.; Kapat, J.S.

    1993-11-01

    In the first phase of this contracted research, a comprehensive investigation was performed. Principally, the effort was directed at identifying the technical barriers which might exist in integrating the air-blown coal gasification process with a hot gas cleanup scheme and a state-of-the-art, US-made, heavy-frame gas turbine. The guiding rule of the integration is to keep the compressor and the expander unchanged if possible. Because of the low heat content of coal gas and the need to accommodate air extraction, the combustor and, perhaps, the flow region between the compressor exit and the expander inlet might need to be modified. In selecting a compressed air extraction scheme, one must consider how the scheme affects the air supply to the hot section of the turbine and the total pressure loss in the flow region. Air extraction must preserve effective cooling of the hot components, such as the transition pieces. It must also ensure proper air/fuel mixing in the combustor, and hence the combustor exit pattern factor. The overall thermal efficiency of the power plant can be increased by minimizing the total pressure loss in the diffusers associated with the air extraction. Therefore, a study of airflow in the pre- and dump-diffusers with and without air extraction would provide information crucial to attaining high thermal efficiency and to preventing hot spots. The research group at Clemson University suggested using a Griffith diffuser for the prediffuser and extracting air from the diffuser inlet. The present research establishes that the analytically identified problems in the impingement cooling flow are factual. This phase of the contracted research substantiates experimentally the advantage of using the Griffith diffuser with air extraction at the diffuser inlet.

  19. Integration of Information and Scientific Literacy: Promoting Literacy in Undergraduates

    Science.gov (United States)

    Wolbach, Kevin C.; Purzycki, Catherine B.; Bowman, Leslie A.; Agbada, Eva; Mostrom, Alison M.

    2010-01-01

    The Association of College and Research Libraries recommends incorporating information literacy (IL) skills across university and college curricula, for the goal of developing information literate graduates. Congruent with this goal, the Departments of Biological Sciences and Information Science developed an integrated IL and scientific literacy (SL) exercise for use in a first-year biology course. Students were provided the opportunity to access, retrieve, analyze, and evaluate primary scientific literature. By the completion of this project, student responses improved concerning knowledge and relevance of IL and SL skills. This project exposes students to IL and SL early in their undergraduate experience, preparing them for future academic advancement. PMID:21123700

  20. The integration of Information and Communication Technology into medical practice.

    Science.gov (United States)

    Lupiáñez-Villanueva, Francisco; Hardey, Michael; Torrent, Joan; Ficapal, Pilar

    2010-07-01

    To identify doctors' utilization of ICT; to develop and characterise a typology of doctors' utilization of ICT; and to identify factors that can enhance or inhibit the use of these technologies within medical practice. An online survey of the 16,531 members of the Physicians Association of Barcelona who had a registered email account in 2006 was carried out. Factor analysis, cluster analysis and binomial logit modelling were undertaken. Multivariate statistical analysis of the 2,199 responses obtained revealed two profiles of ICT adoption. The first profile (38.61% of respondents) represents those doctors who place high emphasis on ICT within their practice; this group is thus referred to as 'integrated doctors'. The second profile (61.39% of respondents) represents those doctors who make less use of ICT and are consequently labelled 'non-integrated doctors'. From the statistical modelling, it was observed that an emphasis on international information; an emphasis on ICT for research and medical practice; an emphasis on information systems for consulting and prescribing; undertaking teaching/research activities; a belief that the use of the Internet improves communication with patients; and practice in both public and private health organizations play a positive and significant role in the probability of being an 'integrated doctor'. The integration of ICT within medical practice cannot be adequately understood and appreciated without examining how doctors are making use of ICT within their own practice and organizational contexts, and the opportunities and constraints afforded by institutional, professional and patient expectations and demands. 2010 Elsevier Ireland Ltd. All rights reserved.

  1. ROAD AND ROADSIDE FEATURE EXTRACTION USING IMAGERY AND LIDAR DATA FOR TRANSPORTATION OPERATION

    Directory of Open Access Journals (Sweden)

    S. Ural

    2015-03-01

    Transportation agencies require up-to-date, reliable, and feasibly acquired information on road geometry and features within proximity to the roads as input for evaluating and prioritizing new or improvement road projects. The information needed for a robust evaluation of road projects includes road centerline, width, and extent, together with the average grade, cross-sections, and obstructions near the travelled way. Remote sensing offers a large collection of data and well-established tools for acquiring this information and extracting the aforementioned road features at various levels and scopes. Yet even with the many remote sensing data sources and methods available for road extraction, transportation operation requires more than the centerlines. Acquiring information that is spatially coherent at the operational level for an entire road system is challenging and requires multiple data sources to be integrated. In the presented study, we established a framework that used data from multiple sources, including one-foot resolution color infrared orthophotos, airborne LiDAR point clouds, and an existing spatially non-accurate ancillary road network. We were able to extract 90.25% of a total of 23.6 miles of road network together with estimated road width, average grade along the road, and cross-sections at specified intervals. We also extracted buildings and vegetation within a predetermined proximity to the extracted road extent; 90.6% of 107 existing buildings were correctly identified, with a 31% false detection rate.

  2. Web-based integrated public healthcare information system of Korea: development and performance.

    Science.gov (United States)

    Ryu, Seewon; Park, Minsu; Lee, Jaegook; Kim, Sung-Soo; Han, Bum Soo; Mo, Kyoung Chun; Lee, Hyung Seok

    2013-12-01

    The Web-based integrated public healthcare information system (PHIS) of Korea was planned and developed from 2005 to 2010, and it is being used in 3,501 regional health organizations. This paper introduces and discusses the development and performance of the system. We reviewed and examined documents about the development process and performance of the newly integrated PHIS. The resources we analyzed included the national plan for public healthcare, the information strategy for PHIS, and usage and performance reports of the system. The integrated PHIS included 19 functional business areas, 47 detailed health programs, and 48 inter-organizational tasks. The new PHIS improved the efficiency and effectiveness of business processes and inter-organizational business, and enhanced user satisfaction. Economic benefits were obtained in five categories: labor; health education and monitoring; clinical information management; administration and civil service; and system maintenance. The system was granted a patent by the Korean Intellectual Property Office and was certified under ISO 9001. It was also reviewed and received preliminary comments about its originality, advancement, and business applicability under the Patent Cooperation Treaty. It has been found to enhance the quality of policy decision-making about regional healthcare at the self-governing local government level. PHIS, a Web-based integrated system, has contributed to the improvement of regional healthcare services in Korea. However, for it to evolve appropriately, the needs and changing environments of community-level healthcare services and the IT infrastructure should be analyzed properly in advance.

  3. Integration of auditory and visual communication information in the primate ventrolateral prefrontal cortex.

    Science.gov (United States)

    Sugihara, Tadashi; Diltz, Mark D; Averbeck, Bruno B; Romanski, Lizabeth M

    2006-10-25

    The integration of auditory and visual stimuli is crucial for recognizing objects, communicating effectively, and navigating through our complex world. Although the frontal lobes are involved in memory, communication, and language, there has been no evidence that the integration of communication information occurs at the single-cell level in the frontal lobes. Here, we show that neurons in the macaque ventrolateral prefrontal cortex (VLPFC) integrate audiovisual communication stimuli. The multisensory interactions included both enhancement and suppression of a predominantly auditory or a predominantly visual response, although multisensory suppression was the more common mode of response. The multisensory neurons were distributed across the VLPFC and within previously identified unimodal auditory and visual regions (O'Scalaidhe et al., 1997; Romanski and Goldman-Rakic, 2002). Thus, our study demonstrates, for the first time, that single prefrontal neurons integrate communication information from the auditory and visual domains, suggesting that these neurons are an important node in the cortical network responsible for communication.

  4. GEMMER: GEnome-wide tool for Multi-scale Modeling data Extraction and Representation for Saccharomyces cerevisiae.

    Science.gov (United States)

    Mondeel, Thierry D G A; Crémazy, Frédéric; Barberis, Matteo

    2018-02-01

    Multi-scale modeling of biological systems requires the integration of various information about genes and proteins that are connected together in networks. Spatial, temporal and functional information is available; however, it is still a challenge to retrieve and explore this knowledge in an integrated, quick and user-friendly manner. We present GEMMER (GEnome-wide tool for Multi-scale Modeling data Extraction and Representation), a web-based data-integration tool that facilitates high-quality visualization of physical, regulatory and genetic interactions between proteins/genes in Saccharomyces cerevisiae. GEMMER creates network visualizations that integrate information on function, temporal expression, localization and abundance from various existing databases. GEMMER supports modeling efforts by effortlessly gathering this information and providing convenient export options for images and their underlying data. GEMMER is freely available at http://gemmer.barberislab.com. Source code, written in Python, the JavaScript library D3js, PHP and JSON, is freely available at https://github.com/barberislab/GEMMER. Contact: M.Barberis@uva.nl. Supplementary data are available at Bioinformatics online. © The Author(s) 2018. Published by Oxford University Press.

  5. Promoting better integration of health information systems: best practices and challenges

    NARCIS (Netherlands)

    Michelsen, K.; Brand, H.; Achterberg, P.; Wilkinson, J.

    2015-01-01

    Health Evidence Network Synthesis Report: Promoting better integration of health information systems: best practices and challenges. K. Michelsen, H. Brand, P. Achterberg and J. Wilkinson, 2015. Lead author affiliation: Department of International Health, Maastricht University, Maastricht.

  6. Development of a real-world direct interface for integrated DNA extraction and amplification in a microfluidic device.

    Science.gov (United States)

    Shaw, Kirsty J; Joyce, Domino A; Docker, Peter T; Dyer, Charlotte E; Greenway, Gillian M; Greenman, John; Haswell, Stephen J

    2011-02-07

    Integrated DNA extraction and amplification have been carried out in a microfluidic device using electro-osmotic pumping (EOP) for fluidic control. All the necessary reagents for performing both DNA extraction and polymerase chain reaction (PCR) amplification were pre-loaded into the microfluidic device following encapsulation in agarose gel. Buccal cells were collected using OmniSwabs (Whatman™, UK) and manually added to a chaotropic binding/lysis solution pre-loaded into the microfluidic device. The released DNA was then adsorbed onto a silica monolith contained within the DNA extraction chamber and the microfluidic device sealed using polymer electrodes. The washing and elution steps for DNA extraction were carried out using EOP, resulting in transfer of the eluted DNA into the PCR chamber. Thermal cycling, achieved using a Peltier element, resulted in amplification of the Amelogenin locus as confirmed using conventional capillary gel electrophoresis. It was demonstrated that the PCR reagents could be stored in the microfluidic device for at least 8 weeks at 4 °C with no significant loss of activity. Such methodology lends itself to the production of 'ready-to-use' microfluidic devices containing all the necessary reagents for sample processing, with many obvious applications in forensics and clinical medicine.

  7. Creation of integrated information model of 'Ukryttia' object premises condition to support the works

    International Nuclear Information System (INIS)

    Postil, S.D.; Ermolenko, A.I.; Ivanov, V.V.; Kotlyarov, V.T.

    2002-01-01

    A technology for creating an integrated information model of the 'Ukryttia' object premises conditions was developed on the basis of the geoinformation system AutoCAD, the DB Access database and the instrumental utility 3D MAX. Information models and a database for the conditions of the 'Ukryttia' object's premises located between the 0.000 and 67.000 marks, in axes 41-52, rows G-T, were created. Using the integrated information model of the 'Ukryttia' object premises conditions, a 3D surface distribution of the radiation field in the object premises on level 0.000 was obtained. It was revealed that the maximum values of the radiation field are concentrated over the clusters of fuel-containing materials

  8. Evaluation of needle trap micro-extraction and solid-phase micro-extraction: Obtaining comprehensive information on volatile emissions from in vitro cultures.

    Science.gov (United States)

    Oertel, Peter; Bergmann, Andreas; Fischer, Sina; Trefz, Phillip; Küntzel, Anne; Reinhold, Petra; Köhler, Heike; Schubert, Jochen K; Miekisch, Wolfram

    2018-05-14

    Volatile organic compounds (VOCs) emitted from in vitro cultures may reveal information on species and metabolism. Owing to concentration ranges in the low nmol L⁻¹ region, pre-concentration techniques are required for gas chromatography-mass spectrometry (GC-MS) based analyses. This study was intended to compare the efficiency of established micro-extraction techniques - solid-phase micro-extraction (SPME) and needle-trap micro-extraction (NTME) - for the analysis of complex VOC patterns. For SPME, a 75 μm Carboxen®/polydimethylsiloxane fiber was used. The NTME needle was packed with divinylbenzene, Carbopack X and Carboxen 1000. The headspace was sampled bi-directionally. Seventy-two VOCs were calibrated by reference standard mixtures in the range of 0.041-62.24 nmol L⁻¹ by means of GC-MS. Both pre-concentration methods were applied to profile VOCs from cultures of Mycobacterium avium ssp. paratuberculosis. Limits of detection ranged from 0.004 to 3.93 nmol L⁻¹ (median = 0.030 nmol L⁻¹) for NTME and from 0.001 to 5.684 nmol L⁻¹ (median = 0.043 nmol L⁻¹) for SPME. NTME showed advantages in assessing polar compounds such as alcohols. SPME showed advantages in reproducibility but disadvantages in sensitivity for N-containing compounds. Micro-extraction techniques such as SPME and NTME are well suited for trace VOC profiling over cultures if the limitations of each technique are taken into account. Copyright © 2018 John Wiley & Sons, Ltd.

  9. Tight integration of computerized procedures with plant information at the South Texas Project

    International Nuclear Information System (INIS)

    Brtis, J.S.; Green, T.

    1996-01-01

    This paper describes a unique undertaking that is underway at Houston Lighting and Power's South Texas Project (STP). The paper presents an information upgrade project that uses expert system technologies to computerize design change procedures and to tightly integrate the resulting on-line, interactive procedures with the on-line information that design change activities use and generate. This effort will show how procedure computerization can leverage the large investments in plant data. The expected benefits include reduced costs and improved quality of design change work, plus a significant reduction in the burden of configuration management that comes from design changes. Both process computerization and the integration of process with data are being implemented at STP. This work is part of a major migration of information from a mainframe to a LAN platform. This paper will be of greatest interest to those involved in: (1) configuration management, (2) coordinating information to support design change procedures, (3) plant information management, and (4) business process reengineering

  10. Navigation integrity monitoring and obstacle detection for enhanced-vision systems

    Science.gov (United States)

    Korn, Bernd; Doehler, Hans-Ullrich; Hecker, Peter

    2001-08-01

    Typically, Enhanced Vision (EV) systems consist of two main parts, sensor vision and synthetic vision. Synthetic vision usually generates a virtual out-the-window view using databases and accurate navigation data, e.g. provided by differential GPS (DGPS). The reliability of the synthetic vision highly depends on both the accuracy of the database used and the integrity of the navigation data. But especially in GPS-based systems, the integrity of the navigation cannot be guaranteed. Furthermore, only objects that are stored in the database can be displayed to the pilot. Consequently, unexpected obstacles are invisible and this might cause severe problems. Therefore, additional information has to be extracted from sensor data to overcome these problems. In particular, the sensor data analysis has to identify obstacles and has to monitor the integrity of databases and navigation. Furthermore, if a lack of integrity arises, navigation data, e.g. the relative position of runway and aircraft, has to be extracted directly from the sensor data. The main contribution of this paper concerns the realization of these three sensor data analysis tasks within our EV system, which uses the HiVision 35 GHz MMW radar of EADS, Ulm as the primary EV sensor. For the integrity monitoring, objects extracted from radar images are registered with both database objects and objects (e.g. other aircraft) transmitted via data link. This results in a classification into known and unknown radar image objects and, consequently, in a validation of the integrity of database and navigation. Furthermore, special runway structures are searched for in the radar image where they should appear. The outcome of this runway check contributes to the integrity analysis, too. Concurrently with this investigation, a radar-image-based navigation is performed, without using either precision navigation or detailed database information, to determine the aircraft's position relative to the runway.
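
    For illustration, the registration step described in this record reduces to a gated nearest-neighbour match between radar-extracted objects and the objects known from the database or data link. The Python sketch below shows only that idea; the flat x-y coordinates, function name and 50 m gate are assumptions for illustration, not details from the paper.

        import numpy as np

        def classify_radar_objects(radar_xy, known_xy, gate=50.0):
            """Label each radar-extracted object 'known' if a database/data-link
            object lies within `gate` metres, otherwise 'unknown' (a candidate
            obstacle and a hint of degraded database/navigation integrity)."""
            known = np.asarray(known_xy, dtype=float)
            labels = []
            for p in np.asarray(radar_xy, dtype=float):
                d = np.linalg.norm(known - p, axis=1).min() if len(known) else np.inf
                labels.append("known" if d <= gate else "unknown")
            return labels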

  11. Information Technology Integration in Teacher Education: Supporting the Paradigm Shift in Hong Kong.

    Science.gov (United States)

    Lee, Kar Tin

    2001-01-01

    Examines the integration of information technology (IT) at the Hong Kong Institute of Education, presenting the rationale for this move, characteristics of IT integration, and program development issues for making IT application a critical component of contemporary teacher education. The paper presents a framework for program development and…

  12. Information Technology, Type II Classroom Integration, and the Limited Infrastructure in Schools

    Science.gov (United States)

    Maddux, Cleborne D.; Johnson, D. Lamont

    2006-01-01

    In this second special issue on Type II applications of information technology in education, the focus is on classroom integration. This editorial explores some possible explanations for the fact that information technology in schools has not fulfilled its considerable potential. One reason may be that individualized instruction is not part of the…

  13. Information resources assessment of a healthcare integrated delivery system.

    Science.gov (United States)

    Gadd, C. S.; Friedman, C. P.; Douglas, G.; Miller, D. J.

    1999-01-01

    While clinical healthcare systems may have lagged behind computer applications in other fields in the shift from mainframes to client-server architectures, the rapid deployment of newer applications is closing that gap. Organizations considering the transition to client-server must identify and position themselves to provide the resources necessary to implement and support the infrastructure requirements of client-server architectures and to manage the accelerated complexity at the desktop, including hardware and software deployment, training, and maintenance needs. This paper describes an information resources assessment of the recently aligned Pennsylvania regional Veterans Administration Stars and Stripes Health Network (VISN4), in anticipation of the shift from a predominantly mainframe to a client-server information systems architecture in its well-established VistA clinical information system. The multimethod assessment study is described here to demonstrate this approach and its value to regional healthcare networks undergoing organizational integration and/or significant information technology transformations. PMID:10566414

  15. Information Environment is an Integral Element of Informational Space in the Process of Professional Development of Future Teacher of Physical Culture

    Directory of Open Access Journals (Sweden)

    Yuri V. Dragnev

    2012-04-01

    The article examines the information environment as an integral element of the information space in the process of professional development of the future teacher of physical culture. It notes that a strategic objective of the system of higher education is the training of future teachers of physical culture who are competent in the field of information technologies, since information competence and information culture are major components of professionalism in the modern information-oriented society.

  16. Approaching the largest ‘API’: extracting information from the Internet with Python

    Directory of Open Access Journals (Sweden)

    Jonathan E. Germann

    2018-02-01

    This article explores the need for libraries to algorithmically access and manipulate the world’s largest API: the Internet. The billions of pages on the ‘Internet API’ (HTTP, HTML, CSS, XPath, DOM, etc.) are easily accessible and manipulable. Libraries can assist in creating meaning through the datafication of information on the world wide web. Because most information is created for human consumption, some programming is required for automated extraction. Python is an easy-to-learn programming language with extensive packages and community support for web page automation. Four packages in Python (Urllib, Selenium, BeautifulSoup, Scrapy) can automate almost any web page for projects of all sizes. An example warrant data project is explained to illustrate how well Python packages can manipulate web pages to create meaning through assembling custom datasets.
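
    As a brief illustration of the kind of page automation the article describes, the sketch below fetches a page with the standard-library urllib and parses it with BeautifulSoup, two of the four packages named above. The URL, table structure and user-agent string are hypothetical placeholders, not details of the warrant project.

        # Sketch: fetch a page with urllib and flatten its HTML tables with BeautifulSoup.
        from urllib.request import Request, urlopen

        from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

        def extract_table_rows(url):
            """Download a page and return the text of every table row as a list of cells."""
            req = Request(url, headers={"User-Agent": "library-data-bot/0.1"})
            with urlopen(req) as resp:
                html = resp.read().decode("utf-8", errors="replace")
            soup = BeautifulSoup(html, "html.parser")
            rows = []
            for tr in soup.select("table tr"):
                cells = [td.get_text(strip=True) for td in tr.find_all(["td", "th"])]
                if cells:
                    rows.append(cells)
            return rows

        if __name__ == "__main__":
            for row in extract_table_rows("https://example.org/warrants"):  # placeholder URL
                print(row)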

  17. Extracting and Using Photon Polarization Information in Radiative B Decays

    Energy Technology Data Exchange (ETDEWEB)

    Grossman, Yuval

    2000-05-09

    The authors discuss the uses of conversion electron pairs for extracting photon polarization information in weak radiative B decays. Both cases of leptons produced through a virtual and a real photon are considered. Measurements of the angular correlation between the (Kπ) and (e⁺e⁻) decay planes in B → K*(→ Kπ)γ(*)(→ e⁺e⁻) decays can be used to determine the helicity amplitudes in the radiative B → K*γ decays. A large right-handed helicity amplitude in B-bar decays is a signal of new physics. The time-dependent CP asymmetry in the B⁰ decay angular correlation is shown to measure sin 2β and cos 2β with little hadronic uncertainty.

  18. Issues in Integrating Information Technology in Learning and Teaching EFL: The Saudi Experience

    Science.gov (United States)

    Al-Maini, Yousef Hamad

    2013-01-01

    The Saudi education system is facing a climate of change characterized by an interest in integrating new technology and educational approaches to improve teaching and learning. In this climate, the present paper explores the issues in integrating information technology in learning and teaching English as a foreign language (EFL) in government…

  19. SAR matrices: automated extraction of information-rich SAR tables from large compound data sets.

    Science.gov (United States)

    Wassermann, Anne Mai; Haebel, Peter; Weskamp, Nils; Bajorath, Jürgen

    2012-07-23

    We introduce the SAR matrix data structure that is designed to elucidate SAR patterns produced by groups of structurally related active compounds, which are extracted from large data sets. SAR matrices are systematically generated and sorted on the basis of SAR information content. Matrix generation is computationally efficient and enables processing of large compound sets. The matrix format is reminiscent of SAR tables, and SAR patterns revealed by different categories of matrices are easily interpretable. The structural organization underlying matrix formation is more flexible than standard R-group decomposition schemes. Hence, the resulting matrices capture SAR information in a comprehensive manner.

  20. Extracting chemical information from high-resolution Kβ X-ray emission spectroscopy

    Science.gov (United States)

    Limandri, S.; Robledo, J.; Tirao, G.

    2018-06-01

    High-resolution X-ray emission spectroscopy allows studying the chemical environment of a wide variety of materials. Chemical information can be obtained by fitting the X-ray spectra and observing the behavior of some spectral features. Spectral changes can also be quantified by means of statistical parameters calculated by considering the spectrum as a probability distribution. Another possibility is to perform multivariate statistical analysis, such as principal component analysis. In this work, the performance of these procedures for extracting chemical information from X-ray emission spectra of mixtures of Mn2+ and Mn4+ oxides is studied. A detailed analysis of the parameters obtained, as well as the associated uncertainties, is shown. The methodologies are also applied to Mn oxidation state characterization of the double perovskite oxides Ba1+xLa1-xMnSbO6 (with 0 ≤ x ≤ 0.7). The results show that statistical parameters and multivariate analysis are the most suitable for the analysis of this kind of spectra.
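
    A minimal sketch of the 'spectrum as a probability distribution' idea mentioned above: normalize the measured counts and compute distribution moments as the statistical parameters. The variable names and the particular four moments are illustrative assumptions, not the paper's exact parameter set.

        import numpy as np

        def spectral_shape_parameters(energy, counts):
            """Normalize a spectrum to a probability distribution and return
            simple distribution moments (centroid, width, skewness, excess kurtosis)."""
            p = counts / counts.sum()
            centroid = np.sum(energy * p)
            std = np.sqrt(np.sum((energy - centroid) ** 2 * p))
            z = (energy - centroid) / std
            skewness = np.sum(z ** 3 * p)
            kurtosis = np.sum(z ** 4 * p) - 3.0
            return centroid, std, skewness, kurtosis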

  1. Feature extraction and learning using context cue and Rényi entropy based mutual information

    DEFF Research Database (Denmark)

    Pan, Hong; Olsen, Søren Ingvor; Zhu, Yaping

    2015-01-01

    In particular, for feature extraction, we develop a new set of kernel descriptors, Context Kernel Descriptors (CKD), which enhance the original KDES by embedding the spatial context into the descriptors. Context cues contained in the context kernel enforce some degree of spatial consistency, thus improving the robustness of CKD. For feature learning and reduction, we propose a novel codebook learning method, based on a Rényi quadratic entropy based mutual information measure called Cauchy-Schwarz Quadratic Mutual Information (CSQMI), to learn a compact and discriminative CKD codebook. Projecting the CKD onto this codebook preserves both their intrinsic structure and the information about their underlying labels, as measured by CSQMI, so the resulting codebook and reduced CKD are discriminative. We verify the effectiveness of our method on several public image benchmark datasets such as YaleB, Caltech-101 and CIFAR-10, as well as a challenging chicken feet dataset of our own.
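
    For context, CSQMI is built on the Cauchy-Schwarz divergence D_CS(f, g) = -log(<f, g>^2 / (<f, f><g, g>)), applied between a joint density and the product of its marginals. The sketch below estimates the plain Cauchy-Schwarz divergence between two sample sets with Gaussian Parzen windows; it illustrates only the underlying quantity and is not the authors' codebook-learning algorithm.

        import numpy as np

        def _gram(A, B, sigma):
            """Pairwise Gaussian kernel values between rows of A and B."""
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
            return np.exp(-d2 / (2.0 * sigma ** 2))

        def cs_divergence(X, Y, sigma=1.0):
            """Parzen-window estimate of the Cauchy-Schwarz divergence
            D_CS = -log(<p,q>^2 / (<p,p><q,q>)) between the densities of X and Y.
            The convolution of two Gaussian windows has width sigma*sqrt(2)."""
            s = np.sqrt(2.0) * sigma
            pq = _gram(X, Y, s).mean()
            pp = _gram(X, X, s).mean()
            qq = _gram(Y, Y, s).mean()
            return float(-np.log(pq ** 2 / (pp * qq)))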

  2. Integration of informal recycling sector in Brazil and the case of Sorocaba City.

    Science.gov (United States)

    Silva de Souza Lima, Nathalia; Mancini, Sandro Donnini

    2017-07-01

    Catadores are people who collect and sell materials that can be recycled. This activity has been done informally in many countries for years. Recently, a recognition process has begun for the informal recycling sector, with public and private initiatives. In Brazil, catadores started to organize themselves in co-operatives in the 1980s. In 2010, the Solid Waste National Policy was approved, promoting the inclusion of these co-operatives in the formal waste management system. However, only 25 out of 5,670 Brazilian municipalities have hired co-operatives as private service providers. The integration of the informal sector has brought social, economic and environmental benefits; income generation, reduction of poverty and resource preservation are highlights. Although there has been legal progress, great challenges remain for the various actors involved. This paper aims to diagnose the informal recycling sector, emphasizing the integration process that has taken place in Brazil. For this, a substantial literature review and a case study were conducted, applying the tool 'InteRa' to the case of Sorocaba. The case showed that it is possible to improve the integration of catadores in the formal waste management system. The co-operatives achieve recycling rates of 2%, higher than the official national rate of 1%. However, we estimate that autonomous pickers increase total recycling in Sorocaba to 9%, still short of the 25% target achievable via source segregation. Therefore, continuing the integration process will benefit both the pickers and the municipality, the latter through savings on landfill costs.

  3. Creating integrated information management system for small and medium business

    Directory of Open Access Journals (Sweden)

    Deinega Valentina Nikolaevna

    2014-09-01

    Enterprises, regardless of their size and form of ownership, that are focused on long and successful operation need to create an integrated information system. This is dictated by the fact that, firstly, such a system combines the financial data; secondly, it provides standardized manufacturing processes; and thirdly, it solves the problem of standardization of information within a single framework. The main thing in decision-making is the definition of the business strategy and the reflection of this strategy in goals and objectives. ERP systems help to maintain competitiveness and leadership in the market.

  4. Integrated radiation information system in the Czech Republic

    International Nuclear Information System (INIS)

    Drabova, D.; Prouza, Z.; Malatova, I.; Kuca, P.; Bucina, I.

    1998-01-01

    The outline and organizational structure of the radiation monitoring network (RMN) in the Czech Republic conform with those of similar networks abroad. This integrated system of a number of components serves for: continuous monitoring of the radiation situation on the territory of the Czech Republic; detecting an abnormal radiological situation due to a domestic source; detecting a non-notified accident abroad with consequences on the territory of the Czech Republic; monitoring the evolution and determining the components of any radioactivity discharge; first estimation of the accident extent; forecasting the accident development and the dispersion of radionuclides in the vicinity of the source; acquisition of a basis for decisions upon evacuation and other countermeasures and remedial actions; assessment and forecast of contamination for the regulation of food and water consumption; and review of enforced countermeasures based on actual monitoring data and refined forecasts. For model calculations and decision making in case of a nuclear accident, an integrated comprehensive computer-based information system is now being set up in the Czech Republic. (R.P.)

  5. Enhancing situational awareness by means of visualization and information integration of sensor networks

    Science.gov (United States)

    Timonen, Jussi; Vankka, Jouko

    2013-05-01

    This paper presents a solution for an information integration and sharing architecture that is able to receive data simultaneously from multiple different sensor networks. Creating a Common Operational Picture (COP) object, along with the base map of the building, plays a key role in the research. The object is combined with the desired map sources and then shared to the mobile devices worn by soldiers in the field. The sensor networks we used focus on indoor location techniques, and a simple set of symbols was created to present the information, as an addition to NATO APP6B symbols. A core element in this research is the MUSAS (Mobile Urban Situational Awareness System), a demonstration environment that implements the central functionalities. Information integration in the system is handled by the Internet Communications Engine (Ice) middleware, as well as the server, which hosts COP information and maps. The entire system is closed, such that it does not need any external service, and information transfer with the mobile devices is organized by a tactical 5 GHz WLAN solution. The demonstration environment is implemented using only commercial off-the-shelf (COTS) products. We present a field experiment in which the system was able to integrate and share real-time information from a blue force tracking system, a received signal strength indicator (RSSI) based intrusion detection system, and a robot using simultaneous localization and mapping (SLAM) technology, where all the inputs were based on real activities. The event was held in an urban warfare training area.

  6. The interplay between formal and informal contracting in integrated project delivery

    NARCIS (Netherlands)

    Bygballe, L.E.; Dewulf, Geert P.M.R.; Levitt, R.

    2015-01-01

    This research examines the interplay between formal and informal contracting in integrated project delivery (IPD). It investigates how the interplay enables parties in health-care construction projects to cope with uncertainty and complexities due to, among other things, changing demands.

  7. Methodology of Adaptive Integrated Accounting System in Information Environment

    Directory of Open Access Journals (Sweden)

    Bochulya Tetyana V.

    2013-12-01

    The goal of the article is the study of the logical and methodological justification for the formation of an integrated accounting system, based on the realities of the co-ordinated transformation of society and the economy, and the development of new knowledge about the formation and adjustment of the accounting system in its a priori new information competence, with expanded functionality supporting a justified view of the existence and development of business. Taking the developments of leading researchers as a basis, the article offers a new vision of the organisation of the accounting system, based on the modern projection of information competence and the harmonisation of the main processes of information service, so that the system can adapt to the multi-vector inquiries of consumers of information. Based on the results of the study, the article makes an effort to change the established opinion about the information and professional competences of the accounting system and to attach a new qualitative significance to them. The article proposes calculating the quality of the information system on the basis of key indicators of its information service. It lays the foundation for a prospective study of the problems of building the accounting system in such a projection that the realities of internal and external processes are maximally co-ordinated, based on the idea of their information development.

  8. Design principles for achieving integrated healthcare information systems.

    Science.gov (United States)

    Jensen, Tina Blegind

    2013-03-01

    Achieving integrated healthcare information systems has become a common goal for many countries in their pursuit of coordinated and comprehensive healthcare services. This article focuses on how a small local project termed 'Standardized pull of patient data' expanded and is now used on a large scale, providing a majority of hospitals, general practitioners and citizens across Denmark with the possibility of accessing healthcare data from different electronic patient record systems and other systems. I build on design theory for information infrastructures, as presented by Hanseth and Lyytinen, to examine the design principles that enabled this small-scale project to expand and become widespread. As a result of my findings, I outline three lessons learned that emphasize: (i) principles of flexibility, (ii) expansion from the installed base through modular strategies and (iii) identification of key healthcare actors to provide them with immediate benefits.

  9. From collision to collaboration - Integrating informal recyclers and re-use operators in Europe: A review.

    Science.gov (United States)

    Scheinberg, Anne; Nesić, Jelena; Savain, Rachel; Luppi, Pietro; Sinnott, Portia; Petean, Flaviu; Pop, Flaviu

    2016-09-01

    The European Union hosts some of the world's most developed waste management systems and an ambitious policy commitment to the circular economy. The existence of informal recycling and re-use activities in Europe has been vigorously denied until quite recently, and remains a very challenging subject for the European solid waste management sector, as well as for European government and private institutions. In countries ranging from Malta to Macedonia and from France to Turkey, informal recyclers excluded from legal recycling niches increasingly collide with formalised and controlled European Union approaches to urban waste management, packaging recovery schemes, formal re-use enterprises, and extended producer responsibility systems.This review focuses on the period from 2004 through the first half of 2016. The 78 sources on European (and neighbouring) informal recycling and re-use are contextualised with global sources and experience. The articles focus on informal recovery in and at the borders of the European Union, document the conflicts and collisions, and elaborate some constructive approaches towards legalisation, integration, and reconciliation. The overarching recommendation, to locate the issue of informal recovery and integration in the framework of the European circular economy package, is supported by four specific pillars of an integration strategy: Documentation, legalisation, occupational and enterprise recognition, and preparation for structural integration. © The Author(s) 2016.

  10. Postnatal experiences influence how the brain integrates information from different senses

    Directory of Open Access Journals (Sweden)

    Barry E Stein

    2009-09-01

    Sensory Processing Disorder (SPD) is characterized by anomalous reactions to, and integration of, sensory cues. Although the underlying etiology of SPD is unknown, one brain region likely to reflect these sensory and behavioral anomalies is the superior colliculus (SC), a structure involved in the synthesis of information from multiple sensory modalities and the control of overt orientation responses. In this review we describe the normal functional properties of this structure, the manner in which its individual neurons integrate cues from different senses, and the overt SC-mediated behaviors that are believed to manifest this “multisensory integration.” Of particular interest here is how SC neurons develop their capacity to engage in multisensory integration during early postnatal life as a consequence of early sensory experience, and how it is the intimate communication between cortex and midbrain that makes this developmental process possible.

  11. Scholarly Information Extraction Is Going to Make a Quantum Leap with PubMed Central (PMC).

    Science.gov (United States)

    Matthies, Franz; Hahn, Udo

    2017-01-01

    With the increasing availability of complete full texts (journal articles), rather than their surrogates (titles, abstracts), as resources for text analytics, entirely new opportunities arise for information extraction and text mining from scholarly publications. Yet, we gathered evidence that a range of problems are encountered in full-text processing when biomedical text analytics simply reuse existing NLP pipelines that were developed on the basis of abstracts (rather than full texts). We conducted experiments with four different relation extraction engines, all of which were top performers in previous BioNLP Event Extraction Challenges. We found that abstract-trained engines lose up to 6.6 F-score points when run on full-text data. Hence, the reuse of existing abstract-based NLP software in a full-text scenario is considered harmful because of heavy performance losses. Given the current lack of annotated full-text resources to train on, our study quantifies the price paid for this shortcut.

  12. Data Assimilation to Extract Soil Moisture Information from SMAP Observations

    Directory of Open Access Journals (Sweden)

    Jana Kolassa

    2017-11-01

    This study compares different methods to extract soil moisture information through the assimilation of Soil Moisture Active Passive (SMAP) observations. Neural network (NN) and physically-based SMAP soil moisture retrievals were assimilated into the National Aeronautics and Space Administration (NASA) Catchment model over the contiguous United States for April 2015 to March 2017. By construction, the NN retrievals are consistent with the global climatology of the Catchment model soil moisture. Assimilating the NN retrievals without further bias correction improved the surface and root zone correlations against in situ measurements from 14 SMAP core validation sites (CVS) by 0.12 and 0.16, respectively, over the model-only skill, and reduced the surface and root zone unbiased root-mean-square error (ubRMSE) by 0.005 m³ m⁻³ and 0.001 m³ m⁻³, respectively. The assimilation reduced the average absolute surface bias against the CVS measurements by 0.009 m³ m⁻³, but increased the root zone bias by 0.014 m³ m⁻³. Assimilating the NN retrievals after a localized bias correction yielded slightly lower surface correlation and ubRMSE improvements, but generally the skill differences were small. The assimilation of the physically-based SMAP Level-2 passive soil moisture retrievals using a global bias correction yielded similar skill improvements, as did the direct assimilation of locally bias-corrected SMAP brightness temperatures within the SMAP Level-4 soil moisture algorithm. The results show that global bias correction methods may be able to extract more independent information from SMAP observations compared to local bias correction methods, but without accurate quality control and observation error characterization they are also more vulnerable to adverse effects from retrieval errors related to uncertainties in the retrieval inputs and algorithm.
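
    Global bias correction of retrievals against a model climatology is commonly implemented by CDF matching. The sketch below is a simplified illustration of that generic step, not the SMAP Level-4 implementation: each observation is mapped through the empirical CDF of an observation climatology onto the quantiles of the model climatology.

        import numpy as np

        def cdf_match(obs, obs_clim, model_clim):
            """Rescale observations to the model climatology by matching
            empirical cumulative distribution functions."""
            # percentile of each observation within the observation climatology
            ranks = np.searchsorted(np.sort(obs_clim), obs) / float(len(obs_clim))
            # map those percentiles onto the model climatology's quantiles
            return np.quantile(model_clim, np.clip(ranks, 0.0, 1.0))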

  13. Project Integration Architecture: Inter-Application Propagation of Information

    Science.gov (United States)

    Jones, William Henry

    2005-01-01

    A principal goal of the Project Integration Architecture (PIA) is to facilitate the meaningful inter-application transfer of application-value-added information. Such exchanging applications may be largely unrelated to each other except through their applicability to an overall project; however, the PIA effort recognizes as fundamental the need to make such applications cooperate despite wide disparities in either the fidelity of the analyses carried out or even the disciplines of the analysis. This paper discusses the approach and techniques applied and anticipated by the PIA project in addressing this need.

  14. Case study of shallow soil mixing and soil vacuum extraction remediation project

    International Nuclear Information System (INIS)

    Carey, M.J.; Day, S.R.; Pinewski, R.; Schroder, D.

    1995-01-01

    Shallow Soil Mixing (SSM) and Soil Vacuum Extraction (SVE) are techniques which have been increasingly relied on for the in-situ remediation of contaminated soils. The primary applications of SSM have been to mix cement, bentonite, or other reagents into contaminated soils or sludges to modify their properties and thereby remediate them. Soil vacuum extraction has been used in numerous applications for in-situ removal of contaminants from soils. At a recent project in southern Ohio, the two technologies were integrated and enhanced to extract volatile organic compounds (VOCs) from soils at a Department of Energy facility. Advantages of the integrated SSM/SVE technology over alternative technologies include relatively rapid remediation compared to other in-situ techniques at a lower cost, less exposure of waste to the surface environment, and elimination of off-site disposal. These advantages led to the selection of both technologies for the project in southern Ohio. The information presented in this paper is intended to provide engineers and owners with the level of understanding necessary to apply soil mixing and vacuum extraction technology to a specific site. The most important steps in implementing the technology are site investigation, feasibility estimation, selection of performance criteria, selection of appropriate materials, bench-scale testing and construction.

  15. The effect of informed consent on stress levels associated with extraction of impacted mandibular third molars.

    Science.gov (United States)

    Casap, Nardy; Alterman, Michael; Sharon, Guy; Samuni, Yuval

    2008-05-01

    To evaluate the effect of informed consent on stress levels associated with removal of impacted mandibular third molars. A total of 60 patients scheduled for extraction of impacted mandibular third molars participated in this study. The patients were unaware of the study's objectives. Data from 20 patients established the baseline levels of electrodermal activity (EDA). The remaining 40 patients were randomly assigned to 2 equal groups receiving either a detailed informed consent document, disclosing the possible risks involved with the surgery, or a simplified version. Pulse, blood pressure, and EDA were monitored before, during, and after completion of the consent document. Changes in EDA, but not in blood pressure, were measured on completion of either version of the consent document. A greater increase in EDA was associated with the detailed version of the consent document (P = .004). A similar concomitant, although nonsignificant, increase in pulse values was observed on completion of both versions. Completion of an over-disclosed informed consent document is thus associated with changes in physiological parameters. The results suggest that overly detailed listing and disclosure of risks before extraction of impacted mandibular third molars can increase patient stress.

  16. New approach to the adjustment of group cross sections fitting integral measurements

    International Nuclear Information System (INIS)

    Chao, Y.A.

    1979-01-01

    The adjustment of group cross sections to fit integral measurements is viewed as a process of estimating theoretical and/or experimental negligence errors so as to bring statistical consistency to the integral and differential data, allowing them to be combined into an enlarged ensemble, based on which an improved estimation of the physical constants can be made. A three-step approach is suggested, and its formalism, of general validity, is developed. In step one, the negligence-error data are extracted from the given integral and differential data. The method of extraction is based on the concepts of prior probability and information entropy, and automatically leads to vanishing negligence error when the two sets of data are statistically consistent. The second step is to identify the sources of negligence error and adjust the data by an amount compensating for the extracted negligence discrepancy. In the last step, the two data sets, already adjusted to mutual consistency, are combined into a single unified ensemble. Standard methods of statistics can then be applied to re-estimate the physical constants. 1 figure
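
    For orientation, once the data have been brought to statistical consistency, the combination in the last step reduces, for Gaussian errors, to the standard generalized least-squares adjustment of cross-section evaluation; in a common textbook notation (not the paper's own):

        T' = T + M_T S^T (S M_T S^T + M_E)^{-1} (E - C),
        chi^2 = (E - C)^T (S M_T S^T + M_E)^{-1} (E - C),

    where T is the vector of prior group cross sections with covariance M_T, S is the sensitivity matrix of the integral responses, E is the vector of integral measurements with covariance M_E, and C contains the calculated integral values; a chi^2 per degree of freedom near one indicates that the differential and integral data are statistically consistent.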

  17. Method of extracting significant trouble information of nuclear power plants using probabilistic analysis technique

    International Nuclear Information System (INIS)

    Shimada, Yoshio; Miyazaki, Takamasa

    2005-01-01

    In order to analyze and evaluate large amounts of trouble information from overseas nuclear power plants, it is necessary to select the information that is significant in terms of both safety and reliability. In this research, a method was developed for efficiently and simply classifying the degrees of importance of components in terms of safety and reliability, paying attention to the root-cause components appearing in the information. Regarding safety, the reactor core damage frequency (CDF), which is used in the probabilistic analysis of a reactor, was used. Regarding reliability, the automatic plant trip probability (APTP), which is used in the probabilistic analysis of automatic reactor trips, was used. These two aspects were reflected in the development of criteria for classifying the degrees of importance of components. By applying these criteria, a simple method of extracting significant trouble information from overseas nuclear power plants was developed. (author)
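
    A toy sketch of how such two-axis screening criteria could be applied in code is given below; the cut-off values, tiers and function name are invented for illustration and are not the criteria developed in the paper.

        def significance(cdf_contribution, aptp_contribution):
            """Classify a root-cause component by its contribution to core damage
            frequency (safety) and to automatic plant trip probability (reliability)."""
            high_safety = cdf_contribution >= 1e-6        # assumed cut-off, per year
            high_reliability = aptp_contribution >= 1e-3  # assumed cut-off, per demand
            if high_safety and high_reliability:
                return "significant for safety and reliability"
            if high_safety:
                return "significant for safety"
            if high_reliability:
                return "significant for reliability"
            return "screened out"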

  18. Unified method to integrate and blend several, potentially related, sources of information for genetic evaluation.

    Science.gov (United States)

    Vandenplas, Jérémie; Colinet, Frederic G; Gengler, Nicolas

    2014-09-30

    A condition to predict unbiased estimated breeding values by best linear unbiased prediction is to use simultaneously all available data. However, this condition is not often fully met. For example, in dairy cattle, internal (i.e. local) populations lead to evaluations based only on internal records, while widely used foreign sires have been selected using internally unavailable external records. In such cases, internal genetic evaluations may be less accurate and biased. Because external records are unavailable, methods were developed to combine external information that summarizes these records, i.e. external estimated breeding values and associated reliabilities, with internal records to improve the accuracy of internal genetic evaluations. Two issues with these methods concern double-counting of contributions due to relationships and due to records. These issues could be worse if external information came from several evaluations, at least partially based on the same records, and were combined into a single internal evaluation. Based on a Bayesian approach, the aim of this research was to develop a unified method to integrate and blend simultaneously several sources of information into an internal genetic evaluation while avoiding double-counting of contributions due to relationships and due to records. This research resulted in equations that integrate and blend simultaneously several sources of information and avoid double-counting of contributions due to relationships and due to records. The performance of the developed equations was evaluated using simulated and real datasets. The results showed that the developed equations integrated and blended several sources of information well into a genetic evaluation, and that they avoided double-counting of contributions due to relationships and due to records. Furthermore, because all available external sources of information were correctly propagated, relatives of external animals benefited from the integrated external information.

  19. 16 CFR 312.8 - Confidentiality, security, and integrity of personal information collected from children.

    Science.gov (United States)

    2010-01-01

    16 CFR Commercial Practices, Federal Trade Commission regulations under specific acts of Congress, Children's Online Privacy Protection Rule, § 312.8 (2010): Confidentiality, security, and integrity of personal information collected from children. The operator must establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children.

  20. Study on advanced systematic function of the JNC geological disposal technical information integration system. Research document

    International Nuclear Information System (INIS)

    Ishihara, Yoshinao; Fukui, Hiroshi; Sagawa, Hiroshi; Matsunaga, Kenichi; Ito, Takaya

    2004-02-01

    In this study, as an advancement of the functions of the JNC Geological Disposal Technical Information Integration System, the technical know-how mutually utilized among the geological environment field, the disposal technology (design) field and the safety assessment field was systematized, and functions for sharing general information, aimed at promoting information sharing and use among the technical information management databases built for each field, were considered, together with the system functions needed to realize the integration of technical information. (1) Concrete information about the geological environment, which is gradually updated and augmented as geological disposal research progresses, must be suitably reflected in design and safety assessment research. After arranging a form suitable for systematizing technical information, the technical information in the design and safety assessment fields was arranged in a two-layer form based on tasks and works, and the exchange of technical information with the geological environment field was systematized. (2) In order to integrate the technical information of the three geological disposal fields, functions for the mutual use of information managed in two or more databases were considered, based on the results of the examination of the systematization of technical information. Moreover, system functions such as management of the use history of technical information, linking of information use, and notification of common information were considered, and system operation windows designed for ease of operation were examined. (author)

  1. Architectural Building A Public Key Infrastructure Integrated Information Space

    Directory of Open Access Journals (Sweden)

    Vadim Ivanovich Korolev

    2015-10-01

    Full Text Available The article considers the application of cryptographic systems with a public key to provide information security and to implement digital signatures. It analyses trust models for the formation of certificates and their use, and describes the relationships between the trust model and the public key infrastructure architecture. It draws conclusions about the options for building a public key infrastructure for an integrated information space.
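
    As a concrete illustration of the signing mechanism such an infrastructure certifies, the sketch below signs and verifies a message with an Ed25519 key pair using the Python cryptography package; the algorithm and library choice are mine, not the article's, and in a real PKI the public key would be carried in a CA-signed certificate according to the trust models discussed:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

message = b"integrated information space document"

# In a real PKI the public key would be distributed inside a certificate
# signed by a CA that the relying party trusts; here the key pair is
# generated locally purely for illustration.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

signature = private_key.sign(message)

try:
    public_key.verify(signature, message)
    print("signature valid")
except InvalidSignature:
    print("signature invalid")
```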

  2. The Information Book Genre: Its Role in Integrated Science Literacy Research and Practice

    Science.gov (United States)

    Pappas, Christine C.

    2006-01-01

    There has been a call for approaches that connect science learning with literacy, yet the use of, and research on, children's literature information books in science instruction has been quite limited. Because the discipline of science involves distinctive generic linguistic registers, what information books should be integrated in science…

  3. Wiki-Based Data and Information Integration (WikiDI2) System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed innovation is: A data and information integration (DI2) system built from the methods and tools used to create Wiki websites. Wiki (Hawaiian for...

  4. Machinery running state identification based on discriminant semi-supervised local tangent space alignment for feature fusion and extraction

    International Nuclear Information System (INIS)

    Su, Zuqiang; Xiao, Hong; Zhang, Yi; Tang, Baoping; Jiang, Yonghua

    2017-01-01

    Extraction of sensitive features is a challenging but key task in data-driven machinery running state identification. Aimed at solving this problem, a method for machinery running state identification that applies discriminant semi-supervised local tangent space alignment (DSS-LTSA) for feature fusion and extraction is proposed. Firstly, in order to extract more distinct features, the vibration signals are decomposed by wavelet packet decomposition (WPD), and a mixed-domain feature set consisting of statistical features, autoregressive (AR) model coefficients, instantaneous amplitude Shannon entropy and the WPD energy spectrum is extracted to comprehensively characterize the properties of the machinery running states. Then, the mixed-domain feature set is fed into DSS-LTSA for feature fusion and extraction to eliminate redundant information and interference noise. The proposed DSS-LTSA can extract intrinsic structure information from both labeled and unlabeled state samples, and as a result the over-fitting problem of supervised manifold learning and the blindness problem of unsupervised manifold learning are overcome. Simultaneously, class discrimination information is integrated within the dimension-reduction process in a semi-supervised manner to improve the sensitivity of the extracted fusion features. Lastly, the extracted fusion features are fed into a pattern recognition algorithm to achieve the running state identification. The effectiveness of the proposed method is verified by a running state identification case in a gearbox, and the results confirm the improved accuracy of the running state identification. (paper)
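
    A minimal sketch of assembling such a mixed-domain feature vector, assuming the PyWavelets package for the WPD energy spectrum and estimating AR coefficients with the Yule-Walker equations in plain NumPy; the specific features and parameters are illustrative, not the paper's exact set:

```python
import numpy as np
import pywt

def ar_coefficients(x, order=4):
    """Estimate AR model coefficients via the Yule-Walker equations."""
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order] / len(x)
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])

def mixed_domain_features(x, wavelet="db4", level=3):
    # Statistical features: mean, standard deviation, kurtosis.
    feats = [x.mean(), x.std(), ((x - x.mean()) ** 4).mean() / x.std() ** 4]
    feats.extend(ar_coefficients(x))                 # AR model coefficients
    env = np.abs(x)                                  # crude amplitude envelope
    p = env / env.sum()
    feats.append(-(p * np.log(p + 1e-12)).sum())     # amplitude Shannon entropy
    # WPD energy spectrum: relative energy of each terminal node.
    wp = pywt.WaveletPacket(x, wavelet, maxlevel=level)
    energies = np.array([np.sum(node.data ** 2)
                         for node in wp.get_level(level, "natural")])
    feats.extend(energies / energies.sum())
    return np.asarray(feats)

sig = np.sin(np.linspace(0, 40 * np.pi, 2048)) + 0.3 * np.random.randn(2048)
print(mixed_domain_features(sig).shape)
```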

  5. Procedure and information displays in advanced nuclear control rooms: experimental evaluation of an integrated design.

    Science.gov (United States)

    Chen, Yue; Gao, Qin; Song, Fei; Li, Zhizhong; Wang, Yufan

    2017-08-01

    In the main control rooms of nuclear power plants, operators frequently have to switch between procedure displays and system information displays. In this study, we proposed an operation-unit-based integrated design, which combines the two displays to facilitate the synthesis of information. We grouped actions that complete a single goal into operation units and showed these operation units on the displays of system states. In addition, we used different levels of visual salience to highlight the current unit and provided a list of execution history records. A laboratory experiment, with 42 students performing a simulated procedure to deal with unexpected high pressuriser level, was conducted to compare this design against an action-based integrated design and the existing separated-displays design. The results indicate that our operation-unit-based integrated design yields the best performance in terms of time and completion rate and helped more participants to detect unexpected system failures. Practitioner Summary: In current nuclear control rooms, operators frequently have to switch between procedure and system information displays. We developed an integrated design that incorporates procedure information into system displays. A laboratory study showed that the proposed design significantly improved participants' performance and increased the probability of detecting unexpected system failures.

  6. Assessing the benefits of the integration of location information in e-Government

    Science.gov (United States)

    Vandenbroucke, D.; Vancauwenberghe, G.; Crompvoets, J.

    2014-12-01

    Over the past years more and more geospatial data have been made readily accessible to different user communities as part of government efforts to set up Spatial Data Infrastructures. As a result, users from different sectors can search, find and bind spatial information and combine it with their own data resources and applications. However, too often, spatial data applications and services remain organised as separate silos, not well integrated in the business processes they are supposed to support. The European Union Location Framework (EULF), as part of the Interoperability Solutions for European Public Administrations (ISA) Programme of the EU (EC-DG DIGIT), aims to improve the integration of location information in e-Government processes through better policy and strategy alignment, and through improved legal, organisational, semantic and technical interoperability of data and systems. The EULF seeks to enhance interactions between governments, businesses and citizens with location information and location-enabled services and to make them part of the more generic ICT infrastructures of public administrations. One of the challenges that arises in this context is to describe, estimate or measure the benefits and added value of this integration of location information in e-Government. In the context of the EULF, several existing approaches to assess the benefits of spatially enabled services and applications in e-Government have been studied. Two examples will be presented, one from Denmark, the other from Abu Dhabi. Both served as input to the approach developed for the EULF. A concrete case of estimating benefits at the service and process level will be given, with the aim of answering questions such as "which indicators can be used and how can they be measured", "how can process owners collect the necessary information", "how can the benefit-attribution question be solved", and "how can findings be extrapolated from one level of analysis to another"?

  7. Formation of the integrated system of the marketing information as the tool of perfection of marketing activity of innovative high school

    OpenAIRE

    D. Bogdanov

    2014-01-01

    The author substantiates the necessity of forming an integrated marketing information system for an innovative high school, considers methodological aspects of the formation of such a system, develops its structure, and makes recommendations on managing its efficiency.

  8. Point Cloud Classification of Tesserae from Terrestrial Laser Data Combined with Dense Image Matching for Archaeological Information Extraction

    Science.gov (United States)

    Poux, F.; Neuville, R.; Billen, R.

    2017-08-01

    Reasoning from information extraction given by point cloud data mining allows contextual adaptation and fast decision making. However, to achieve this perceptive level, a point cloud must be semantically rich, retaining relevant information for the end user. This paper presents an automatic knowledge-based method for pre-processing multi-sensory data and classifying a hybrid point cloud from both terrestrial laser scanning and dense image matching. Using 18 features, including the sensors' biased data, each tessera in the high-density point cloud from the 3D-captured complex mosaics of Germigny-des-Prés (France) is segmented via a colour-based multi-scale abstraction extracting connectivity. A 2D surface and outline polygon of each tessera are generated by RANSAC plane extraction and convex-hull fitting. Knowledge is then used to classify every tessera based on its size, surface, shape, material properties and its neighbours' classes. The detection and semantic enrichment method shows promising results of 94% correct semantization, a first step toward the creation of an archaeological smart point cloud.
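
    The plane-and-outline step can be sketched in a few lines: a plain NumPy RANSAC plane fit followed by a convex-hull outline of the inlier points projected onto that plane. Thresholds and helper names are illustrative; the paper's pipeline involves more than this step:

```python
import numpy as np
from scipy.spatial import ConvexHull

def ransac_plane(points, n_iter=500, threshold=0.002, rng=None):
    """Fit a plane to a tessera's points by RANSAC; return (normal, d, inlier mask)."""
    rng = rng or np.random.default_rng(0)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = None
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:          # degenerate sample, skip
            continue
        normal /= norm
        d = -normal @ p0
        inliers = np.abs(points @ normal + d) < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (normal, d)
    return best_model[0], best_model[1], best_inliers

def outline_polygon(points, normal):
    """Project points onto the plane and return the convex-hull outline (2D)."""
    u = np.cross(normal, [0.0, 0.0, 1.0])
    if np.linalg.norm(u) < 1e-6:          # plane nearly horizontal
        u = np.cross(normal, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(normal, u)
    uv = np.c_[points @ u, points @ v]
    hull = ConvexHull(uv)
    return uv[hull.vertices]              # outline polygon of the tessera

pts = np.random.rand(300, 3) * [0.01, 0.01, 0.0005]   # a small, nearly flat patch
normal, d, inliers = ransac_plane(pts)
print(len(outline_polygon(pts[inliers], normal)), "outline vertices")
```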

  9. Cancer survival classification using integrated data sets and intermediate information.

    Science.gov (United States)

    Kim, Shinuk; Park, Taesung; Kon, Mark

    2014-09-01

    Although numerous studies related to cancer survival have been published, increasing the prediction accuracy of survival classes still remains a challenge. Integration of different data sets, such as microRNA (miRNA) and mRNA, might increase the accuracy of survival class prediction. Therefore, we suggested a machine learning (ML) approach to integrate different data sets, and developed a novel method based on feature selection with the Cox proportional hazard regression model (FSCOX) to improve the prediction of cancer survival time. FSCOX provides us with intermediate survival information, which is usually discarded when separating survival into 2 groups (short- and long-term), and allows us to perform survival analysis. We used an ML-based protocol for feature selection, integrating information from miRNA and mRNA expression profiles at the feature level. To predict survival phenotypes, we used the following classifiers: first, existing ML methods, support vector machine (SVM) and random forest (RF); second, a new median-based classifier using FSCOX (FSCOX_median); and third, an SVM classifier using FSCOX (FSCOX_SVM). We compared these methods using 3 types of cancer tissue data sets: (i) miRNA expression, (ii) mRNA expression, and (iii) combined miRNA and mRNA expression. The latter data set included features selected either from the combined miRNA/mRNA profile or independently from the miRNA and mRNA profiles (IFS). In the ovarian data set, the accuracy of survival classification using the combined miRNA/mRNA profiles with IFS was 75% using RF, 86.36% using SVM, 84.09% using FSCOX_median, and 88.64% using FSCOX_SVM with a balanced 22 short-term and 22 long-term survivor data set. These accuracies are higher than those using miRNA alone (70.45%, RF; 75%, SVM; 75%, FSCOX_median; and 75%, FSCOX_SVM) or mRNA alone (65.91%, RF; 63.64%, SVM; 72.73%, FSCOX_median; and 70.45%, FSCOX_SVM). Similarly in the glioblastoma multiforme data, the accuracy of miRNA/mRNA using IFS
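
    A rough sketch of the FSCOX idea (ranking features by univariate Cox proportional-hazards fits, then training a classifier on the selected features), assuming the lifelines and scikit-learn packages; the actual FSCOX procedure and data are more involved than this toy:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n, p = 80, 30
X = pd.DataFrame(rng.normal(size=(n, p)), columns=[f"g{i}" for i in range(p)])
time = rng.exponential(scale=np.exp(-X["g0"] - 0.5 * X["g1"]))  # g0, g1 informative
event = rng.random(n) < 0.8                                     # ~20% censored

# Univariate Cox fit per feature; keep the features with the smallest p-values.
pvals = {}
for col in X.columns:
    df = pd.DataFrame({col: X[col], "time": time, "event": event})
    cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
    pvals[col] = cph.summary.loc[col, "p"]
selected = sorted(pvals, key=pvals.get)[:5]
print("selected features:", selected)

# Dichotomize survival at the median time and classify with an SVM,
# mirroring the short-/long-term survivor split used in the paper.
y = (time > np.median(time)).astype(int)
clf = SVC(kernel="linear").fit(X[selected], y)
print("training accuracy:", clf.score(X[selected], y))
```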

  10. Critical Evaluation of Validation Rules Automated Extraction from Data

    Directory of Open Access Journals (Sweden)

    David Pejcoch

    2014-10-01

    Full Text Available The goal of this article is to critically evaluate the possibility of automatically extracting the kind of rules that could later be used within a Data Quality Management process to validate records newly arriving in an information system. For practical demonstration, the 4ft-Miner procedure implemented in the LISp-Miner system was chosen. The motivation for this task is the potential simplification of projects focused on Data Quality Management. The article first critically evaluates the possibility of fully automated extraction, with the aim of identifying the strengths and weaknesses of this approach in comparison to its alternative, where at least some a priori knowledge is available. As a result of a practical implementation, the article provides the design of a recommended process that can be used as a guideline for future projects. The questions of how to store and maintain the extracted rules and how to integrate them with existing tools supporting Data Quality Management are also discussed.
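
    In the spirit of association-rule procedures such as 4ft-Miner, the sketch below scans a table for high-confidence implications between attribute values that could serve as candidate validation rules. It is a pandas-based illustration of the idea only; LISp-Miner's actual procedure is different:

```python
from itertools import permutations

import pandas as pd

def candidate_rules(df, min_support=0.1, min_confidence=0.95):
    """Find attribute-value implications A=a -> B=b with high support/confidence."""
    rules = []
    n = len(df)
    for a_col, b_col in permutations(df.columns, 2):
        counts = df.groupby([a_col, b_col]).size()
        totals = df[a_col].value_counts()
        for (a_val, b_val), cnt in counts.items():
            support, confidence = cnt / n, cnt / totals[a_val]
            if support >= min_support and confidence >= min_confidence:
                rules.append((f"{a_col}={a_val}", f"{b_col}={b_val}",
                              support, confidence))
    return rules

records = pd.DataFrame({
    "country":  ["CZ", "CZ", "CZ", "DE", "DE"],
    "currency": ["CZK", "CZK", "CZK", "EUR", "EUR"],
})
for lhs, rhs, s, c in candidate_rules(records):
    print(f"IF {lhs} THEN {rhs}  (support={s:.2f}, confidence={c:.2f})")
```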

  11. SPECTRa-T: machine-based data extraction and semantic searching of chemistry e-theses.

    Science.gov (United States)

    Downing, Jim; Harvey, Matt J; Morgan, Peter B; Murray-Rust, Peter; Rzepa, Henry S; Stewart, Diana C; Tonge, Alan P; Townsend, Joe A

    2010-02-22

    The SPECTRa-T project has developed text-mining tools to extract named chemical entities (NCEs), such as chemical names and terms, and chemical objects (COs), e.g., experimental spectral assignments and physical chemistry properties, from electronic theses (e-theses). Although NCEs were readily identified within the two major document formats studied, only the use of structured documents enabled identification of chemical objects and their association with the relevant chemical entity (e.g., systematic chemical name). A corpus of theses was analyzed, and it is shown that a high degree of semantic information can be extracted from structured documents. This integrated information has been deposited in a persistent Resource Description Framework (RDF) triple-store that allows users to conduct semantic searches. The strengths and weaknesses of several document formats are reviewed.
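
    A minimal sketch of the deposit-and-query pattern, assuming the Python rdflib package and an invented namespace (the project's own RDF schema is not shown here): extracted entities are stored as triples and retrieved with a SPARQL query.

```python
from rdflib import Graph, Literal, Namespace, URIRef

EX = Namespace("http://example.org/spectra/")   # illustrative namespace

g = Graph()
thesis = URIRef(EX["thesis42"])
# Store an extracted named chemical entity and an associated spectral assignment.
g.add((thesis, EX.mentionsCompound, Literal("2,4-dinitrophenylhydrazine")))
g.add((thesis, EX.nmrShiftPpm, Literal(7.95)))

# Semantic search over the triple store: theses mentioning a given compound.
q = """
SELECT ?thesis ?shift WHERE {
    ?thesis <http://example.org/spectra/mentionsCompound> ?name .
    ?thesis <http://example.org/spectra/nmrShiftPpm> ?shift .
    FILTER (CONTAINS(STR(?name), "dinitrophenyl"))
}
"""
for row in g.query(q):
    print(row.thesis, row.shift)
```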

  12. Integrating Assessment into Recurring Information Literacy Instruction: A Case Study from LIS Education

    Science.gov (United States)

    Searing, Susan E.

    2007-01-01

    Information literacy instruction is integrated into the distance education program in library and information science (LEEP) at the University of Illinois, Urbana-Champaign (UIUC). This article describes the LEEP program and the library services provided to its students. Published research on LEEP and related topics in librarianship is reviewed.…

  13. Extraction of diffuse correlation spectroscopy flow index by integration of Nth-order linear model with Monte Carlo simulation

    Science.gov (United States)

    Shang, Yu; Li, Ting; Chen, Lei; Lin, Yu; Toborek, Michal; Yu, Guoqiang

    2014-05-01

    Conventional semi-infinite solutions for extracting the blood flow index (BFI) from diffuse correlation spectroscopy (DCS) measurements may cause errors in the estimation of BFI (αDB) in tissues with small volume and large curvature. We proposed an algorithm integrating an Nth-order linear model of the autocorrelation function with Monte Carlo simulation of photon migration in tissue for the extraction of αDB. The volume and geometry of the measured tissue were incorporated in the Monte Carlo simulation, which overcomes the semi-infinite restrictions. The algorithm was tested using computer simulations on four tissue models with varied volumes/geometries and applied to an in vivo stroke model of mouse. Computer simulations show that the high-order (N ≥ 5) linear algorithm was more accurate in extracting αDB; with noisy data, the values of errors in extracting αDB were similar to those reconstructed from the noise-free DCS data. In addition, the errors in extracting the relative changes of αDB using both the linear algorithm and the semi-infinite solution were fairly small (errors < ±2.0%) and did not rely on the tissue volume/geometry. The experimental results from the in vivo stroke mice agreed with those in simulations, demonstrating the robustness of the linear algorithm. DCS with the high-order linear algorithm shows potential for inter-subject comparison and longitudinal monitoring of absolute BFI in a variety of tissues/organs with different volumes/geometries.

  14. Sharing Service Resource Information for Application Integration in a Virtual Enterprise - Modeling the Communication Protocol for Exchanging Service Resource Information

    Science.gov (United States)

    Yamada, Hiroshi; Kawaguchi, Akira

    Grid computing and web service technologies enable us to use networked resources in a coordinated manner. An integrated service is made of individual services running on coordinated resources. In order to achieve such coordinated services autonomously, the initiator of a coordinated service needs to know detailed service resource information. This information ranges from static attributes like the IP address of the application server to highly dynamic ones like the CPU load. The most famous wide-area service discovery mechanism based on names is DNS. Its hierarchical tree organization and caching methods take advantage of the largely static nature of the information it manages. However, in order to integrate business applications in a virtual enterprise, we need a discovery mechanism to search for the optimal resources based on a given set of criteria (search keys). In this paper, we propose a communication protocol for exchanging service resource information among wide-area systems. We introduce the concept of the service domain, which consists of service providers managed under the same management policy; this concept is similar to that of autonomous systems (ASs). In each service domain, the service resource information provider manages the service resource information of the service providers that exist in this service domain, and exchanges this information with service resource information providers that belong to different service domains. We also verified the protocol's behavior and effectiveness using a simulation model developed for the proposed protocol.
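
    The kind of record such a protocol exchanges, and the criteria-based search it enables, can be sketched with a small data structure; class names and fields are illustrative, not the paper's message format:

```python
from dataclasses import dataclass, field

@dataclass
class ServiceResourceInfo:
    """One provider's advertisement, as exchanged between service domains."""
    service: str
    domain: str          # the service domain that manages this provider
    ip: str              # static attribute
    cpu_load: float      # highly dynamic attribute, refreshed periodically

@dataclass
class ServiceInfoProvider:
    domain: str
    records: list = field(default_factory=list)

    def merge(self, advertised):
        """Integrate records received from a peer service domain."""
        self.records.extend(advertised)

    def find(self, service, max_load):
        """Search for optimal resources given a set of criteria (search keys)."""
        hits = [r for r in self.records
                if r.service == service and r.cpu_load <= max_load]
        return min(hits, key=lambda r: r.cpu_load, default=None)

a = ServiceInfoProvider("domain-A",
                        [ServiceResourceInfo("billing", "domain-A", "10.0.0.5", 0.72)])
b = [ServiceResourceInfo("billing", "domain-B", "10.1.0.9", 0.31)]
a.merge(b)                      # inter-domain exchange
print(a.find("billing", max_load=0.5))
```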

  15. Text-in-context: a method for extracting findings in mixed-methods mixed research synthesis studies.

    Science.gov (United States)

    Sandelowski, Margarete; Leeman, Jennifer; Knafl, Kathleen; Crandell, Jamie L

    2013-06-01

    Our purpose in this paper is to propose a new method for extracting findings from research reports included in mixed-methods mixed research synthesis studies. International initiatives in the domains of systematic review and evidence synthesis have been focused on broadening the conceptualization of evidence, increased methodological inclusiveness and the production of evidence syntheses that will be accessible to and usable by a wider range of consumers. Initiatives in the general mixed-methods research field have been focused on developing truly integrative approaches to data analysis and interpretation. The data extraction challenges described here were encountered, and the method proposed for addressing these challenges was developed, in the first year of the ongoing (2011-2016) study: Mixed-Methods Synthesis of Research on Childhood Chronic Conditions and Family. To preserve the text-in-context of findings in research reports, we describe a method whereby findings are transformed into portable statements that anchor results to relevant information about sample, source of information, time, comparative reference point, magnitude and significance and study-specific conceptions of phenomena. The data extraction method featured here was developed specifically to accommodate mixed-methods mixed research synthesis studies conducted in nursing and other health sciences, but reviewers might find it useful in other kinds of research synthesis studies. This data extraction method itself constitutes a type of integration to preserve the methodological context of findings when statements are read individually and in comparison to each other. © 2012 Blackwell Publishing Ltd.

  16. Study on integrated evaluation of sandstone-hosted uranium metallogenic potential in southern Yili basin

    International Nuclear Information System (INIS)

    Han Shaoyang; Ke Dan; Xu Jianguo; Zheng Enjiu; Li Shengxiang

    2008-01-01

    Plenty of geological data have been accumulated during mineral resource surveys in China; under the guidance of new metallogenic theories, how to use these data most effectively in the new cycle of uranium surveys is an important task. In this paper, a workflow for establishing an integrated prospecting model for sandstone-hosted uranium deposits is put forward. Based on a study of the geologic, hydrogeologic and regional geophysical field characteristics of the representative uranium deposit No. 512 in the southern Yili basin, its multi-source information descriptive model has been established, from which 512-type integrated prospecting models of sandstone-hosted uranium orefields and deposits are summarized. According to the established integrated prospecting models, the extraction of metallogenic information on sandstone-hosted uranium deposits has been completed in the study area. Finally, an integrated quantitative evaluation of the sandstone-hosted uranium metallogenic potential is performed by using the weights-of-evidence method to integrate middle-scale multi-source metallogenic information in the southern Yili basin, and a good prediction effect is obtained. (authors)
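
    The weights-of-evidence integration step rests on two log-ratios per binary evidence layer, which can be sketched directly (on a toy grid, not the study's data):

```python
import numpy as np

def weights_of_evidence(evidence, deposits):
    """W+ and W- for one binary evidence layer against known deposit cells.

    evidence, deposits: boolean arrays over the same grid cells.
    W+ = ln[ P(B|D) / P(B|~D) ],  W- = ln[ P(~B|D) / P(~B|~D) ].
    """
    B, D = evidence.astype(bool), deposits.astype(bool)
    p_b_d   = (B & D).sum()   / max(D.sum(), 1)
    p_b_nd  = (B & ~D).sum()  / max((~D).sum(), 1)
    p_nb_d  = (~B & D).sum()  / max(D.sum(), 1)
    p_nb_nd = (~B & ~D).sum() / max((~D).sum(), 1)
    eps = 1e-9   # guard against log(0) on sparse layers
    return (np.log((p_b_d + eps) / (p_b_nd + eps)),
            np.log((p_nb_d + eps) / (p_nb_nd + eps)))

rng = np.random.default_rng(0)
deposits = rng.random(10000) < 0.01
evidence = deposits & (rng.random(10000) < 0.8) | (rng.random(10000) < 0.1)
print(weights_of_evidence(evidence, deposits))  # positive W+ -> favourable layer
```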

  17. Audio-Visual Speech Recognition Using Lip Information Extracted from Side-Face Images

    Directory of Open Access Journals (Sweden)

    Koji Iwano

    2007-03-01

    Full Text Available This paper proposes an audio-visual speech recognition method using lip information extracted from side-face images as an attempt to increase noise robustness in mobile environments. Our proposed method assumes that lip images can be captured using a small camera installed in a handset. Two different kinds of lip features, lip-contour geometric features and lip-motion velocity features, are used individually or jointly, in combination with audio features. Phoneme HMMs modeling the audio and visual features are built based on the multistream HMM technique. Experiments conducted using Japanese connected digit speech contaminated with white noise in various SNR conditions show effectiveness of the proposed method. Recognition accuracy is improved by using the visual information in all SNR conditions. These visual features were confirmed to be effective even when the audio HMM was adapted to noise by the MLLR method.

  18. All-source Information Management and Integration for Improved Collective Intelligence Production

    Science.gov (United States)

    2011-06-01

    Electronic Intelligence (ELINT) • Open Source Intelligence (OSINT) • Technical Intelligence (TECHINT). These intelligence disciplines produce... intelligence, measurement and signature intelligence, signals intelligence, and open-source data, in the production of intelligence. All-source intelligence... (All-Source Information Integration and Management) R&D Project... All-Source Intelligence

  19. The Next Step in Educational Program Budgets and Information Resource Management: Integrated Data Structures.

    Science.gov (United States)

    Jackowski, Edward M.

    1988-01-01

    Discusses the role that information resource management (IRM) plays in educational program-oriented budgeting (POB), and presents a theoretical IRM model. Highlights include design considerations for integrated data systems; database management systems (DBMS); and how POB data can be integrated to enhance its value and use within an educational…

  20. Measuring Integration of Information and Communication Technology in Education: An Item Response Modeling Approach

    Science.gov (United States)

    Peeraer, Jef; Van Petegem, Peter

    2012-01-01

    This research describes the development and validation of an instrument to measure integration of Information and Communication Technology (ICT) in education. After literature research on definitions of integration of ICT in education, a comparison is made between the classical test theory and the item response modeling approach for the…

  1. An API-based search system for one click access to information

    NARCIS (Netherlands)

    Ionita, Dan; Tax, Niek; Hiemstra, Djoerd

    This paper proposes a prototype One Click access system, based on previous work in the field and the related 1CLICK-2@NTCIR10 task. The proposed solution integrates methods from these into a three-tier algorithm: query categorization, information extraction and output generation, and offers suggestions on

  2. Semi-Automated Approach for Mapping Urban Trees from Integrated Aerial LiDAR Point Cloud and Digital Imagery Datasets

    Science.gov (United States)

    Dogon-Yaro, M. A.; Kumar, P.; Rahman, A. Abdul; Buyuksalih, G.

    2016-09-01

    Mapping of trees plays an important role in modern urban spatial data management, as many benefits and applications derive from such detailed and up-to-date data sources. Timely and accurate acquisition of information on the condition of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building strategies for sustainable development. The conventional techniques used for extracting trees include ground surveying and interpretation of aerial photography. However, these techniques are associated with constraints such as labour-intensive field work and high financial cost, which can be overcome by means of integrated LiDAR and digital image datasets. Compared to the predominant studies on tree extraction, which deal mainly with purely forested areas, this study concentrates on urban areas, which have a high structural complexity with a multitude of different objects. This paper presents a workflow for a semi-automated approach to extracting urban trees from integrated processing of airborne LiDAR point cloud and multispectral digital image datasets over the city of Istanbul, Turkey. The paper shows that the integrated datasets are a suitable technology and a viable source of information for urban tree management. In conclusion, the extracted information provides a snapshot of the location, composition and extent of trees in the study area, useful to city planners and other decision makers in order to understand how much canopy cover exists, identify new planting, removal, or reforestation opportunities, and determine which locations have the greatest need or potential to maximize the benefits of return on investment. It can also help track trends or changes in the urban trees over time and inform future management decisions.
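
    The core discrimination step in such LiDAR/imagery integration is often a joint height and greenness test: heights from a LiDAR-derived normalized DSM separate trees from grass, and NDVI from the multispectral bands separates vegetation from buildings. A minimal sketch with illustrative thresholds (not the paper's exact rules):

```python
import numpy as np

def tree_mask(ndsm, red, nir, min_height=2.0, ndvi_thresh=0.3):
    """Candidate urban-tree mask from LiDAR-derived heights and image bands.

    ndsm: normalized digital surface model (height above ground, metres),
          rasterized from the LiDAR point cloud.
    red, nir: co-registered multispectral bands from the digital imagery.
    Vegetation is separated from buildings by NDVI, and from grass by height.
    """
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
    return (ndsm >= min_height) & (ndvi >= ndvi_thresh)

# Tiny synthetic tile: a 5 m "building" (low NDVI) next to a 6 m "tree".
ndsm = np.array([[5.0, 6.0], [0.2, 0.1]])
red  = np.array([[0.40, 0.10], [0.20, 0.20]])
nir  = np.array([[0.45, 0.60], [0.50, 0.50]])
print(tree_mask(ndsm, red, nir))   # only the tall, high-NDVI cell is True
```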

  3. Comparison and improvements of different Bayesian procedures to integrate external information into genetic evaluations

    NARCIS (Netherlands)

    Vandenplas, J.; Gengler, N.

    2012-01-01

    The aim of this research was to compare different Bayesian procedures to integrate information from outside a given evaluation system, hereafter called external information, and in this context estimated breeding values (EBV), into this genetic evaluation, hereafter called internal evaluation, and

  4. Organisational Culture Matters for System Integration in Health Care

    Science.gov (United States)

    Munir, Samina K.; Kay, Stephen

    2003-01-01

    This paper illustrates the importance of organisational culture for Clinical Information Systems (CIS) integration. The study is based on data collected in intensive care units in the UK and Denmark. Data were collected using qualitative methods, i.e., observations, interviews and shadowing of health care providers, together with a questionnaire at each site. The data are analysed to extract salient variables for CIS integration, and it is shown that these variables can be separated into two categories that describe the ‘Actual Usefulness’ of the system and the ‘Organisational Culture’. This model is then extended to show that CIS integration directly affects the work processes of the organisation, forming an iterative process of change as a CIS is introduced and integrated. PMID:14728220

  5. Architecture and data processing alternatives for the TSE computer. Volume 2: Extraction of topological information from an image by the Tse computer

    Science.gov (United States)

    Jones, J. R.; Bodenheimer, R. E.

    1976-01-01

    A simple programmable Tse processor organization and the arithmetic operations necessary for extraction of the desired topological information are described. Hardware additions to this organization are discussed along with trade-offs peculiar to the Tse computing concept. An improved organization is presented along with the complementary software for the various arithmetic operations. The performance of the two organizations is compared in terms of speed, power, and cost. Software routines developed to extract the desired information from an image are included.

  6. An integrative review of information systems and terminologies used in local health departments.

    Science.gov (United States)

    Olsen, Jeanette; Baisch, Mary Jo

    2014-02-01

    The purpose of this integrative review based on the published literature was to identify information systems currently being used by local health departments and to determine the extent to which standard terminology was used to communicate data, interventions, and outcomes to improve public health informatics at the local health department (LHD) level and better inform research, policy, and programs. Whittemore and Knafl's integrative review methodology was used. Data were obtained through key word searches of three publication databases and reference lists of retrieved articles and consulting with experts to identify landmark works. The final sample included 45 articles analyzed and synthesized using the matrix method. The results indicated a wide array of information systems were used by LHDs and supported diverse functions aligned with five categories: administration; surveillance; health records; registries; and consumer resources. Detail regarding specific programs being used, location or extent of use, or effectiveness was lacking. The synthesis indicated evidence of growing interest in health information exchange groups, yet few studies described use of data standards or standard terminology in LHDs. Research to address these gaps is needed to provide current, meaningful data that inform public health informatics research, policy, and initiatives at and across the LHD level. Coordination at a state or national level is recommended to collect information efficiently about LHD information systems that will inform improvements while minimizing duplication of efforts and financial burden. Until this happens, efforts to strengthen LHD information systems and policies may be significantly challenged.

  7. Information Extraction for Social Media

    NARCIS (Netherlands)

    Habib, M. B.; Keulen, M. van

    2014-01-01

    The rapid growth in IT in the last two decades has led to a growth in the amount of information available online. A new style for sharing information is social media. Social media is a continuously instantly updated source of information. In this position paper, we propose a framework for

  8. A construction scheme of web page comment information extraction system based on frequent subtree mining

    Science.gov (United States)

    Zhang, Xiaowen; Chen, Bingfeng

    2017-08-01

    Based on a frequent subtree mining algorithm, this paper proposes a construction scheme for a web page comment information extraction system, referred to as the FSM system. The overall system architecture and its modules are briefly introduced, the core of the system is then described in detail, and finally a system prototype is given.
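
    One way to approximate the idea is to hash each element's subtree shape and keep the shapes that repeat often, since comment blocks on a page share a template. A sketch assuming the BeautifulSoup package (the paper's frequent-subtree algorithm is more general):

```python
from collections import Counter

from bs4 import BeautifulSoup

def subtree_signature(tag, depth=2):
    """Structural signature of a tag: its name plus child names, to a small depth."""
    if depth == 0 or not hasattr(tag, "find_all"):
        return tag.name or ""
    children = [subtree_signature(c, depth - 1) for c in tag.find_all(recursive=False)]
    return f"{tag.name}({','.join(children)})"

def frequent_blocks(html, min_count=3):
    """Return elements whose subtree shape repeats often: comment-block candidates."""
    soup = BeautifulSoup(html, "html.parser")
    sigs = Counter(subtree_signature(t) for t in soup.find_all(True))
    # Keep only composite shapes (at least one nested child) seen min_count times.
    frequent = {s for s, n in sigs.items() if n >= min_count and s.count("(") > 1}
    return [t for t in soup.find_all(True) if subtree_signature(t) in frequent]

html = "<div>" + "".join(
    f'<div class="c"><span>user{i}</span><p>comment {i}</p></div>' for i in range(4)
) + "</div>"
for block in frequent_blocks(html):
    print(block.get_text(" ", strip=True))
```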

  9. Flavored drinks obtained from extracts of broken rice, brown rice and soy [Bebidas saborizadas obtidas de extratos de quirera de arroz, de arroz integral e de soja]

    Directory of Open Access Journals (Sweden)

    Manoel Soares Soares Junior

    2010-04-01

    Full Text Available The aim of this work was to develop drinks based on extracts of broken rice and brown rice and to compare their chemical and sensory characteristics with those of a drink made from soy extract. A completely randomized design was applied, with three treatments (drinks made from broken rice extract, brown rice extract and soy extract). The following analyses were performed: moisture, ash, protein, lipids, total carbohydrates, energy value, calcium, magnesium, copper, manganese, iron and zinc, besides the determination of consumer acceptability and buying intention. The soy-based drink has the highest ash, protein, lipid and mineral contents, while the broken rice drink has the highest carbohydrate content and energy value. All drinks were well accepted, with buying intention above 95% among the interviewed population; more than 99% of the tasters would buy the drink made from brown rice extract (the most accepted treatment). Drinks made from brown rice or broken rice extract are a viable alternative for people with intolerance to the lactose of animal milk and/or allergy to soy proteins.

  10. SIDECACHE: Information access, management and dissemination framework for web services.

    Science.gov (United States)

    Doderer, Mark S; Burkhardt, Cory; Robbins, Kay A

    2011-06-14

    Many bioinformatics algorithms and data sets are deployed using web services so that the results can be explored via the Internet and easily integrated into other tools and services. These services often include data from other sites that is accessed either dynamically or through file downloads. Developers of these services face several problems because of the dynamic nature of the information from the upstream services. Many publicly available repositories of bioinformatics data frequently update their information. When such an update occurs, the developers of the downstream service may also need to update. For file downloads, this process is typically performed manually, followed by a web service restart. Requests for information obtained by dynamic access of upstream sources are sometimes subject to rate restrictions. SideCache provides a framework for deploying web services that integrate information extracted from other databases and from web sources that are periodically updated. This situation occurs frequently in biotechnology, where new information is being continuously generated and the latest information is important. SideCache provides several types of services including proxy access and rate control, local caching, and automatic web service updating. We have used the SideCache framework to automate the deployment and updating of a number of bioinformatics web services and tools that extract information from remote primary sources such as NCBI, NCIBI, and Ensembl. The SideCache framework has also been used to share research results through the use of a SideCache-derived web service.
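
    The local-caching behaviour such a framework provides can be sketched as a tiny TTL proxy, here with Flask and requests; the route, TTL policy and structure are illustrative, not SideCache's actual implementation:

```python
import time

import requests
from flask import Flask, Response

app = Flask(__name__)
CACHE = {}            # upstream URL -> (fetch time, body)
TTL = 3600            # refresh upstream data once per hour (illustrative policy)

@app.route("/proxy/<path:upstream>")
def proxy(upstream):
    """Serve an upstream resource from the local cache, refreshing it when stale.

    A production deployment would add rate control, scheduled updates and an
    allow-list of upstream hosts; this sketch shows only the caching behaviour.
    """
    url = "https://" + upstream
    cached = CACHE.get(url)
    if cached is None or time.time() - cached[0] > TTL:
        body = requests.get(url, timeout=30).content
        CACHE[url] = (time.time(), body)
    return Response(CACHE[url][1])

if __name__ == "__main__":
    app.run(port=8080)
```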

  11. Digital image integration technique of multi-geoscience information dominated by aerial radiometric measurements

    International Nuclear Information System (INIS)

    Liu Dechang; Sun Maorong; Zhu Deling; Zhang Jingbo; He Jianguo; Dong Xiuzhen

    1992-02-01

    The geologic metallogenetic environment of uranium in the Lian Shan Guan region has been studied by using a digital image integration technique that combines multi-geoscience information with aerial radiometric measurements. The work includes the classification of uranium-bearing rock, the recognition of ore-forming patterns and geologic mapping of the ore field. Some new tectonic structures were found in this region, which give significant information for further exploration of uranium ore. After multi-parameter screening of the aerial radiometric data, pattern recognition and multi-geoscience information integration analysis, four prospective metallogenetic zones were predicted, and the prediction was confirmed by further geologic survey. Three of the four zones are very encouraging: there, ore-forming structures, hydrothermal deposits, wall-rock alteration, primary and secondary uranium ore and rich uranium mineralization were discovered. The geologic exploration department has decided that these zones will be given priority in further uranium prospecting.

  12. Effects of Brief Integrated Information Literacy Education Sessions on Undergraduate Engineering Students' Interdisciplinary Research

    Science.gov (United States)

    Talikka, Marja; Soukka, Risto; Eskelinen, Harri

    2018-01-01

    Engineering students often conduct information searches without sufficient consideration of the context of their research topic. This article discusses how development of a new information literacy (IL) mindset through instruction in integrated IL education affects students' understanding of research problems and formulation of information search…

  13. Telematics and smart cards in integrated health information system.

    Science.gov (United States)

    Sicurello, F; Nicolosi, A

    1997-01-01

    Telematics and information technology are the base on which it will be possible to build an integrated health information system to support the population and improve their quality of life. This system should be based on record linkage of all data arising from the interactions of patients with health structures, such as general practitioners, specialists, health institutes and hospitals, pharmacies, etc. The record linkage can provide the connection and integration of various records, thanks to the use of telematic technology (either urban or geographical local networks, such as the Internet) and electronic data cards. Particular emphasis should be placed on the introduction of smart cards, such as portable health cards, which will contain a standardized data set sufficient to access different databases found in various health services. The interoperability of the social-health records (including multimedia types) and the smart cards (one of the most important prerequisites for the homogenization and wide diffusion of these cards at a European level) should be strongly taken into consideration. In this framework a project is going to be developed aiming at the integration of various territorially distributed databases, from the reading of the software and the updating of the smart cards to the complete management of patients' evaluation records, the quality of the services offered and health planning. The applications developed will support epidemiological investigation software and data analysis. The interconnection of all the databases of the various structures involved will take place through a coordination center, the most important system of which we will call "record linkage" or "integrated database". Smart cards will be distributed to a sample group of possible users and the necessary smart card management tools will be installed in all the structures involved. All the final users (the patients) in the whole

  14. 76 FR 63941 - Agency Information Collection Activities: Business Transformation-Automated Integrated Operating...

    Science.gov (United States)

    2011-10-14

    ... Activities: Business Transformation--Automated Integrated Operating Environment (IOE), New Information... Internet by federal agencies through efforts like USCIS' Business Transformation initiative. The USCIS ELIS... the USCIS Business Transformation initiative and wizard technology. The supporting statement can be...

  15. Comparing the neural basis of monetary reward and cognitive feedback during information-integration category learning.

    Science.gov (United States)

    Daniel, Reka; Pollmann, Stefan

    2010-01-06

    The dopaminergic system is known to play a central role in reward-based learning (Schultz, 2006), yet it was also observed to be involved when only cognitive feedback is given (Aron et al., 2004). Within the domain of information-integration category learning, in which information from several stimulus dimensions has to be integrated predecisionally (Ashby and Maddox, 2005), the importance of contingent feedback is well established (Maddox et al., 2003). We examined the common neural correlates of reward anticipation and prediction error in this task. Sixteen subjects performed two parallel information-integration tasks within a single event-related functional magnetic resonance imaging session but received a monetary reward only for one of them. Similar functional areas including basal ganglia structures were activated in both task versions. In contrast, a single structure, the nucleus accumbens, showed higher activation during monetary reward anticipation compared with the anticipation of cognitive feedback in information-integration learning. Additionally, this activation was predicted by measures of intrinsic motivation in the cognitive feedback task and by measures of extrinsic motivation in the rewarded task. Our results indicate that, although all other structures implicated in category learning are not significantly affected by altering the type of reward, the nucleus accumbens responds to the positive incentive properties of an expected reward depending on the specific type of the reward.

  16. Information Technologies and Supply Chain Integration

    DEFF Research Database (Denmark)

    Lemoine, W; Mortensen, Ole

    The goal of the Supply Chain Management process is to create value for customers, stakeholders and all supply chain members, through the integration of different processes like manufacturing flow management, customer service and order fulfillment. However, many firms fail in the path of achieving integration. This study illustrates, from an empirical point of view, the problems associated with SC integration among European firms operating in global/international markets. The focus is on the relationship between two echelons in the supply chain: manufacturers and their transport and logistics service providers (TLSPs). Our results show that the current business integration practices between manufacturers and TLSPs are primarily restricted to some sub-processes in three key SC processes: customer service management, order fulfillment and backwards logistics. The use of IT tools to support the integration has

  17. Medicaid Analytic eXtract (MAX) General Information

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Analytic eXtract (MAX) data is a set of person-level data files on Medicaid eligibility, service utilization, and payments. The MAX data are created to...

  18. Integrating Decentralized Indoor Evacuation with Information Depositories in the Field

    Directory of Open Access Journals (Sweden)

    Haifeng Zhao

    2017-07-01

    Full Text Available The lonelier evacuees find themselves, the riskier their wayfinding decisions become. This research supports single evacuees in a dynamically changing environment with risk-aware guidance. It deploys the concept of decentralized evacuation, where evacuees are guided by smartphones that acquire environmental knowledge and risk information via exploration and knowledge sharing by peer-to-peer communication. Peer-to-peer communication, however, relies on the chance that people come into communication range with each other, and this chance can be low. To bridge between people who are not at the same place at the same time, this paper suggests information depositories at strategic locations to improve information sharing. Information depositories collect the knowledge acquired by the smartphones of evacuees passing by, maintain this information, and convey it to other passing-by evacuees. Multi-agent simulation implementing these depositories in an indoor environment shows that integrating depositories improves evacuation performance: it enhances risk awareness, and consequently increases the chance that people survive and reduces their evacuation time. For evacuating dynamic events, deploying depositories at staircases has been shown to be more effective than deploying them in corridors.
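
    The depository's merge rule can be sketched as keeping, per location, the freshest observation from every passing phone, so later evacuees inherit what earlier ones saw without ever meeting them; class and field names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Observation:
    location: str     # e.g. a corridor segment or door id
    risk: float       # 0 = safe .. 1 = blocked
    timestamp: float  # when the evacuee's phone observed it

class Depository:
    """Fixed node at a staircase or corridor that relays environmental knowledge."""

    def __init__(self):
        self.knowledge = {}   # location -> freshest Observation

    def exchange(self, phone_obs):
        """Merge a passing phone's observations; return the depository's view.

        Both sides keep, per location, the most recent observation, so later
        evacuees inherit what earlier ones saw even if they never met.
        """
        for obs in phone_obs:
            known = self.knowledge.get(obs.location)
            if known is None or obs.timestamp > known.timestamp:
                self.knowledge[obs.location] = obs
        return list(self.knowledge.values())

d = Depository()
d.exchange([Observation("corridor-3", 0.9, 10.0)])          # first evacuee: fire seen
view = d.exchange([Observation("corridor-3", 0.0, 5.0)])    # stale 'safe' report ignored
print(view[0].risk)   # 0.9: the fresher, riskier observation wins
```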

  19. Instantaneous Shoreline Extraction Utilizing Integrated Spectrum and Shadow Analysis From LiDAR Data and High-resolution Satellite Imagery

    Science.gov (United States)

    Lee, I.-Chieh

    Shoreline delineation and shoreline change detection are expensive processes, both in data source acquisition and in manual shoreline delineation. These costs limit the frequency and interval of shoreline mapping. In this dissertation, a new shoreline delineation approach was developed, aimed at lowering the data source cost and reducing human labor. To lower the cost of data sources, we used public-domain LiDAR data sets and satellite images to delineate shorelines without the requirement that the data sets be acquired simultaneously, which is a new concept in this field. To reduce the labor cost, we made improvements in classifying LiDAR points and satellite images. Analyzing shadow relations with topography to improve satellite image classification performance is also a brand-new concept. The extracted shoreline of the proposed approach could achieve an accuracy of 1.495 m RMSE, or 4.452 m at the 95% confidence level. Consequently, the proposed approach could successfully lower the cost and shorten the processing time, in other words, increase the shoreline mapping frequency with reasonable accuracy. However, the extracted shoreline may not compete in accuracy with a shoreline extracted by aerial photogrammetric procedures; hence, this is a trade-off between cost and accuracy. This approach consists of three phases: first, a shoreline extraction procedure based mainly on LiDAR point cloud data with multispectral information from satellite images; second, an object-oriented shoreline extraction procedure to delineate the shoreline solely from satellite images, in this case WorldView-2 images; third, a shoreline integration procedure combining these two shorelines based on actual shoreline changes and physical terrain properties. The actual data source cost would only be from the acquisition of satellite images. On the other hand, only two processes needed human attention. First, the shoreline within harbor areas needed to be
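
    The instantaneous land/water boundary from a LiDAR-derived elevation grid can be sketched as a contour trace at the water level, here with scikit-image; this is a generic illustration, not the dissertation's three-phase pipeline:

```python
import numpy as np
from skimage import measure

def shoreline_from_dem(dem, water_level=0.0):
    """Trace the land/water boundary as the DEM contour at the water level.

    dem: gridded elevations derived from the LiDAR point cloud (metres,
    relative to the tidal datum). Returns a list of (row, col) polylines.
    """
    return measure.find_contours(dem, level=water_level)

# Synthetic beach sloping from -1 m (sea) up to +1 m (land).
x = np.linspace(-1.0, 1.0, 50)
dem = np.tile(x, (50, 1))
contours = shoreline_from_dem(dem)
print(len(contours), "shoreline segment(s);", len(contours[0]), "vertices")
```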

  20. Diablo Canyon plant information management system and integrated communication system

    International Nuclear Information System (INIS)

    Stanley, J.W.; Groff, C.

    1990-01-01

    The implementation of a comprehensive maintenance system called the plant information management system (PIMS) at the Diablo Canyon plant, together with its associated integrated communication system (ICS), is widely regarded as the most comprehensive undertaking of its kind in the nuclear industry. This paper provides an overview of the program at Diablo Canyon, an evaluation of system benefits, and highlights the future course of PIMS