WorldWideScience

Sample records for integration information extraction

  1. Information Extraction, Data Integration, and Uncertain Data Management: The State of The Art

    NARCIS (Netherlands)

    Habib, Mena Badieh; van Keulen, Maurice

    2011-01-01

    Information extraction, data integration, and uncertain data management are different areas of research that have received vast attention in the last two decades. Many researchers have tackled those areas individually. However, information extraction systems should be integrated with data integration

  2. Integrating Information Extraction Agents into a Tourism Recommender System

    Science.gov (United States)

    Esparcia, Sergio; Sánchez-Anguix, Víctor; Argente, Estefanía; García-Fornes, Ana; Julián, Vicente

    Recommender systems face some problems. On the one hand, information needs to be kept up to date, which can be a costly task if it is not performed automatically. On the other hand, it may be worthwhile to include third-party services in the recommendation, since they improve its quality. In this paper, we present an add-on for the Social-Net Tourism Recommender System that uses information extraction and natural language processing techniques in order to automatically extract and classify information from the Web. Its goal is to keep the system updated and to obtain information about third-party services that are not offered by service providers inside the system.

  3. An Accurate Integral Method for Vibration Signal Based on Feature Information Extraction

    Directory of Open Access Journals (Sweden)

    Yong Zhu

    2015-01-01

    After summarizing the advantages and disadvantages of current integral methods, a novel vibration-signal integral method based on feature information extraction is proposed. The method takes full advantage of the self-adaptive filtering and waveform-correction properties of ensemble empirical mode decomposition (EEMD) in dealing with nonlinear and nonstationary signals, and merges the strengths of kurtosis, mean square error, energy, and singular value decomposition for signal feature extraction. The values of these four indexes are combined into a feature vector. The connotative characteristic components in the vibration signal are then accurately extracted by a Euclidean distance search, and the desired integral signals are precisely reconstructed. With this method, the interference from invalid signal content such as trend items and noise, which plagues traditional methods, is largely eliminated; the great cumulative error of traditional time-domain integration is effectively overcome; and the large low-frequency error of traditional frequency-domain integration is successfully avoided. Compared with traditional integral methods, this method is outstanding at removing noise while retaining useful feature information, and shows higher accuracy and superiority.
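
    A minimal sketch (Python/numpy) of the selection-and-integration idea above, assuming the EEMD decomposition has already produced an array of intrinsic mode functions; the exact formulations of the four indexes, the reference feature vector, and the integration step are illustrative assumptions, not the paper's definitions:

    ```python
    import numpy as np

    def feature_vector(x):
        """The four indexes named in the abstract: kurtosis, mean square
        error, energy, and a leading singular value (our formulations)."""
        kurt = np.mean((x - x.mean()) ** 4) / (np.var(x) ** 2 + 1e-12)
        mse = np.mean((x - x.mean()) ** 2)
        energy = np.sum(x ** 2)
        H = np.lib.stride_tricks.sliding_window_view(x, 64)  # delay embedding
        sv = np.linalg.svd(H, compute_uv=False)[0]
        return np.array([kurt, mse, energy, sv])

    def integrate_selected(imfs, fs, ref, k=3):
        """Keep the k IMFs whose normalized feature vectors are closest to a
        reference vector (Euclidean distance search), then integrate the
        reconstructed acceleration once to velocity (trapezoidal rule)."""
        feats = np.array([feature_vector(c) for c in imfs])
        feats = (feats - feats.mean(0)) / (feats.std(0) + 1e-12)
        keep = np.argsort(np.linalg.norm(feats - ref, axis=1))[:k]
        accel = imfs[keep].sum(axis=0)  # de-noised, de-trended signal
        vel = np.concatenate([[0.0], np.cumsum((accel[1:] + accel[:-1]) / 2) / fs])
        return vel

    imfs = np.random.default_rng(0).normal(size=(6, 2048))  # stand-in IMFs
    print(integrate_selected(imfs, fs=1000.0, ref=np.zeros(4))[:5])
    ```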

  4. Terrain Extraction by Integrating Terrestrial Laser Scanner Data and Spectral Information

    Science.gov (United States)

    Lau, C. L.; Halim, S.; Zulkepli, M.; Azwan, A. M.; Tang, W. L.; Chong, A. K.

    2015-10-01

    The extraction of true terrain points from unstructured laser point cloud data is an important process for producing an accurate digital terrain model (DTM). However, most spatial filtering methods utilize only the geometrical data to discriminate terrain points from non-terrain points. Point cloud filtering can also be improved by using the spectral information available with some scanners. Therefore, the objective of this study is to investigate the effectiveness of using the three channels (red, green and blue) of the colour image captured by the built-in digital camera available in some terrestrial laser scanners (TLS) for terrain extraction. In this study, data acquisition was conducted at a mini replica landscape at Universiti Teknologi Malaysia (UTM), Skudai campus, using a Leica ScanStation C10. The spectral information of the coloured point clouds from selected sample classes is extracted for spectral analysis. Coloured points that fall within the corresponding preset spectral thresholds are identified as belonging to that specific feature class. This terrain extraction process is implemented in Matlab. Results demonstrate that a passive image of higher spectral resolution is required to improve the output, because the low quality of the colour images captured by the sensor contributes to low separability in spectral reflectance. In conclusion, this study shows that spectral information can be used as a parameter for terrain extraction.
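
    A small sketch of the preset-threshold classification step, assuming the coloured point cloud is already available as numpy arrays; the brownish thresholds below are made-up stand-ins for values derived from the sampled feature classes (the study itself used Matlab):

    ```python
    import numpy as np

    def spectral_filter(points, rgb, lo, hi):
        """Keep points whose colour falls inside per-channel [lo, hi] bounds,
        i.e. the preset spectral threshold for one feature class."""
        mask = np.all((rgb >= lo) & (rgb <= hi), axis=1)
        return points[mask]

    rng = np.random.default_rng(0)
    pts = rng.random((10_000, 3))               # x, y, z from the scanner
    cols = rng.integers(0, 256, (10_000, 3))    # R, G, B from the built-in camera
    # Made-up thresholds standing in for a learned terrain class.
    terrain = spectral_filter(pts, cols, lo=(90, 60, 40), hi=(180, 140, 110))
    print(len(terrain), "candidate terrain points")
    ```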

  5. Information extraction system

    Science.gov (United States)

    Lemmond, Tracy D; Hanley, William G; Guensche, Joseph Wendell; Perry, Nathan C; Nitao, John J; Kidwell, Paul Brandon; Boakye, Kofi Agyeman; Glaser, Ron E; Prenger, Ryan James

    2014-05-13

    An information extraction system and methods of operating the system are provided. In particular, an information extraction system for performing meta-extraction of named entities of people, organizations, and locations, as well as relationships and events, from text documents is described herein.

  6. Multimedia Information Extraction

    CERN Document Server

    Maybury, Mark T

    2012-01-01

    The advent of increasingly large consumer collections of audio (e.g., iTunes), imagery (e.g., Flickr), and video (e.g., YouTube) is driving a need not only for multimedia retrieval but also for information extraction from and across media. Furthermore, industrial and government collections fuel requirements for stock media access, media preservation, broadcast news retrieval, identity management, and video surveillance. While significant advances have been made in language processing for information extraction from unstructured multilingual text and extraction of objects from imagery and vid…

  7. CLASSIFICATION OF INFORMAL SETTLEMENTS THROUGH THE INTEGRATION OF 2D AND 3D FEATURES EXTRACTED FROM UAV DATA

    Directory of Open Access Journals (Sweden)

    C. M. Gevaert

    2016-06-01

    Unmanned Aerial Vehicles (UAVs) are capable of providing very high resolution and up-to-date information to support informal settlement upgrading projects. In order to provide accurate basemaps, urban scene understanding through the identification and classification of buildings and terrain is imperative. However, common characteristics of informal settlements such as small, irregular buildings with heterogeneous roof material and a large presence of clutter challenge state-of-the-art algorithms. Especially the dense buildings and steeply sloped terrain cause difficulties in identifying elevated objects. This work investigates how 2D radiometric and textural features, 2.5D topographic features, and 3D geometric features obtained from UAV imagery can be integrated to obtain a high classification accuracy in challenging classification problems for the analysis of informal settlements. It compares the utility of pixel-based and segment-based features obtained from an orthomosaic and DSM with point-based and segment-based features extracted from the point cloud to classify an unplanned settlement in Kigali, Rwanda. Findings show that the integration of 2D and 3D features leads to higher classification accuracies.
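
    A hedged sketch of the feature-integration step: per-segment 2D, 2.5D, and 3D feature sets concatenated before a standard classifier. The dimensions, the random stand-in data, and the choice of random forest are assumptions for illustration, not the paper's exact pipeline:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    n = 500
    radiometric = rng.random((n, 3))   # 2D: per-segment colour statistics
    textural    = rng.random((n, 8))   # 2D: texture measures from the orthomosaic
    topographic = rng.random((n, 2))   # 2.5D: e.g. DSM height above ground
    geometric   = rng.random((n, 4))   # 3D: e.g. point-cloud planarity, normals
    y = rng.integers(0, 3, n)          # building / terrain / clutter labels

    # "Integration" here is plain concatenation of the feature sets before
    # classification; the classifier then weighs 2D against 3D evidence.
    X = np.hstack([radiometric, textural, topographic, geometric])
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    print(clf.score(X, y))
    ```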

  8. Challenges in Managing Information Extraction

    Science.gov (United States)

    Shen, Warren H.

    2009-01-01

    This dissertation studies information extraction (IE), the problem of extracting structured information from unstructured data. Example IE tasks include extracting person names from news articles, product information from e-commerce Web pages, street addresses from emails, and names of emerging music bands from blogs. IE is an increasingly…

  9. Scenario Customization for Information Extraction

    National Research Council Canada - National Science Library

    Yangarber, Roman

    2001-01-01

    Information Extraction (IE) is an emerging NLP technology, whose function is to process unstructured, natural language text, to locate specific pieces of information, or facts, in the text, and to use these facts to fill a database...

  10. What constitutes information integrity?

    Directory of Open Access Journals (Sweden)

    S. Flowerday

    2008-01-01

    This research focused on what constitutes information integrity, as this is a problem facing companies today. Moreover, information integrity is a pillar of information security and is required in order to have a sound security management programme. However, it is acknowledged that 100% information integrity is not currently achievable due to various limitations, and therefore the auditing concept of reasonable assurance is adopted. This is in line with the concept that 100% information security is not achievable and the notion that adequate security is the goal, using appropriate countermeasures. The main contribution of this article is to illustrate the importance of and provide a macro view of what constitutes information integrity. The findings are in harmony with Samuel Johnson's words (1751): 'Integrity without knowledge is weak and useless, and knowledge without integrity is dangerous and dreadful.'

  12. Advanced integrated solvent extraction systems

    Energy Technology Data Exchange (ETDEWEB)

    Horwitz, E.P.; Dietz, M.L.; Leonard, R.A. [Argonne National Lab., IL (United States)

    1997-10-01

    Advanced integrated solvent extraction systems are a series of novel solvent extraction (SX) processes that will remove and recover all of the major radioisotopes from acidic-dissolved sludge or other acidic high-level wastes. The major focus of this effort during the last 2 years has been the development of a combined cesium-strontium extraction/recovery process, the Combined CSEX-SREX Process. The Combined CSEX-SREX Process relies on a mixture of a strontium-selective macrocyclic polyether and a novel cesium-selective extractant based on dibenzo 18-crown-6. The process offers several potential advantages over possible alternatives in a chemical processing scheme for high-level waste treatment. First, if the process is applied as the first step in chemical pretreatment, the radiation level for all subsequent processing steps (e.g., transuranic extraction/recovery, or TRUEX) will be significantly reduced. Thus, less costly shielding would be required. The second advantage of the Combined CSEX-SREX Process is that the recovered Cs-Sr fraction is non-transuranic, and therefore will decay to low-level waste after only a few hundred years. Finally, combining individual processes into a single process will reduce the amount of equipment required to pretreat the waste and therefore reduce the size and cost of the waste processing facility. In an ongoing collaboration with Lockheed Martin Idaho Technology Company (LMITCO), the authors have successfully tested various segments of the Advanced Integrated Solvent Extraction Systems. Eichrom Industries, Inc. (Darien, IL) synthesizes and markets the Sr extractant and can supply the Cs extractant on a limited basis. Plans are under way to perform a test of the Combined CSEX-SREX Process with real waste at LMITCO in the near future.

  13. Integrated Information Management (IIM)

    National Research Council Canada - National Science Library

    McIlvain, Jason

    2007-01-01

    Information Technology is the core capability required to align our resources and increase our effectiveness on the battlefield by integrating and coordinating our preventative measures and responses...

  14. Integrated inventory information system

    Digital Repository Service at National Institute of Oceanography (India)

    Sarupria, J.S.; Kunte, P.D.

    The nature of oceanographic data and the management of inventory level information are described in Integrated Inventory Information System (IIIS). It is shown how a ROSCOPO (report on observations/samples collected during oceanographic programme...

  15. Extracting useful information from images

    DEFF Research Database (Denmark)

    Kucheryavskiy, Sergey

    2011-01-01

    The paper presents an overview of methods for extracting useful information from digital images. It covers various approaches that utilize different properties of images, like intensity distribution, spatial frequency content and several others. A few case studies including isotropic and heterogeneous…

  16. Extracting the Beat: An Experience-dependent Complex Integration of Multisensory Information Involving Multiple Levels of the Nervous System

    Directory of Open Access Journals (Sweden)

    Laurel J. Trainor

    2009-04-01

    In a series of studies we have shown that movement (or vestibular stimulation) that is synchronized to every second or every third beat of a metrically ambiguous rhythm pattern biases people to perceive the meter as a march or as a waltz, respectively. Riggle (this volume) claims that we postulate an "innate", "specialized brain unit" for beat perception that is "directly" influenced by vestibular input. In fact, to the contrary, we argue that experience likely plays a large role in the development of rhythmic auditory-movement interactions, and that rhythmic processing in the brain is widely distributed and includes subcortical and cortical areas involved in sound processing and movement. Further, we argue that vestibular and auditory information are integrated at various subcortical and cortical levels along with input from other sensory modalities, and it is not clear which levels are most important for rhythm processing or, indeed, what a "direct" influence of vestibular input would mean. Finally, we argue that vestibular input to sound location mechanisms may be involved, but likely cannot explain the influence of vestibular input on the perception of auditory rhythm. This remains an empirical question for future research.

  17. Integrated Reporting Information System -

    Data.gov (United States)

    Department of Transportation — The Integrated Reporting Information System (IRIS) is a flexible and scalable web-based system that supports post operational analysis and evaluation of the National...

  18. Theoretical information reuse and integration

    CERN Document Server

    Rubin, Stuart

    2016-01-01

    Information Reuse and Integration addresses the efficient extension and creation of knowledge through the exploitation of Kolmogorov complexity in the extraction and application of domain symmetry. Knowledge that seems to be novel can more often than not be recast as the image of a sequence of transformations that yield symmetric knowledge. When the size of those transformations and/or the length of that sequence of transforms exceeds the size of the image, then that image is said to be novel or random. It may also be that the new knowledge is random in that no sequence of transforms that produces it exists, or at least none is known. The nine chapters comprising this volume incorporate symmetry, reuse, and integration as overt operational procedures or as operations built into the formal representations of data and operators employed. Either way, the aforementioned theoretical underpinnings of information reuse and integration are supported.

  19. Integrated care information technology.

    Science.gov (United States)

    Rowe, Ian; Brimacombe, Phil

    2003-02-21

    Counties Manukau District Health Board (CMDHB) uses information technology (IT) to drive its Integrated Care strategy. IT enables the sharing of relevant health information between care providers. This information sharing is critical to closing the gaps between fragmented areas of the health system. The tragic case of James Whakaruru demonstrates how people have been falling through those gaps. The starting point of the Integrated Care strategic initiative was the transmission of electronic discharges and referral status messages from CMDHB's secondary provider, South Auckland Health (SAH), to GPs in the district. Successful pilots of a Well Child system and a diabetes disease management system embracing primary and secondary providers followed this. The improved information flowing from hospital to GPs now enables GPs to provide better management for their patients. The Well Child system pilot helped improve reported immunization rates in a high health need area from 40% to 90%. The diabetes system pilot helped reduce the proportion of patients with HbA1c >9 from 47% to 16%. IT has been implemented as an integral component of an overall Integrated Care strategic initiative. Within this context, Integrated Care IT has helped to achieve significant improvements in care outcomes, broken down barriers between health system silos, and contributed to the establishment of a system of care continuum that is better for patients.

  20. Information Integration Architecture Development

    OpenAIRE

    Faulkner, Stéphane; Kolp, Manuel; Nguyen, Duy Thai; Coyette, Adrien; Do, Thanh Tung; 16th International Conference on Software Engineering and Knowledge Engineering

    2004-01-01

    Multi-Agent Systems (MAS) architectures are gaining popularity for building open, distributed, and evolving software required by systems such as information integration applications. Unfortunately, despite considerable work in software architecture during the last decade, few research efforts have aimed at truly defining patterns and languages for designing such multiagent architectures. We propose a modern approach based on organizational structures and architectural description lan...

  1. Extracting information from multiplex networks

    Science.gov (United States)

    Iacovacci, Jacopo; Bianconi, Ginestra

    2016-06-01

    Multiplex networks are generalized network structures that are able to describe networks in which the same set of nodes are connected by links that have different connotations. Multiplex networks are ubiquitous since they describe social, financial, engineering, and biological networks as well. Extending our ability to analyze complex networks to multiplex network structures greatly increases the level of information that can be extracted from big data. For these reasons, characterizing the centrality of nodes in multiplex networks and finding new ways to solve challenging inference problems defined on multiplex networks are fundamental questions of network science. In this paper, we discuss the relevance of the Multiplex PageRank algorithm for measuring the centrality of nodes in multilayer networks and we characterize the utility of the recently introduced indicator function Θ̃^S for describing their mesoscale organization and community structure. As working examples for studying these measures, we consider three multiplex network datasets coming from social science.
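
    A toy sketch of one way layer-to-layer coupling can enter a PageRank computation, in the spirit of Multiplex PageRank; the specific biasing scheme below is a simplified assumption, not the exact algorithm of the paper:

    ```python
    import numpy as np

    def col_stochastic(A):
        """Normalize columns so each sums to 1 (dangling columns stay zero;
        teleportation absorbs them in this toy)."""
        s = A.sum(axis=0, keepdims=True)
        s[s == 0] = 1.0
        return A / s

    def pagerank(M, v, alpha=0.85, iters=200):
        """Plain power iteration: M is a column-stochastic transition
        matrix, v the teleportation distribution."""
        x = np.full(M.shape[0], 1.0 / M.shape[0])
        for _ in range(iters):
            x = alpha * M @ x + (1 - alpha) * v
        return x / x.sum()

    rng = np.random.default_rng(1)
    A = (rng.random((5, 5)) < 0.4).astype(float)   # layer A (e.g. friendship)
    B = (rng.random((5, 5)) < 0.4).astype(float)   # layer B (e.g. collaboration)

    xA = pagerank(col_stochastic(A), np.full(5, 0.2))
    # Couple the layers: on layer B the walker preferentially moves to (and
    # teleports to) nodes that are already central on layer A.
    xB = pagerank(col_stochastic(B * xA[:, None]), xA)
    print(xB)
    ```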

  2. Transductive Pattern Learning for Information Extraction

    National Research Council Canada - National Science Library

    McLernon, Brian; Kushmerick, Nicholas

    2006-01-01

    .... We present TPLEX, a semi-supervised learning algorithm for information extraction that can acquire extraction patterns from a small amount of labelled text in conjunction with a large amount of unlabelled text...

  3. Feature extraction for dynamic integration of classifiers

    NARCIS (Netherlands)

    Pechenizkiy, M.; Tsymbal, A.; Puuronen, S.; Patterson, D.W.

    2007-01-01

    Recent research has shown the integration of multiple classifiers to be one of the most important directions in machine learning and data mining. In this paper, we present an algorithm for the dynamic integration of classifiers in the space of extracted features (FEDIC). It is based on the technique

  4. Rule-based Information Integration

    NARCIS (Netherlands)

    de Keijzer, Ander; van Keulen, Maurice

    2005-01-01

    In this report, we show the process of information integration. We specifically discuss the language used for integration. We show that integration consists of two phases, the schema mapping phase and the data integration phase. We formally define transformation rules, conversion, evolution and

  5. Integrated Compliance Information System (ICIS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The purpose of ICIS is to meet evolving Enforcement and Compliance business needs for EPA and State users by integrating information into a single integrated data...

  6. Information Integration Technology Demonstration (IITD)

    National Research Council Canada - National Science Library

    Loe, Richard

    2001-01-01

    The objectives of the Information Integration Technology Demonstration (IITD) were to investigate, design a software architecture and demonstrate a capability to display intelligence data from multiple disciplines...

  7. Probabilistic XML in Information Integration

    NARCIS (Netherlands)

    de Keijzer, Ander; Shim, J.; Casati, F.

    2006-01-01

    Information integration is a difficult research problem. In an ambient environment, where devices can connect and disconnect arbitrarily, the problem only increases, because data sources may become available at any time, but can also disappear. In such an environment, information integration needs

  8. Information Extraction for Social Media

    NARCIS (Netherlands)

    Habib, M. B.; Keulen, M. van

    2014-01-01

    The rapid growth in IT in the last two decades has led to a growth in the amount of information available online. A new style for sharing information is social media, a continuously and instantly updated source of information. In this position paper, we propose a framework for

  9. Integrated Risk Information System (IRIS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — EPA?s Integrated Risk Information System (IRIS) is a compilation of electronic reports on specific substances found in the environment and their potential to cause...

  10. Information Extraction From Chemical Patents

    Directory of Open Access Journals (Sweden)

    Sandra Bergmann

    2012-01-01

    The development of new chemicals or pharmaceuticals is preceded by an in-depth analysis of published patents in this field. This information retrieval is a costly and time-inefficient step when done by a human reader, yet it is mandatory for the potential success of an investment. The goal of the research project UIMA-HPC is to automate and hence speed up the process of knowledge mining about patents. Multi-threaded analysis engines, developed according to UIMA (Unstructured Information Management Architecture) standards, process texts and images in thousands of documents in parallel. UNICORE (UNiform Interface to COmputing Resources) workflow control structures make it possible to dynamically allocate resources for every given task to gain the best CPU-time/real-time ratios in an HPC environment.
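
    The UIMA/UNICORE stack is not reproduced here; the sketch below only illustrates the underlying pattern of running an analysis engine over many patent documents in parallel, with a toy regex standing in for a real chemical annotator:

    ```python
    import re
    from concurrent.futures import ProcessPoolExecutor

    def analysis_engine(doc: str) -> dict:
        """Stand-in for one analysis engine: pull chemical-formula-looking
        tokens out of a patent text (a toy regex, not a real chemistry NER)."""
        formulas = re.findall(r"\b(?:[A-Z][a-z]?\d*){2,}\b", doc)
        return {"n_chars": len(doc), "formula_candidates": formulas}

    if __name__ == "__main__":
        docs = ["... H2SO4 is added to the NaCl solution ...",
                "a pharmaceutical formulation comprising ..."]
        with ProcessPoolExecutor() as pool:   # documents analysed in parallel
            print(list(pool.map(analysis_engine, docs)))
    ```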

  11. Integrated risk information system (IRIS)

    Energy Technology Data Exchange (ETDEWEB)

    Tuxen, L. [Environmental Protection Agency, Washington, DC (United States)

    1990-12-31

    The Integrated Risk Information System (IRIS) is an electronic information system developed by the US Environmental Protection Agency (EPA) containing information related to health risk assessment. IRIS is the Agency's primary vehicle for communication of chronic health hazard information that represents Agency consensus following comprehensive review by intra-Agency work groups. The original purpose for developing IRIS was to provide guidance to EPA personnel in making risk management decisions. This role has expanded and evolved with wider access and use of the system. IRIS contains chemical-specific information in summary format for approximately 500 chemicals. IRIS is available to the general public on the National Library of Medicine's Toxicology Data Network (TOXNET) and on diskettes through the National Technical Information Service (NTIS).

  12. Multi-Filter String Matching and Human-Centric Entity Matching for Information Extraction

    Science.gov (United States)

    Sun, Chong

    2012-01-01

    More and more information is being generated in text documents, such as Web pages, emails and blogs. To effectively manage this unstructured information, one broadly used approach includes locating relevant content in documents, extracting structured information and integrating the extracted information for querying, mining or further analysis. In…

  13. Extracting Information from Multimedia Meeting Collections

    OpenAIRE

    Gatica-Perez, Daniel; Zhang, Dong; Bengio, Samy

    2005-01-01

    Multimedia meeting collections, composed of unedited audio and video streams, handwritten notes, slides, and electronic documents that jointly constitute a raw record of complex human interaction processes in the workplace, have attracted interest due to the increasing feasibility of recording them in large quantities, the opportunities for information access and retrieval applications derived from the automatic extraction of relevant meeting information, and the challenges that the ext...

  14. DKIE: Open Source Information Extraction for Danish

    DEFF Research Database (Denmark)

    Derczynski, Leon; Field, Camilla Vilhelmsen; Bøgh, Kenneth Sejdenfaden

    2014-01-01

    Danish is a major Scandinavian language spoken daily by around six million people. However, it lacks a unified, open set of NLP tools. This demonstration will introduce DKIE, an extensible open-source toolkit for processing Danish text. We implement an information extraction architecture for Danish...

  15. INTEGRATED INFORMATION SYSTEM ARCHITECTURE PROVIDING BEHAVIORAL FEATURE

    Directory of Open Access Journals (Sweden)

    Vladimir N. Shvedenko

    2016-11-01

    The paper deals with the creation of an integrated information system architecture capable of supporting management decisions using behavioral features, and considers the architecture of an information decision support system for production system management. The behavioral feature given to the information system ensures extraction and processing of information and management decision-making, with both automated and automatic modes of the decision-making subsystem being permitted. Practical implementation of an information system with behavior is based on service-oriented architecture: a set of independent services in the information system provides data from its subsystems, or data processing by a separate application, under the chosen variant of settling the problematic situation. For the creation of an integrated information system with behavior we propose an architecture including the following subsystems: a data bus, a subsystem for interaction with the integrated applications based on metadata, a business process management subsystem, a subsystem for analysis of the current state of the enterprise and management decision-making, and a behavior training subsystem. For each problematic situation a separate logical-layer service is created in the Unified Service Bus handling problematic situations. This architecture reduces the information complexity of the system because, with a constant number of system elements, the number of links decreases, since each layer provides a communication center of responsibility for the resource with the services of corresponding applications. If a similar problematic situation occurs, its resolution is automatically retrieved from the problem-situation metamodel repository together with the business process metamodel of its settlement. During business process execution, commands are generated to the corresponding centers of responsibility to settle the problematic situation.

  16. Integrated Phoneme Subspace Method for Speech Feature Extraction

    Directory of Open Access Journals (Sweden)

    Park Hyunsin

    2009-01-01

    Speech feature extraction has been a key focus in robust speech recognition research. In this work, we discuss data-driven linear feature transformations applied to feature vectors in the logarithmic mel-frequency filter bank domain. Transformations are based on principal component analysis (PCA), independent component analysis (ICA), and linear discriminant analysis (LDA). Furthermore, this paper introduces a new feature extraction technique that collects the correlation information among phoneme subspaces and reconstructs the feature space to represent phonemic information efficiently. The proposed speech feature vector is generated by projecting an observed vector onto an integrated phoneme subspace (IPS) based on PCA or ICA. The performance of the new feature was evaluated for isolated word speech recognition. The proposed method provided higher recognition accuracy than conventional methods in clean and reverberant environments.
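
    A rough sketch of the IPS construction under the PCA option: one subspace per phoneme, stacked into a single projection matrix. The dimensions, toy data, and plain stacking are assumptions; the paper's exact construction (e.g. subspace weighting, the ICA variant) may differ:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    # Toy log-mel feature vectors (24-dim) grouped by phoneme label.
    frames = {ph: rng.normal(size=(200, 24)) + i for i, ph in enumerate("aiu")}

    bases = []
    for ph, X in frames.items():
        bases.append(PCA(n_components=4).fit(X).components_)  # (4, 24) basis
    W = np.vstack(bases)     # (12, 24) integrated phoneme subspace projection

    x = rng.normal(size=24)  # one observed frame
    ips_feature = W @ x      # proposed feature: projection onto the IPS
    print(ips_feature.shape)
    ```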

  17. Unsupervised information extraction by text segmentation

    CERN Document Server

    Cortez, Eli

    2013-01-01

    A new unsupervised approach to the problem of Information Extraction by Text Segmentation (IETS) is proposed, implemented and evaluated herein. The authors' approach relies on information available on pre-existing data to learn how to associate segments in the input string with attributes of a given domain relying on a very effective set of content-based features. The effectiveness of the content-based features is also exploited to directly learn from test data structure-based features, with no previous human-driven training, a feature unique to the presented approach. Based on the approach, a

  18. Extracting the information backbone in online system.

    Science.gov (United States)

    Zhang, Qian-Ming; Zeng, An; Shang, Ming-Sheng

    2013-01-01

    Information overload is a serious problem in modern society and many solutions such as recommender system have been proposed to filter out irrelevant information. In the literature, researchers have been mainly dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms while they have overlooked the influence of topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such "less can be more" feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both of their effectiveness and efficiency.
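
    A loose paraphrase of the hybrid pruning idea in code, assuming a bipartite user-object edge list with timestamps; the combined recency/redundancy score is an illustrative assumption, not the paper's algorithm:

    ```python
    import numpy as np

    def extract_backbone(edges, times, keep=0.7):
        """Score each user-object link by recency (time-aware) and penalize
        links to already-popular objects as redundant (topology-aware), then
        keep the top fraction as the information backbone."""
        objects = edges[:, 1]
        popularity = np.bincount(objects)[objects]
        recency = (times - times.min()) / (np.ptp(times) + 1e-12)
        score = recency - np.log(popularity)
        order = np.argsort(-score)
        return edges[order[: int(keep * len(edges))]]

    edges = np.array([[0, 0], [0, 1], [1, 1], [2, 2], [2, 1]])  # (user, object)
    times = np.array([5.0, 1.0, 4.0, 2.0, 3.0])
    print(extract_backbone(edges, times, keep=0.6))
    ```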

  19. Information Security and Integrity Systems

    Science.gov (United States)

    1990-01-01

    Viewgraphs from the Information Security and Integrity Systems seminar held at the University of Houston-Clear Lake on May 15-16, 1990 are presented. A tutorial on computer security is presented. The goals of this tutorial are the following: to review security requirements imposed by government and by common sense; to examine risk analysis methods to help keep sight of the forest while in the trees; to discuss the current hot topic of viruses (which will stay hot); to examine network security, now and over the next year to 30 years; to give a brief overview of encryption; to review protection methods in operating systems; to review database security problems; to review the Trusted Computer System Evaluation Criteria (Orange Book); to comment on formal verification methods; to consider new approaches (like intrusion detection and biometrics); to review the old, low-tech, and still good solutions; and to give pointers to the literature and to where to get help. Other topics covered include security in software applications and development; risk management; trust: formal methods and associated techniques; secure distributed operating systems and verification; trusted Ada; a conceptual model for supporting a B3+ dynamic multilevel security and integrity in the Ada runtime environment; and information intelligence sciences.

  20. A New Multi-Sensor Track Fusion Architecture for Multi-Sensor Information Integration

    National Research Council Canada - National Science Library

    Jean, Buddy H; Younker, John; Hung, Chih-Cheng

    2004-01-01

    .... This new technology will integrate multi-sensor information and extract integrated multi-sensor information to detect, track and identify multiple targets at any time, in any place under all weather conditions...

  1. Information extraction from muon radiography data

    International Nuclear Information System (INIS)

    Borozdin, K.N.; Asaki, T.J.; Chartrand, R.; Hengartner, N.W.; Hogan, G.E.; Morris, C.L.; Priedhorsky, W.C.; Schirato, R.C.; Schultz, L.J.; Sottile, M.J.; Vixie, K.R.; Wohlberg, B.E.; Blanpied, G.

    2004-01-01

    Scattering muon radiography was proposed recently as a technique for detection and 3-D imaging of dense high-Z objects. High-energy cosmic ray muons are deflected in matter in the process of multiple Coulomb scattering. By measuring the deflection angles we are able to reconstruct the configuration of high-Z material in the object. We discuss methods for information extraction from muon radiography data. Tomographic methods widely used in medical imaging have been applied to this specific muon radiography information source. An alternative simple technique based on counting highly scattered muons in voxels appears efficient in many simulated scenes. SVM-based classifiers and clustering algorithms may allow detection of compact high-Z objects without full image reconstruction. The efficiency of muon radiography can be increased using additional information sources, such as momentum estimation, stopping-power measurement, and detection of muonic atom emission.
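
    A sketch of the simple voxel-counting technique mentioned above, assuming each muon's point of closest approach (PoCA) and scattering angle have already been reconstructed upstream:

    ```python
    import numpy as np

    def high_scatter_voxel_counts(poca_xyz, angles_mrad, grid=(32, 32, 32),
                                  bounds=(-1.0, 1.0), thresh_mrad=20.0):
        """Bin each muon's point of closest approach into a voxel, counting
        only muons whose scattering angle exceeds a threshold; voxels holding
        dense high-Z material accumulate high counts."""
        sel = poca_xyz[angles_mrad > thresh_mrad]
        counts, _ = np.histogramdd(sel, bins=grid, range=[bounds] * 3)
        return counts

    rng = np.random.default_rng(0)
    poca = rng.uniform(-1, 1, (50_000, 3))   # stand-in reconstructed PoCAs
    angles = rng.exponential(8.0, 50_000)    # stand-in scattering angles (mrad)
    print(high_scatter_voxel_counts(poca, angles).max())
    ```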

  4. Chaotic spectra: How to extract dynamic information

    International Nuclear Information System (INIS)

    Taylor, H.S.; Gomez Llorente, J.M.; Zakrzewski, J.; Kulander, K.C.

    1988-10-01

    Nonlinear dynamics is applied to chaotic unassignable atomic and molecular spectra with the aim of extracting detailed information about regular dynamic motions that exist over short intervals of time. It is shown how this motion can be extracted from high resolution spectra by doing low resolution studies or by Fourier transforming limited regions of the spectrum. These motions mimic those of periodic orbits (PO) and are inserts into the dominant chaotic motion. Considering these inserts and the PO as a dynamically decoupled region of space, resonant scattering theory and stabilization methods enable us to compute ladders of resonant states which interact with the chaotic quasi-continuum computed in principle from basis sets placed off the PO. The interaction of the resonances with the quasicontinuum explains the low resolution spectra seen in such experiments. It also allows one to associate low resolution features with a particular PO. The motion on the PO thereby supplies the molecular movements whose quantization causes the low resolution spectra. Characteristic properties of the periodic orbit based resonances are discussed. The method is illustrated on the photoabsorption spectrum of the hydrogen atom in a strong magnetic field and on the photodissociation spectrum of H3+. Other molecular systems which are currently under investigation using this formalism are also mentioned. 53 refs., 10 figs., 2 tabs

  5. Extraction of quantifiable information from complex systems

    CERN Document Server

    Dahmen, Wolfgang; Griebel, Michael; Hackbusch, Wolfgang; Ritter, Klaus; Schneider, Reinhold; Schwab, Christoph; Yserentant, Harry

    2014-01-01

    In April 2007, the  Deutsche Forschungsgemeinschaft (DFG) approved the  Priority Program 1324 “Mathematical Methods for Extracting Quantifiable Information from Complex Systems.” This volume presents a comprehensive overview of the most important results obtained over the course of the program.   Mathematical models of complex systems provide the foundation for further technological developments in science, engineering and computational finance.  Motivated by the trend toward steadily increasing computer power, ever more realistic models have been developed in recent years. These models have also become increasingly complex, and their numerical treatment poses serious challenges.   Recent developments in mathematics suggest that, in the long run, much more powerful numerical solution strategies could be derived if the interconnections between the different fields of research were systematically exploited at a conceptual level. Accordingly, a deeper understanding of the mathematical foundations as w...

  6. Extraction of temporal information in functional MRI

    Science.gov (United States)

    Singh, M.; Sungkarat, W.; Jeong, Jeong-Won; Zhou, Yongxia

    2002-10-01

    The temporal resolution of functional MRI (fMRI) is limited by the shape of the haemodynamic response function (hrf) and the vascular architecture underlying the activated regions. Typically, the temporal resolution of fMRI is on the order of 1 s. We have developed a new data processing approach to extract temporal information on a pixel-by-pixel basis at the level of 100 ms from fMRI data. Instead of correlating or fitting the time-course of each pixel to a single reference function, which is the common practice in fMRI, we correlate each pixel's time-course to a series of reference functions that are shifted with respect to each other by 100 ms. The reference function yielding the highest correlation coefficient for a pixel is then used as a time marker for that pixel. A Monte Carlo simulation and experimental study of this approach were performed to estimate the temporal resolution as a function of signal-to-noise ratio (SNR) in the time-course of a pixel. Assuming a known and stationary hrf, the simulation and experimental studies suggest a lower limit in the temporal resolution of approximately 100 ms at an SNR of 3. The multireference function approach was also applied to extract timing information from an event-related motor movement study where the subjects flexed a finger on cue. The event was repeated 19 times with the event's presentation staggered to yield an approximately 100-ms temporal sampling of the haemodynamic response over the entire presentation cycle. The timing differences among different regions of the brain activated by the motor task were clearly visualized and quantified by this method. The results suggest that it is possible to achieve a temporal resolution of ~200 ms in practice with this approach.
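
    A compact sketch of the multireference-function approach: correlate a pixel's time-course with a bank of reference functions shifted in 100-ms steps and take the best-correlating shift as that pixel's timing marker. The gamma-like toy hrf is an assumption for demonstration:

    ```python
    import numpy as np

    def best_onset(ts, make_reference, shifts_ms=range(0, 1000, 100)):
        """Correlate one pixel's time-course with reference functions shifted
        in 100-ms steps; the best-correlating shift is the timing marker."""
        best, best_r = None, -np.inf
        for ms in shifts_ms:
            r = np.corrcoef(ts, make_reference(ms / 1000.0))[0, 1]
            if r > best_r:
                best, best_r = ms, r
        return best, best_r

    t = np.arange(0, 20, 1.0)                   # TR = 1 s sampling
    hrf = lambda d: (t - d).clip(0) ** 5 * np.exp(-(t - d).clip(0))
    ts = hrf(0.3) + 0.1 * np.random.default_rng(0).normal(size=t.size)
    print(best_onset(ts, hrf))                  # expect a shift near 300 ms
    ```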

  7. Optical Aperture Synthesis Object's Information Extracting Based on Wavelet Denoising

    International Nuclear Information System (INIS)

    Fan, W J; Lu, Y

    2006-01-01

    Wavelet denoising is studied to improve the extraction of an OAS (optical aperture synthesis) object's Fourier information. Translation-invariant wavelet denoising based on Donoho's wavelet soft-threshold denoising is investigated to remove the pseudo-Gibbs artifacts in soft-thresholded images, and OAS object information extraction based on translation-invariant wavelet denoising is studied. The study shows that wavelet threshold denoising can improve the precision and repeatability of extracting object information from interferograms, and that translation-invariant denoising performs better than plain soft-threshold denoising.
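
    A sketch of translation-invariant soft-threshold denoising via cycle spinning, using the PyWavelets package; the MAD noise estimate and universal threshold are standard textbook choices assumed here, not necessarily the paper's exact settings:

    ```python
    import numpy as np
    import pywt

    def ti_soft_denoise(x, wavelet="db4", level=4, shifts=8):
        """Cycle-spinning (translation-invariant) version of Donoho's soft
        thresholding: denoise circularly shifted copies, unshift, and
        average, which suppresses the pseudo-Gibbs oscillations that plain
        soft thresholding leaves near sharp features."""
        d1 = pywt.wavedec(x, wavelet, level=1)[-1]
        sigma = np.median(np.abs(d1)) / 0.6745        # MAD noise estimate
        thresh = sigma * np.sqrt(2 * np.log(x.size))  # universal threshold
        out = np.zeros(x.size)
        for s in range(shifts):
            coeffs = pywt.wavedec(np.roll(x, s), wavelet, level=level)
            coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                                    for c in coeffs[1:]]
            out += np.roll(pywt.waverec(coeffs, wavelet)[: x.size], -s)
        return out / shifts

    x = np.sign(np.sin(np.linspace(0, 6, 1024)))      # blocky test signal
    noisy = x + 0.2 * np.random.default_rng(0).normal(size=x.size)
    print(np.mean((ti_soft_denoise(noisy) - x) ** 2))
    ```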

  8. Information Integration; The process of integration, evolution and versioning

    NARCIS (Netherlands)

    de Keijzer, Ander; van Keulen, Maurice

    2005-01-01

    At present, many information sources are available wherever you are. Most of the time, the information needed is spread across several of those information sources. Gathering this information is a tedious and time-consuming job. Automating this process would assist the user in this task. Integration

  9. Respiratory Information Extraction from Electrocardiogram Signals

    KAUST Repository

    Amin, Gamal El Din Fathy

    2010-12-01

    The Electrocardiogram (ECG) is a tool measuring the electrical activity of the heart, and it is extensively used for diagnosis and monitoring of heart diseases. The ECG signal reflects not only the heart activity but also many other physiological processes. The respiratory activity is a prominent process that affects the ECG signal due to the close proximity of the heart and the lungs. In this thesis, several methods for the extraction of respiratory process information from the ECG signal are presented. These methods allow an estimation of the lung volume and the lung pressure from the ECG signal. The potential benefit of this is to eliminate the corresponding sensors used to measure the respiration activity. A reduction of the number of sensors connected to patients will increase patients' comfort and reduce the costs associated with healthcare. As a further result, the efficiency of diagnosing respirational disorders will increase since the respiration activity can be monitored with a common, widely available method. The developed methods can also improve the detection of respirational disorders that occur while patients are sleeping. Such disorders are commonly diagnosed in sleep laboratories where the patients are connected to a number of different sensors. Any reduction of these sensors will result in a more natural sleeping environment for the patients and hence a higher sensitivity of the diagnosis.
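
    One classic way to recover a respiratory surrogate from the ECG, sketched with scipy; the thesis develops several estimators (including lung volume and pressure), and this R-peak-amplitude approach is only an assumed representative of the general idea:

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def edr_from_ecg(ecg, fs):
        """ECG-derived respiration: breathing modulates R-peak amplitudes,
        so the interpolated R-peak amplitude series is a crude surrogate
        for the respiratory waveform."""
        peaks, _ = find_peaks(ecg, distance=int(0.4 * fs),
                              height=np.percentile(ecg, 90))
        t = np.arange(ecg.size) / fs
        return np.interp(t, t[peaks], ecg[peaks])   # amplitude envelope

    fs = 250
    t = np.arange(0, 30, 1 / fs)
    resp = 0.2 * np.sin(2 * np.pi * 0.25 * t)              # 15 breaths/min
    ecg = (1 + resp) * np.sin(2 * np.pi * 1.2 * t) ** 63   # crude 72-bpm spikes
    print(edr_from_ecg(ecg, fs)[:5])
    ```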

  10. Evolving spectral transformations for multitemporal information extraction using evolutionary computation

    Science.gov (United States)

    Momm, Henrique; Easson, Greg

    2011-01-01

    Remote sensing plays an important role in assessing temporal changes in land features. The challenge often resides in the conversion of large quantities of raw data into actionable information in a timely and cost-effective fashion. To address this issue, research was undertaken to develop an innovative methodology integrating biologically-inspired algorithms with standard image classification algorithms to improve information extraction from multitemporal imagery. Genetic programming was used as the optimization engine to evolve feature-specific candidate solutions in the form of nonlinear mathematical expressions of the image spectral channels (spectral indices). The temporal generalization capability of the proposed system was evaluated by addressing the task of building rooftop identification from a set of images acquired at different dates in a cross-validation approach. The proposed system generates robust solutions (kappa values > 0.75 for stage 1 and > 0.4 for stage 2) despite the statistical differences between the scenes caused by land use and land cover changes coupled with variable environmental conditions, and the lack of radiometric calibration between images. Based on our results, the use of nonlinear spectral indices enhanced the spectral differences between features improving the clustering capability of standard classifiers and providing an alternative solution for multitemporal information extraction.
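
    A sketch of how one evolved candidate index might be scored during the search, with Cohen's kappa as the agreement measure quoted in the abstract; the clustering step and the toy candidate expression are assumptions about the fitness pipeline, and the GP engine itself is omitted:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import cohen_kappa_score

    def index_fitness(expr, bands, labels):
        """Fitness of one candidate spectral index: apply the evolved
        expression to the bands, cluster the index values into two classes,
        and score agreement with reference labels by Cohen's kappa."""
        idx = expr(bands).reshape(-1, 1)
        pred = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(idx)
        # kappa is sensitive to label permutation; take the better assignment
        return max(cohen_kappa_score(labels, pred),
                   cohen_kappa_score(labels, 1 - pred))

    bands = np.random.default_rng(0).random((3, 1000))        # R, G, B pixels
    labels = (bands[0] / (bands[2] + 0.1) > 1.0).astype(int)  # toy reference
    print(index_fitness(lambda b: b[0] / (b[2] + 0.1), bands, labels))
    ```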

  11. Anomaly extraction from the path integral

    International Nuclear Information System (INIS)

    Christos, G.A.

    1983-01-01

    Fujikawa's recently proposed derivation of the anomaly from the path integral is examined. It is attempted to give a better understanding of his work. In particular, evasions of his result are discussed; for example it is shown how chiral U(1) axial invariance can be maintained by employing a gauge variant regularization prescription. A brief connection with the point-splitting method is also made. (orig.)

  12. Knowledge discovery: Extracting usable information from large amounts of data

    International Nuclear Information System (INIS)

    Whiteson, R.

    1998-01-01

    The threat of nuclear weapons proliferation is a problem of worldwide concern. Safeguards are the key to nuclear nonproliferation, and data is the key to safeguards. The safeguards community has access to a huge and steadily growing volume of data. The advantages of this data-rich environment are obvious: there is a great deal of information which can be utilized. The challenge is to effectively apply proven and developing technologies to find and extract usable information from that data. That information must then be assessed and evaluated to produce the knowledge needed for crucial decision making. Efficient and effective analysis of safeguards data will depend on utilizing technologies to interpret the large, heterogeneous data sets that are available from diverse sources. With an order-of-magnitude increase in the amount of data from a wide variety of technical, textual, and historical sources there is a vital need to apply advanced computer technologies to support all-source analysis. There are techniques of data warehousing, data mining, and data analysis that can provide analysts with tools that will expedite the extraction of usable information from the huge amounts of data to which they have access. Computerized tools can aid analysts by integrating heterogeneous data, evaluating diverse data streams, automating retrieval of database information, prioritizing inputs, reconciling conflicting data, doing preliminary interpretations, discovering patterns or trends in data, and automating some of the simpler prescreening tasks that are time consuming and tedious. Thus knowledge discovery technologies can provide a foundation of support for the analyst. Rather than spending time sifting through often irrelevant information, analysts could use their specialized skills in a focused, productive fashion. This would allow them to make their analytical judgments with more confidence and spend more of their time doing what they do best.

  13. Sample-based XPath Ranking for Web Information Extraction

    NARCIS (Netherlands)

    Jundt, Oliver; van Keulen, Maurice

    Web information extraction typically relies on a wrapper, i.e., program code or a configuration that specifies how to extract some information from web pages at a specific website. Manually creating and maintaining wrappers is a cumbersome and error-prone task. It may even be prohibitive as some

  14. Development of the Integrated Information Technology System

    National Research Council Canada - National Science Library

    2005-01-01

    The Integrated Medical Information Technology System (IMITS) Program is focused on implementation of advanced technology solutions that eliminate inefficiencies, increase utilization and improve quality of care for active duty forces...

  15. Pixel extraction based integral imaging with controllable viewing direction

    International Nuclear Information System (INIS)

    Ji, Chao-Chao; Deng, Huan; Wang, Qiong-Hua

    2012-01-01

    We propose pixel-extraction-based integral imaging with a controllable viewing direction. The proposed integral imaging can provide viewers with three-dimensional (3D) images within a very small viewing angle. The viewing angle and the viewing direction of the reconstructed 3D images are controlled by the pixels extracted from an elemental image array. Theoretical analysis and a 3D display experiment of the viewing-direction-controllable integral imaging are carried out, and the experimental results verify the correctness of the theory. A 3D display based on this integral imaging can protect the viewer's privacy and has huge potential for a television to show multiple 3D programs at the same time.

  16. The Agent of extracting Internet Information with Lead Order

    Science.gov (United States)

    Mo, Zan; Huang, Chuliang; Liu, Aijun

    In order to carry out e-commerce better, advanced technologies for accessing business information are urgently needed. An agent is described to deal with the problems of extracting internet information caused by the non-standard and skimble-scamble structure of Chinese websites. The agent includes three modules, each responsible for one stage of the extraction process. A method based on an HTTP tree and a Lead algorithm is proposed to generate a lead order, with which the required web pages can be retrieved easily. How to structure the extracted natural-language information is also discussed.

  17. Integrating Information & Communications Technologies into the Classroom

    Science.gov (United States)

    Tomei, Lawrence, Ed.

    2007-01-01

    "Integrating Information & Communications Technologies Into the Classroom" examines topics critical to business, computer science, and information technology education, such as: school improvement and reform, standards-based technology education programs, data-driven decision making, and strategic technology education planning. This book also…

  18. Advanced integrated solvent extraction and ion exchange systems

    International Nuclear Information System (INIS)

    Horwitz, P.

    1996-01-01

    Advanced integrated solvent extraction (SX) and ion exchange (IX) systems are a series of novel SX and IX processes that extract and recover uranium and transuranics (TRUs) (neptunium, plutonium, americium) and the fission products 90Sr, 99Tc, and 137Cs from acidic high-level liquid waste and that sorb and recover 90Sr, 99Tc, and 137Cs from alkaline supernatant high-level waste. Each system is based on the use of new selective liquid extractants or chromatographic materials. The purpose of the integrated SX and IX processes is to minimize the quantity of waste that must be vitrified and buried in a deep geologic repository by producing raffinates (from SX) and effluent streams (from IX) that will meet the specifications of Class A low-level waste

  19. Cause Information Extraction from Financial Articles Concerning Business Performance

    Science.gov (United States)

    Sakai, Hiroyuki; Masuyama, Shigeru

    We propose a method of extracting cause information from Japanese financial articles concerning business performance. Our method acquires cause information such as "zidousya no uriage ga koutyou" (sales of cars were good). Cause information is useful for investors in selecting companies to invest in. Our method extracts cause information in the form of causal expressions by using statistical information and initial clue expressions automatically. It can extract causal expressions without predetermined patterns or complex hand-written rules, and is expected to be applicable to other tasks of acquiring phrases that have a particular meaning, not limited to cause information. We compared our method with our previous one, originally proposed for extracting phrases concerning traffic accident causes, and experimental results showed that the new method outperforms the previous one.
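
    A sketch of the initial clue-expression step, with English clue phrases standing in for the paper's Japanese ones; the statistical scoring loop that bootstraps new clues is omitted:

    ```python
    import re

    # English stand-ins for the Japanese clue expressions the method seeds with.
    CLUES = ["due to", "because of", "driven by", "owing to"]

    def extract_cause_candidates(sentences):
        """Return the clause following a clue expression as a candidate
        causal expression, e.g. 'strong car sales in Asia' below."""
        out = []
        for s in sentences:
            for clue in CLUES:
                m = re.search(re.escape(clue) + r"\s+(.+?)(?:[,.]|$)", s,
                              flags=re.I)
                if m:
                    out.append(m.group(1))
        return out

    print(extract_cause_candidates(["Profit rose due to strong car sales in Asia."]))
    ```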

  20. MEASURING INFORMATION INTEGRATION MODEL FOR CAD/CMM

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    A CAD/CMM workpiece modeling system based on the IGES file format is proposed. The modeling system is implemented using a new method for labelling the tolerance items of a 3D workpiece, built around the concept of a "feature face". First the CAD data of the workpiece are extracted and recognized automatically. Then a workpiece model is generated as the integration of the pure 3D geometric form with its corresponding inspection items. The principle of workpiece modeling is also presented. Finally, experimental results are shown and the correctness of the model is verified.

  1. Can we replace curation with information extraction software?

    Science.gov (United States)

    Karp, Peter D

    2016-01-01

    Can we use programs for automated or semi-automated information extraction from scientific texts as practical alternatives to professional curation? I show that error rates of current information extraction programs are too high to replace professional curation today. Furthermore, current information extraction programs extract single narrow slivers of information, such as individual protein interactions; they cannot extract the large breadth of information extracted by professional curators for databases such as EcoCyc. They also cannot arbitrate among conflicting statements in the literature as curators can. Therefore, funding agencies should not hobble the curation efforts of existing databases on the assumption that a problem that has stymied Artificial Intelligence researchers for more than 60 years will be solved tomorrow. Semi-automated extraction techniques appear to have significantly more potential based on a review of recent tools that enhance curator productivity. But a full cost-benefit analysis for these tools is lacking. Without such analysis it is possible to expend significant effort developing information-extraction tools that automate small parts of the overall curation workflow without achieving a significant decrease in curation costs.

  2. Mining knowledge from text repositories using information extraction ...

    Indian Academy of Sciences (India)

    Information extraction (IE); text mining; text repositories; knowledge discovery from .... general purpose English words. However ... of precision and recall, as extensive experimentation is required due to lack of public tagged corpora. 4. Mining ...

  3. Mars Target Encyclopedia: Information Extraction for Planetary Science

    Science.gov (United States)

    Wagstaff, K. L.; Francis, R.; Gowda, T.; Lu, Y.; Riloff, E.; Singh, K.

    2017-06-01

    Mars surface targets / and published compositions / Seek and ye will find. We used text mining methods to extract information from LPSC abstracts about the composition of Mars surface targets. Users can search by element, mineral, or target.

  4. Extraction of polycyclic aromatic hydrocarbons from smoked fish using pressurized liquid extraction with integrated fat removal

    DEFF Research Database (Denmark)

    Lund, Mette; Duedahl-Olesen, Lene; Christensen, Jan H.

    2009-01-01

    Quantification of polycyclic aromatic hydrocarbons (PAHs) in smoked fish products often requires multiple clean-up steps to remove fat and other compounds that may interfere with the chemical analysis. We present a novel pressurized liquid extraction (PLE) method that integrates exhaustive...

  5. Social network extraction based on Web: 3. the integrated superficial method

    Science.gov (United States)

    Nasution, M. K. M.; Sitompul, O. S.; Noah, S. A.

    2018-03-01

    The Web as a source of information has become part of social behavior information. Although it involves only the limited information disclosed by search engines in the form of hit counts, snippets, and URL addresses of web pages, the integrated extraction method produces a social network that is not only trustworthy but enriched. Unintegrated extraction methods may produce social networks without explanation, resulting in poor supplemental information, or in networks laden with surmise and consequently unrepresentative social structures. The integrated superficial method, in addition to generating the core social network, also generates an expanded network that reaches the full scope of relation clues, i.e. a number of edges computationally approaching n(n - 1)/2 for n social actors.
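
    The hit-count arithmetic behind such superficial methods can be sketched as below, scoring each of the n(n - 1)/2 actor pairs with a Jaccard-style coefficient; the actor names and counts are invented stand-ins for values a search engine API would return.

        from itertools import combinations

        # Hypothetical hit counts for single actors and for pairs of actors
        # (co-occurrence), standing in for search engine query results.
        hits = {"A. Smith": 4200, "B. Jones": 3100, "C. Lee": 900}
        pair_hits = {("A. Smith", "B. Jones"): 350,
                     ("A. Smith", "C. Lee"): 12,
                     ("B. Jones", "C. Lee"): 95}

        def jaccard(a, b):
            """Relation strength from co-occurrence and individual hit counts."""
            co = pair_hits.get((a, b)) or pair_hits.get((b, a), 0)
            return co / (hits[a] + hits[b] - co)

        # All n(n - 1)/2 candidate edges for n actors:
        for a, b in combinations(hits, 2):
            print(a, "--", b, round(jaccard(a, b), 4))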

  6. Addressing Information Proliferation: Applications of Information Extraction and Text Mining

    Science.gov (United States)

    Li, Jingjing

    2013-01-01

    The advent of the Internet and the ever-increasing capacity of storage media have made it easy to store, deliver, and share enormous volumes of data, leading to a proliferation of information on the Web, in online libraries, on news wires, and almost everywhere in our daily lives. Since our ability to process and absorb this information remains…

  7. Information extraction from multi-institutional radiology reports.

    Science.gov (United States)

    Hassanpour, Saeed; Langlotz, Curtis P

    2016-01-01

    The radiology report is the most important source of clinical imaging information. It documents critical information about the patient's health and the radiologist's interpretation of medical findings. It also communicates information to the referring physicians and records that information for future clinical and research use. Although efforts to structure some radiology report information through predefined templates are beginning to bear fruit, a large portion of radiology report information is entered in free text. The free text format is a major obstacle for rapid extraction and subsequent use of information by clinicians, researchers, and healthcare information systems. This difficulty is due to the ambiguity and subtlety of natural language, complexity of described images, and variations among different radiologists and healthcare organizations. As a result, radiology reports are used only once by the clinician who ordered the study and rarely are used again for research and data mining. In this work, machine learning techniques and a large multi-institutional radiology report repository are used to extract the semantics of the radiology report and overcome the barriers to the re-use of radiology report information in clinical research and other healthcare applications. We describe a machine learning system to annotate radiology reports and extract report contents according to an information model. This information model covers the majority of clinically significant contents in radiology reports and is applicable to a wide variety of radiology study types. Our automated approach uses discriminative sequence classifiers for named-entity recognition to extract and organize clinically significant terms and phrases consistent with the information model. We evaluated our information extraction system on 150 radiology reports from three major healthcare organizations and compared its results to a commonly used non-machine learning information extraction method. We

  8. Risk Informed Structural Systems Integrity Management

    DEFF Research Database (Denmark)

    Nielsen, Michael Havbro Faber

    2017-01-01

    The present paper is predominantly a conceptual contribution with an appraisal of major developments in risk informed structural integrity management for offshore installations, together with a discussion of their merits and the challenges which still lie ahead. The starting point is a selected... overview of research and development contributions which have formed the basis for Risk Based Inspection Planning (RBI) as we know it today. Thereafter an outline of the methodical basis for risk informed structural systems integrity management, i.e. the Bayesian decision analysis, is provided in summary... The main focus here is directed on RBI for offshore facilities subject to fatigue damage. New ideas and methodical frameworks in the area of robustness and resilience modeling of structural systems are then introduced, and it is outlined how these may adequately be utilized to enhance Structural Integrity

  9. Integrated plant information technology design support functionality

    International Nuclear Information System (INIS)

    Kim, Yeon Seung; Kim, Dae Jin; Barber, P. W.; Goland, D.

    1996-06-01

    This technical report was written as a result of the Integrated Plant Information System (IPIS) feasibility study on the CANDU 9 project, which was carried out from January 1994 to March 1994 at AECL (Atomic Energy of Canada Limited) in Canada. From 1987, AECL endeavoured to change its engineering work process from a paper-based to a computer-based process through the CANDU 3 project. Even though AECL obtained good results from computerizing Process Engineering, Instrumentation Control and Electrical Engineering, Mechanical Engineering, Computer Aided Design and Drafting, and the Document Management System, the problem of information isolation and integration remains. In this feasibility study, an IPIS design support functionality guideline was suggested by evaluating current AECL CAE tools, analyzing computer aided engineering tasks and work flow, investigating requirements for implementing integrated computer aided engineering, and describing Korean requirements for future CANDU designs including CANDU 9. 6 figs. (Author)

  10. Integrated plant information technology design support functionality

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yeon Seung; Kim, Dae Jin [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of); Barber, P W; Goland, D [Atomic Energy Canada Ltd., (Canada)

    1996-06-01

    This technical report was written as a result of the Integrated Plant Information System (IPIS) feasibility study on the CANDU 9 project, which was carried out from January 1994 to March 1994 at AECL (Atomic Energy of Canada Limited) in Canada. From 1987, AECL endeavoured to change its engineering work process from a paper-based to a computer-based process through the CANDU 3 project. Even though AECL obtained good results from computerizing Process Engineering, Instrumentation Control and Electrical Engineering, Mechanical Engineering, Computer Aided Design and Drafting, and the Document Management System, the problem of information isolation and integration remains. In this feasibility study, an IPIS design support functionality guideline was suggested by evaluating current AECL CAE tools, analyzing computer aided engineering tasks and work flow, investigating requirements for implementing integrated computer aided engineering, and describing Korean requirements for future CANDU designs including CANDU 9. 6 figs. (Author).

  11. Information delivery manuals to integrate building product information into design

    DEFF Research Database (Denmark)

    Berard, Ole Bengt; Karlshøj, Jan

    2011-01-01

    Despite continuing BIM progress, professionals in the AEC industry often lack the information they need to perform their work. Although this problem could be alleviated by information systems similar to those in other industries, companies struggle to model processes and information needs... them in information systems. BIM implies that objects are bearers of information and logic. The present study has three main aims: (1) to explore IDM's capability to capture all four perspectives, (2) to determine whether an IDM's collaborative methodology is valid for developing standardized processes..., and (3) to ascertain whether IDM's business rules can support the development of information and logic-bearing BIM objects. The research is based on a case study of re-engineering the bidding process for a design-build project to integrate building product manufacturers, subcontractors...

  12. Fine-grained information extraction from German transthoracic echocardiography reports.

    Science.gov (United States)

    Toepfer, Martin; Corovic, Hamo; Fette, Georg; Klügl, Peter; Störk, Stefan; Puppe, Frank

    2015-11-12

    Information extraction techniques that get structured representations out of unstructured data make a large amount of clinically relevant information about patients accessible for semantic applications. These methods typically rely on standardized terminologies that guide this process. Many languages and clinical domains, however, lack appropriate resources and tools, as well as evaluations of their applications, especially if detailed conceptualizations of the domain are required. For instance, German transthoracic echocardiography reports have not been targeted sufficiently before, despite their importance for clinical trials. This work therefore aimed at the development and evaluation of an information extraction component with a fine-grained terminology that enables recognition of almost all relevant information stated in German transthoracic echocardiography reports at the University Hospital of Würzburg. A domain expert validated and iteratively refined an automatically inferred base terminology. The terminology was used by an ontology-driven information extraction system that outputs attribute-value pairs. The final component has been mapped to the central elements of a standardized terminology, and it has been evaluated on documents with different layouts. The final system achieved state-of-the-art precision (micro average .996) and recall (micro average .961) on 100 test documents that represent more than 90 % of all reports. In particular, principal aspects as defined in a standardized external terminology were recognized with F1 = .989 (micro average) and F1 = .963 (macro average). As a result of keyword matching and restrained concept extraction, the system obtained high precision also on unstructured or exceptionally short documents, and on documents with uncommon layout. The developed terminology and the proposed information extraction system allow the extraction of fine-grained information from German semi-structured transthoracic echocardiography reports.
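
    As a rough illustration of terminology-driven attribute-value extraction, the sketch below matches invented English terminology entries against a synthetic report fragment; it is not the Würzburg terminology or system.

        import re

        # Hypothetical terminology: each attribute maps to a pattern that
        # captures its value in free-text findings.
        TERMINOLOGY = {
            "LVEF":        re.compile(r"(?:LVEF|ejection fraction)\D{0,10}(\d{2})\s*%", re.I),
            "aortic_root": re.compile(r"aortic root\D{0,10}(\d{2})\s*mm", re.I),
        }

        def extract(report):
            """Return attribute-value pairs found in a report string."""
            return {attr: m.group(1) for attr, pat in TERMINOLOGY.items()
                    if (m := pat.search(report))}

        print(extract("Normal LV function, ejection fraction 55 %. Aortic root 34 mm."))
        # -> {'LVEF': '55', 'aortic_root': '34'}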

  13. Extraction of Information of Audio-Visual Contents

    Directory of Open Access Journals (Sweden)

    Carlos Aguilar

    2011-10-01

    Full Text Available In this article we show how Channel Theory (Barwise and Seligman, 1997) can be used for modeling the process of information extraction realized by audiences of audio-visual contents. To do this, we rely on the concepts proposed by Channel Theory and, especially, its treatment of representational systems. We then show how the information that an agent is capable of extracting from the content depends on the number of channels he is able to establish between the content and the set of classifications he is able to discriminate. The agent can attempt to extract information through these channels from the totality of the content; however, we discuss the advantages of extracting from its constituents in order to obtain a greater number of informational items that represent it. After showing how the extraction process proceeds for each channel, we propose a method of representing all the informative values an agent can obtain from a content using a matrix constituted by the channels the agent is able to establish on the content (source classifications), and the ones he can understand as individual (destination classifications). We finally show how this representation allows reflecting the evolution of the informative items through the evolution of the audio-visual content.

  14. Curriculum integrated information literacy: a challenge

    DEFF Research Database (Denmark)

    Bønløkke, Mette; Kobow, Else; Kristensen, Anne-Kirstine Østergaard

    2012-01-01

    Information literacy is a competence needed by students and practitioners in the nursing profession. A curriculum-integrated intervention was qualitatively evaluated through focus group interviews with students, lecturers and the university librarian. Information literacy makes sense for students... when it is linked to assignments, timed right, prepared, systematic and continuous. Support is needed to help students understand the meaning of seeking information, to focus their problem and to make them reflect on their search and its results. Feedback on materials used is also asked for...

  15. Semantic Information Extraction of Lanes Based on Onboard Camera Videos

    Science.gov (United States)

    Tang, L.; Deng, T.; Ren, C.

    2018-04-01

    In the field of autonomous driving, the semantic information of lanes is very important. This paper proposes a method for automatically detecting lanes and extracting their semantic information from onboard camera videos. The proposed method first detects lane edges from the grayscale gradient direction and fits them with an improved probabilistic Hough transform; it then uses the vanishing-point principle to calculate the geometric position of each lane, and uses lane characteristics to extract lane semantic information via decision-tree classification. In the experiment, 216 road video images captured by a camera mounted on a moving vehicle were used to detect lanes and extract lane semantic information. The results show that the proposed method can accurately identify lane semantics from video images.
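
    A minimal sketch of the edge-plus-Hough front end described above is given below, using stock OpenCV calls; the parameters, the slope filter, and the input file name are assumptions, and the paper's improved Hough fitting and vanishing-point geometry are not reproduced.

        import cv2
        import numpy as np

        # Edge map from grayscale gradients, then probabilistic Hough fitting.
        gray = cv2.imread("road.jpg", cv2.IMREAD_GRAYSCALE)  # assumed input frame
        edges = cv2.Canny(gray, 50, 150)
        lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                                threshold=40, minLineLength=60, maxLineGap=20)

        if lines is not None:
            for x1, y1, x2, y2 in (line[0] for line in lines):
                slope = (y2 - y1) / (x2 - x1 + 1e-6)
                if abs(slope) > 0.3:  # discard near-horizontal segments
                    print("lane segment:", (x1, y1), (x2, y2))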

  16. Integrating the Supervised Information into Unsupervised Learning

    Directory of Open Access Journals (Sweden)

    Ping Ling

    2013-01-01

    Full Text Available This paper presents an assembled unsupervised learning framework that adopts information coming from the supervised learning process, and gives the corresponding implementation algorithm. The algorithm consists of two phases: first extracting and clustering data representatives (DRs) to obtain labeled training data, and then classifying non-DRs based on the labeled DRs. The implementation algorithm is called SDSN since it employs the tuning-scaled Support vector domain description to collect DRs, uses a spectrum-based method to cluster DRs, and adopts the nearest neighbor classifier to label non-DRs. The validity of the first-phase clustering procedure is analyzed theoretically. A new data-dependent metric is defined in the second phase to allow the nearest neighbor classifier to work with the informed information. A fast training approach for DR extraction is provided for greater efficiency. Experimental results on synthetic and real datasets verify the correctness and performance of the proposed idea, and show that SDSN outperforms the traditional pure clustering procedure in practice.
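
    The two-phase shape of such a framework can be sketched with off-the-shelf components, as below; spectral clustering of a random subset stands in for the SVDD-based representative extraction, and plain Euclidean 1-NN stands in for the paper's data-dependent metric.

        import numpy as np
        from sklearn.cluster import SpectralClustering
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(0)
        # Synthetic two-cluster data standing in for a real dataset.
        X = rng.normal(size=(200, 2)) + np.repeat([[0, 0], [5, 5]], 100, axis=0)

        # Phase 1: pick representatives (random subset as a stand-in for the
        # SVDD-based selection) and cluster them to obtain labels.
        rep_idx = rng.choice(len(X), size=40, replace=False)
        reps = X[rep_idx]
        rep_labels = SpectralClustering(n_clusters=2, random_state=0).fit_predict(reps)

        # Phase 2: label the remaining points with a nearest-neighbour
        # classifier trained on the labelled representatives.
        rest = np.delete(X, rep_idx, axis=0)
        labels = KNeighborsClassifier(n_neighbors=1).fit(reps, rep_labels).predict(rest)
        print(labels[:10])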

  17. Knowledge Dictionary for Information Extraction on the Arabic Text Data

    Directory of Open Access Journals (Sweden)

    Wahyu Jauharis Saputra

    2013-04-01

    Full Text Available Information extraction is an early stage in the analysis of textual data. It is required to obtain information from textual data that can be used for subsequent analysis, such as classification and categorization. Textual data are strongly influenced by language. Arabic is gaining significant attention in many studies because the Arabic language is very different from others and, in contrast to other languages, tools and research on Arabic are still lacking. The information extracted using the knowledge dictionary is a concept of expression. A knowledge dictionary is usually constructed manually by an expert; this takes a long time and is specific to a single problem. This paper proposes a method for automatically building a knowledge dictionary. The dictionary is formed by classifying sentences having the same concept, assuming that they will have a high similarity value. The extracted concepts can be used as features for subsequent computational processes such as classification or categorization. The dataset used in this paper was an Arabic text dataset. The extraction results were tested using a decision tree classifier; the highest precision value obtained was 71.0% and the highest recall value was 75.0%.

  18. Integrated occupational radiation exposure information system

    International Nuclear Information System (INIS)

    Hunt, H.W.

    1983-06-01

    The integrated Occupational Radiation Exposure (ORE) database information system has many advantages. Radiation exposure information is available to operating management in a more timely manner and in a more flexible mode. The ORE system has permitted the integration of scattered files and data into a more cost-effective store that permits easy and simultaneous access by a variety of users with different data needs. The external storage needs of the radiation exposure source documents are several orders of magnitude smaller through the use of the computer-assisted retrieval techniques employed in the ORE system. Groundwork is being laid to automate the historical files, which are maintained to help describe the radiation protection programs and policies at any one point in time. The file unit will be microfilmed for topical indexing on the ORE data base.

  19. FEMA's Integrated Emergency Management Information System (IEMIS)

    International Nuclear Information System (INIS)

    Jaske, R.T.; Meitzler, W.

    1987-01-01

    FEMA is implementing a computerized system for optimizing emergency planning and for supporting exercises of those plans. Called the Integrated Emergency Management Information System (IEMIS), it consists of a base geographic information system upon which analytical models are superimposed in order to load data and report results analytically. At present, it supports FEMA's work in offsite preparedness around nuclear power stations, but it is being developed to deal with the full range of natural and technological accident hazards for which emergency evacuation or population movement is required.

  20. Integrated Engineering Information Technology, FY93 accommplishments

    Energy Technology Data Exchange (ETDEWEB)

    Harris, R.N.; Miller, D.K.; Neugebauer, G.L.; Orona, J.R.; Partridge, R.A.; Herman, J.D.

    1994-03-01

    The Integrated Engineering Information Technology (IEIT) project is providing a comprehensive, easy-to-use computer network solution for communicating with coworkers both inside and outside Sandia National Laboratories. IEIT capabilities include computer networking, electronic mail, mechanical design, and data management. These network-based tools have one fundamental purpose: to help create a concurrent engineering environment that will enable Sandia organizations to excel in today's increasingly competitive business environment.

  1. Ontology-Based Information Extraction for Business Intelligence

    Science.gov (United States)

    Saggion, Horacio; Funk, Adam; Maynard, Diana; Bontcheva, Kalina

    Business Intelligence (BI) requires the acquisition and aggregation of key pieces of knowledge from multiple sources in order to provide valuable information to customers or feed statistical BI models and tools. The massive amount of information available to business analysts makes information extraction and other natural language processing tools key enablers for the acquisition and use of that semantic information. We describe the application of ontology-based extraction and merging in the context of a practical e-business application for the EU MUSING Project where the goal is to gather international company intelligence and country/region information. The results of our experiments so far are very promising and we are now in the process of building a complete end-to-end solution.

  2. Nuclear plants gain integrated information systems

    International Nuclear Information System (INIS)

    Villavicencio-Ramirez, A.; Rodriquez-Alvarez, J.M.

    1994-01-01

    With the objective of simplifying the complex mesh of computing devices employed within nuclear power plants, modern technology and integration techniques are being used to form centralized (but backed-up) databases and distributed processing and display networks. Benefits are immediate as a result of the integration and the use of standards. The use of a unique data acquisition and database subsystem optimizes the high costs of engineering, as this task is done only once for the life span of the system. This also contributes towards a uniform user interface and allows for graceful expansion and maintenance. This article features an integrated information system, the Sistema Integral de Informacion de Proceso (SIIP). The development of this system enabled the Laguna Verde Nuclear Power Plant to make full use of the already existing universe of signals and its related engineering during all plant conditions, namely start-up, normal operation, transient analysis, and emergency operation. Integrated systems offer many advantages over segregated systems, and this experience should benefit similar development efforts in other electric power utilities, not only for nuclear but also for other types of generating plants.

  3. An information integration theory of consciousness

    Directory of Open Access Journals (Sweden)

    Tononi Giulio

    2004-11-01

    Full Text Available Abstract Background Consciousness poses two main problems. The first is understanding the conditions that determine to what extent a system has conscious experience. For instance, why is our consciousness generated by certain parts of our brain, such as the thalamocortical system, and not by other parts, such as the cerebellum? And why are we conscious during wakefulness and much less so during dreamless sleep? The second problem is understanding the conditions that determine what kind of consciousness a system has. For example, why do specific parts of the brain contribute specific qualities to our conscious experience, such as vision and audition? Presentation of the hypothesis This paper presents a theory about what consciousness is and how it can be measured. According to the theory, consciousness corresponds to the capacity of a system to integrate information. This claim is motivated by two key phenomenological properties of consciousness: differentiation – the availability of a very large number of conscious experiences; and integration – the unity of each such experience. The theory states that the quantity of consciousness available to a system can be measured as the Φ value of a complex of elements. Φ is the amount of causally effective information that can be integrated across the informational weakest link of a subset of elements. A complex is a subset of elements with Φ>0 that is not part of a subset of higher Φ. The theory also claims that the quality of consciousness is determined by the informational relationships among the elements of a complex, which are specified by the values of effective information among them. Finally, each particular conscious experience is specified by the value, at any given time, of the variables mediating informational interactions among the elements of a complex. Testing the hypothesis The information integration theory accounts, in a principled manner, for several neurobiological observations

  4. The extraction and integration framework: a two-process account of statistical learning.

    Science.gov (United States)

    Thiessen, Erik D; Kronstein, Alexandra T; Hufnagle, Daniel G

    2013-07-01

    The term statistical learning in infancy research originally referred to sensitivity to transitional probabilities. Subsequent research has demonstrated that statistical learning contributes to infant development in a wide array of domains. The range of statistical learning phenomena necessitates a broader view of the processes underlying statistical learning. Learners are sensitive to a much wider range of statistical information than the conditional relations indexed by transitional probabilities, including distributional and cue-based statistics. We propose a novel framework that unifies learning about all of these kinds of statistical structure. From our perspective, learning about conditional relations outputs discrete representations (such as words). Integration across these discrete representations yields sensitivity to cues and distributional information. To achieve sensitivity to all of these kinds of statistical structure, our framework combines processes that extract segments of the input with processes that compare across these extracted items. In this framework, the items extracted from the input serve as exemplars in long-term memory. The similarity structure of those exemplars in long-term memory leads to the discovery of cues and categorical structure, which guides subsequent extraction. The extraction and integration framework provides a way to explain sensitivity to both conditional statistical structure (such as transitional probabilities) and distributional statistical structure (such as item frequency and variability), and also a framework for thinking about how these different aspects of statistical learning influence each other. © 2013 APA, all rights reserved

  5. Portable blood extraction device integrated with biomedical monitoring system

    Science.gov (United States)

    Khumpuang, S.; Horade, M.; Fujioka, K.; Sugiyama, S.

    2006-01-01

    Painless, portable blood extraction devices have become part of the miniaturization of biomedical research, particularly in the manufacture of point-of-care systems. The fabrication of a blood extraction device integrated with an electrolyte-monitoring system is reported in this paper. The device offers precisely controlled dosing of the extracted blood and only slight damage to blood vessels and nerves. In-house blood diagnostics thus become simple for patients. The main components of the portable system are the blood extraction device and the electrolyte-monitoring system. The monitoring system consists of an ISFET (Ion Selective Field Effect Transistor) for measuring the concentration of minerals in blood. In this work, we measured the levels of three ions: Na+, K+ and Cl-. These ions frequently require measurement, since their concentration levels in the blood can indicate whether the kidney, pancreas, liver or heart is malfunctioning. The fabrication of the whole system and experimentation on each ISM (Ion Sensitive Membrane) are described. Taking advantage of LIGA technology, the 100 hollow microneedles fabricated by synchrotron radiation deep X-ray lithography through the PCT (plane-pattern to cross-section transfer) technique are contained within a 5×5 mm² area. Each microneedle has a base diameter of 300 μm, a pitch of 500 μm, a height of 800 μm and a hole diameter of 50 μm. The total size of the blood extraction device is 2×2×2 cm³. The package is made from a plastic socket including slots for inserting the microneedle array and the ISFET, connected to an electrical circuit for the monitoring. Through a dimensional design for simple handling and the selection of disposable materials, patients can self-evaluate the critical levels of their body minerals anywhere and at any time.

  6. Assessing Extinction Risk: Integrating Genetic Information

    Directory of Open Access Journals (Sweden)

    Jason Dunham

    1999-06-01

    Full Text Available Risks of population extinction have been estimated using a variety of methods incorporating information from different spatial and temporal scales. We briefly consider how several broad classes of extinction risk assessments, including population viability analysis, incidence functions, and ranking methods, integrate information on different temporal and spatial scales. In many circumstances, data from surveys of neutral genetic variability within and among populations can provide information useful for assessing extinction risk. Patterns of genetic variability resulting from past and present ecological and demographic events can indicate risks of extinction that are otherwise difficult to infer from ecological and demographic analyses alone. We provide examples of how patterns of neutral genetic variability, both within and among populations, can be used to corroborate and complement extinction risk assessments.

  7. Integrated Information System for Higher Education Qualifications

    Directory of Open Access Journals (Sweden)

    Catalin Ionut SILVESTRU

    2012-10-01

    Full Text Available In the present article we aim to study thoroughly and in detail aspects related to architectures specific to e-learning and to the management of human resources training, interconnected with the management of qualifications. In addition, we consider combining e-learning architectures with software in an e-learning system interconnected with the National Registry of Qualifications of Higher Education, with a view to developing an information system that correlates the educational supply of Romanian higher education with labor market demands through qualifications. The scientific endeavor consists of original architectural solutions for integrating data, systems, processes and services from various sources and using them in the proposed system. The practical result of the scientific endeavor is the design of the architectures required for developing an e-learning system interconnected with the National Registry of Qualifications of Romania, which in the first stage involves the qualifications provided by higher education. The innovation of the proposed solution is that the information system combines the advantages of a content management system (CMS) with a learning content management system (LCMS) and with reusable learning objects (RLO). Thus, the architecture proposed in the research ensures the integration of a content management system with a portal for information, guidance and support in developing a professional project. The integration enables the correlation of competences with content areas and specific items from various teaching subjects, thus evaluating the usefulness of this registry from a learning/educational perspective. Using the proposed information system enables correlation among qualifications, the content of educational programs and continuous self-evaluation opportunities, which facilitates the monitoring of progress and the adjustment of learning content.

  8. MedTime: a temporal information extraction system for clinical narratives.

    Science.gov (United States)

    Lin, Yu-Kai; Chen, Hsinchun; Brown, Randall A

    2013-12-01

    Temporal information extraction from clinical narratives is of critical importance to many clinical applications. We participated in the EVENT/TIMEX3 track of the 2012 i2b2 clinical temporal relations challenge, and presented our temporal information extraction system, MedTime. MedTime comprises a cascade of rule-based and machine-learning pattern recognition procedures. It achieved a micro-averaged f-measure of 0.88 in the recognition of both clinical events and temporal expressions. We proposed and evaluated three time normalization strategies to normalize relative time expressions in clinical texts. The accuracy was 0.68 in normalizing temporal expressions of dates, times, durations, and frequencies. This study demonstrates and evaluates the integration of rule-based and machine-learning-based approaches for high-performance temporal information extraction from clinical narratives. Copyright © 2013 Elsevier Inc. All rights reserved.
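
    A toy illustration of the rule-based layer of such a system is given below; the regular expressions and the example sentence are invented and far simpler than MedTime's cascade, which also includes machine-learned recognizers and normalization.

        import re

        # Hypothetical patterns for three kinds of temporal expressions in
        # clinical-style text.
        PATTERNS = {
            "DATE":      re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
            "DURATION":  re.compile(r"\bfor (?:\d+|one|two|three) (?:day|week|month)s?\b", re.I),
            "FREQUENCY": re.compile(r"\b(?:b\.?i\.?d\.?|t\.?i\.?d\.?|once daily|twice daily)\b", re.I),
        }

        def tag_timex(text):
            """Return (type, surface form) pairs for matched expressions."""
            return [(kind, m.group()) for kind, pat in PATTERNS.items()
                    for m in pat.finditer(text)]

        print(tag_timex("Started lisinopril twice daily on 3/14/2012 for two weeks."))
        # -> [('DATE', '3/14/2012'), ('DURATION', 'for two weeks'),
        #     ('FREQUENCY', 'twice daily')]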

  9. NAMED ENTITY RECOGNITION FROM BIOMEDICAL TEXT -AN INFORMATION EXTRACTION TASK

    Directory of Open Access Journals (Sweden)

    N. Kanya

    2016-07-01

    Full Text Available Biomedical text mining targets the extraction of significant information from biomedical archives. It encompasses Information Retrieval (IR) and Information Extraction (IE). Information retrieval retrieves the relevant biomedical literature documents from repositories such as PubMed and MEDLINE based on a search query. The IR process ends with the generation of a corpus of the relevant documents retrieved from the publication databases. The IE task includes preprocessing of the documents, Named Entity Recognition (NER), and relationship extraction. This process draws on natural language processing, data mining techniques and machine learning algorithms. The preprocessing task includes tokenization, stop-word removal, shallow parsing, and part-of-speech tagging. The NER phase involves recognition of well-defined objects such as genes, proteins or cell lines. This leads to the next phase, the extraction of relationships (IE). The work was based on the Conditional Random Fields (CRF) machine learning algorithm.
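
    A tiny sketch of CRF-based NER in this spirit is shown below, using the third-party sklearn-crfsuite package (an assumption; any CRF toolkit would serve); the two-sentence corpus and feature set are toy stand-ins for a real annotated corpus.

        import sklearn_crfsuite  # pip install sklearn-crfsuite (assumed available)

        def features(sent, i):
            """Simple token features for position i of a tokenized sentence."""
            w = sent[i]
            return {"word.lower": w.lower(), "is_upper": w.isupper(),
                    "suffix3": w[-3:], "is_first": i == 0}

        # Toy annotated corpus: token sequences with BIO entity labels.
        train = [(["BRCA1", "regulates", "DNA", "repair"], ["B-GENE", "O", "O", "O"]),
                 (["TP53", "mutations", "alter", "p53"],   ["B-GENE", "O", "O", "B-PROT"])]

        X = [[features(s, i) for i in range(len(s))] for s, _ in train]
        y = [labels for _, labels in train]

        crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
        crf.fit(X, y)
        test = ["BRCA2", "repairs", "DNA"]
        print(crf.predict([[features(test, i) for i in range(len(test))]]))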

  10. ACCOUNTING INFORMATION INTEGRATION TROUGH AN ENTERPRISE PORTAL

    Directory of Open Access Journals (Sweden)

    Gianina RIZESCU

    2014-06-01

    Full Text Available If companies lack integrated enterprise software applications, or simply do not use them on a large scale, accounting departments face many difficulties, concerning both the inflexibility in achieving good results and the limited possibilities for communicating these results. Thus, most of the time, accounting departments are limited to generating the predefined reports provided by a software application, and the most they can do is export these reports into Microsoft Excel. Another cause of the late production and publication of accounting information is the lack of data from other departments and their corresponding software applications. That is why, in many enterprises, accounting data become irrelevant to their users. The main goal of this article is to show how accounting can benefit from an integrated software solution, namely an enterprise portal.

  11. Integrate offsites management with information systems

    Energy Technology Data Exchange (ETDEWEB)

    Valleur, M. (TECHNIP, Paris (France))

    1993-11-01

    Computerized offsites management systems in oil refineries offer a unique opportunity to integrate advanced technology into a coherent refinery information system that contributes to benefits-driven optimal operations: from long-term, multirefinery linear programming (LP) models to sequential control of transfer lineups in the tank farm. There are strong incentives to automate and optimize the offsites operations, and benefits can be quantified to justify properly sized projects. The paper discusses the following: business opportunities, oil movement and advanced technology, project scoping and sizing, review of functional requirements, transfer automation, blending optimal control, on-line analyzers, oil movement and scheduling, organizational issues, and investment and benefits analysis.

  12. A Two-Step Resume Information Extraction Algorithm

    Directory of Open Access Journals (Sweden)

    Jie Chen

    2018-01-01

    Full Text Available With the rapid growth of Internet-based recruiting, there are a great number of personal resumes in recruiting systems. To gain more attention from recruiters, most resumes are written in diverse formats, including varying font sizes, font colours, and table cells. However, this diversity of format is harmful to data mining tasks such as resume information extraction, automatic job matching, and candidate ranking. Supervised methods and rule-based methods have been proposed to extract facts from resumes, but they rely strongly on hierarchical structure information and large amounts of labelled data, which are hard to collect in reality. In this paper, we propose a two-step resume information extraction approach. In the first step, the raw text of a resume is segmented into different resume blocks. To achieve this, we design a novel feature, Writing Style, to model sentence syntax information. Besides the word index and punctuation index, word lexical attributes and the prediction results of classifiers are included in Writing Style. In the second step, multiple classifiers are employed to identify the different attributes of fact information in resumes. Experimental results on a real-world dataset show that the algorithm is feasible and effective.

  13. Optimum detection for extracting maximum information from symmetric qubit sets

    International Nuclear Information System (INIS)

    Mizuno, Jun; Fujiwara, Mikio; Sasaki, Masahide; Akiba, Makoto; Kawanishi, Tetsuya; Barnett, Stephen M.

    2002-01-01

    We demonstrate a class of optimum detection strategies for extracting the maximum information from sets of equiprobable real symmetric qubit states of a single photon. These optimum strategies were predicted by Sasaki et al. [Phys. Rev. A 59, 3325 (1999)]. The peculiar aspect is that detections with at least three outputs suffice for optimum extraction of information regardless of the number of signal elements. The cases of ternary (or trine), quinary, and septenary polarization signals are studied, where a standard von Neumann detection (a projection onto a binary orthogonal basis) fails to access the maximum information. Our experiments demonstrate that it is possible with present technologies to attain about 96% of the theoretical limit.

  14. Extracting Semantic Information from Visual Data: A Survey

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2016-03-01

    Full Text Available The traditional environment maps built by mobile robots include both metric ones and topological ones. These maps are navigation-oriented and not adequate for service robots to interact with or serve human users who normally rely on the conceptual knowledge or semantic contents of the environment. Therefore, the construction of semantic maps becomes necessary for building an effective human-robot interface for service robots. This paper reviews recent research and development in the field of visual-based semantic mapping. The main focus is placed on how to extract semantic information from visual data in terms of feature extraction, object/place recognition and semantic representation methods.

  15. Rapid automatic keyword extraction for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J [Richland, WA]; Cowley, Wendy E [Richland, WA]; Crow, Vernon L [Richland, WA]; Cramer, Nicholas O [Richland, WA]

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
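
    The description above is compact enough to sketch directly; in the minimal RAKE-style implementation below, the stop-word list is a tiny illustrative subset and the degree-to-frequency ratio is one choice among the scoring functions the description admits.

        import re
        from collections import defaultdict

        STOP = {"for", "and", "of", "the", "a", "in", "is", "are", "on", "to", "by"}

        def rake(text, top_n=3):
            words = re.split(r"[^a-zA-Z]+", text.lower())
            # Candidate keywords: maximal runs of non-stop words.
            candidates, run = [], []
            for w in words:
                if w and w not in STOP:
                    run.append(w)
                elif run:
                    candidates.append(run); run = []
            if run:
                candidates.append(run)
            # Word scores from co-occurrence degree and frequency.
            freq, degree = defaultdict(int), defaultdict(int)
            for cand in candidates:
                for w in cand:
                    freq[w] += 1
                    degree[w] += len(cand)  # co-occurrence within the candidate
            score = {w: degree[w] / freq[w] for w in freq}
            # Keyword score: sum of member word scores.
            ranked = sorted(candidates, key=lambda c: sum(score[w] for w in c),
                            reverse=True)
            return [" ".join(c) for c in ranked[:top_n]]

        print(rake("Rapid automatic keyword extraction for information retrieval and analysis"))
        # -> ['rapid automatic keyword extraction', 'information retrieval', 'analysis']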

  16. Integrating information for better environmental decisions.

    Energy Technology Data Exchange (ETDEWEB)

    MacDonell, M.; Morgan, K.; Newland, L.; Environmental Assessment; Texas Christian Univ.

    2002-01-01

    As more is learned about the complex nature and extent of environmental impacts from progressive human disturbance, scientists, policy analysts, decision makers, educators, and communicators are increasingly joining forces to develop strategies for preserving and protecting the environment. The Eco-Informa Foundation is an educational scientific organization dedicated to promoting the collaborative development and sharing of scientific information. The Foundation participated in a recent international conference on environmental informatics through a special symposium on integrating information for better environmental decisions. Presentations focused on four general themes: (1) remote sensing and data interpretation, including through new knowledge management tools; (2) risk assessment and communication, including for radioactively contaminated facilities, introduced biological hazards, and food safety; (3) community involvement in cleanup projects; and (4) environmental education. The general context for related issues, methods and applications, and results and recommendations from those discussions are highlighted here.

  17. Representation and Integration of Scientific Information

    Science.gov (United States)

    1998-01-01

    The objective of this Joint Research Interchange with NASA-Ames was to investigate how the Tsimmis technology could be used to represent and integrate scientific information. The main goal of the Tsimmis project is to allow a decision maker to find information of interest from such sources, fuse it, and process it (e.g., summarize it, visualize it, discover trends). Another important goal is the easy incorporation of new sources, as well as the ability to deal with sources whose structure or services evolve. During the Interchange we had research meetings approximately every month or two. The funds provided by NASA supported work that led to the following two papers: Fusion Queries over Internet Databases; Efficient Query Subscription Processing in a Multicast Environment.

  18. Integrated environmental monitoring and information system

    International Nuclear Information System (INIS)

    Klinda, J.; Lieskovska, Z.

    1998-01-01

    The concept of environmental monitoring within the territory of the Slovak Republic and the concept of the integrated environmental information system of the Slovak Republic were accepted and confirmed by Government Order No. 449/1992. The state monitoring system covering the whole territory of Slovakia is the most important and consists of 13 Partial Monitoring Systems (PMSs); a list of the PMSs is included. The listed PMSs are managed according to the concept of the Sectoral Information System (SIS) of the Ministry of the Environment of the Slovak Republic (MESR), which was established by National Council Act No. 261/1995 Coll. on the SIS. The SIS consists of 18 subsystems, which are listed. Overviews of the PMS budgets as well as of the environmental publications and periodicals of the MESR are included.

  19. Robust Vehicle and Traffic Information Extraction for Highway Surveillance

    Directory of Open Access Journals (Sweden)

    Yeh Chia-Hung

    2005-01-01

    Full Text Available A robust vision-based traffic monitoring system for vehicle and traffic information extraction is developed in this research. It is challenging to maintain detection robustness at all times for a highway surveillance system. There are three major problems in detecting and tracking a vehicle: (1) the moving cast shadow effect, (2) the occlusion effect, and (3) nighttime detection. For moving cast shadow elimination, a 2D joint vehicle-shadow model is employed. For occlusion detection, a multiple-camera system is used to detect occlusion so as to extract the exact location of each vehicle. For vehicle nighttime detection, a rear-view monitoring technique is proposed to maintain tracking and detection accuracy. Furthermore, we propose a method to improve the accuracy of background extraction, which usually serves as the first step in any vehicle detection processing. Experimental results are given to demonstrate that the proposed techniques are effective and efficient for vision-based highway surveillance.
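
    For orientation, the sketch below shows a generic background-subtraction front end with shadow labelling, using OpenCV's stock MOG2 subtractor; it is not the paper's improved background extraction or its joint vehicle-shadow model, and the file name and thresholds are assumptions.

        import cv2

        cap = cv2.VideoCapture("traffic.mp4")  # assumed input video
        subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            mask = subtractor.apply(frame)  # 255 = foreground, 127 = shadow
            # Keep only true foreground, discarding shadow-labelled pixels.
            vehicles = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]
            count = cv2.countNonZero(vehicles)
            if count > 500:  # crude presence heuristic, threshold assumed
                print("foreground pixels:", count)
        cap.release()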

  20. Dutch virtual integration of healthcare information.

    Science.gov (United States)

    de Graaf, J C; Vlug, A E; van Boven, G J

    2007-01-01

    As information technology creates opportunities for cooperation that crosses the boundaries between healthcare institutions, it will become an integral part of the Dutch healthcare system. Along with many of the organizations involved in healthcare, the National IT Institute for Healthcare in the Netherlands (NICTIZ) is working on the realization of a national IT infrastructure for healthcare and a national electronic patient record (EPR). An underlying national architecture is designed to enable the Dutch EPR virtually, neither in a national database nor on a patient's smartcard. The required secure infrastructure provides generic functions for healthcare applications: patient identification, and authentication and authorization of healthcare professionals. The first national applications in the EPR program using a national index of where patient data are stored are the electronic medication record and the electronic record for after-hours GP services. Their rollout started in 2007. To guarantee the progress of electronic data exchange in healthcare in the Netherlands, we have primarily opted for these two healthcare applications: the electronic medication record and the electronic record for after-hours GP services. The use of a national switch-point containing the registry of where to find what information guarantees that the professional receives the most recent information, and avoids large databases of downloaded data. Proper authorization and authentication, as well as tracing by the national switch-point, also ensure a secure environment for the communication of sensitive information.

  1. Advanced applications of natural language processing for performing information extraction

    CERN Document Server

    Rodrigues, Mário

    2015-01-01

    This book explains how information extraction (IE) applications can be created that are able to tap the vast amount of relevant information available in natural language sources: Internet pages, official documents such as laws and regulations, books and newspapers, and the social web. Readers are introduced to the problem of IE and its current challenges and limitations, supported with examples. The book discusses the need to fill the gap between documents, data, and people, and provides a broad overview of the technology supporting IE. The authors present a generic architecture for developing systems that are able to learn how to extract relevant information from natural language documents, and illustrate how to implement working systems using state-of-the-art and freely available software tools. The book also discusses concrete applications illustrating IE uses.   · Provides an overview of state-of-the-art technology in information extraction (IE), discussing achievements and limitations for t...

  2. Information and image integration: project spectrum

    Science.gov (United States)

    Blaine, G. James; Jost, R. Gilbert; Martin, Lori; Weiss, David A.; Lehmann, Ron; Fritz, Kevin

    1998-07-01

    The BJC Health System (BJC) and the Washington University School of Medicine (WUSM) formed a technology alliance with industry collaborators to develop and implement an integrated, advanced clinical information system. The industry collaborators include IBM, Kodak, SBC and Motorola. The activity, called Project Spectrum, provides an integrated clinical repository for the multiple hospital facilities of the BJC. The BJC System consists of 12 acute care hospitals serving over one million patients in Missouri and Illinois. An interface engine manages transactions from each of the hospital information systems, lab systems and radiology information systems. Data is normalized to provide a consistent view for the primary care physician. Access to the clinical repository is supported by web-based server/browser technology which delivers patient data to the physician's desktop. An HL7-based messaging system coordinates the acquisition and management of radiological image data and sends image keys to the clinical data repository. Access to the clinical chart browser currently provides radiology reports, laboratory data, vital signs and transcribed medical reports. A chart metaphor provides tabs for the selection of the clinical record for review. Activation of the radiology tab facilitates a standardized view of radiology reports and provides an icon used to initiate retrieval of available radiology images. The selection of the image icon spawns an image browser plug-in and utilizes the image key from the clinical repository to access the image server for the requested image data. The Spectrum system is collecting clinical data from five hospital systems and imaging data from two hospitals. Domain-specific radiology imaging systems support the acquisition and primary interpretation of radiology exams. The Spectrum clinical workstations are deployed to over 200 sites utilizing local area networks and ISDN connectivity.

  3. YAdumper: extracting and translating large information volumes from relational databases to structured flat files.

    Science.gov (United States)

    Fernández, José M; Valencia, Alfonso

    2004-10-12

    Downloading the information stored in relational databases into XML and other flat formats is a common task in bioinformatics. This periodical dumping of information requires considerable CPU time, disk and memory resources. YAdumper has been developed as a purpose-specific tool to deal with the integral structured information download of relational databases. YAdumper is a Java application that organizes database extraction following an XML template based on an external Document Type Declaration. Compared with other non-native alternatives, YAdumper substantially reduces memory requirements and considerably improves writing performance.
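
    The general task, dumping relational rows into a structured XML file, can be sketched with the Python standard library as below; this illustrates the task YAdumper addresses, not its Java implementation or its DTD-driven template mechanism, and the table and schema are invented.

        import sqlite3
        import xml.etree.ElementTree as ET

        # In-memory table standing in for a real relational database.
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE protein (id INTEGER, name TEXT)")
        conn.executemany("INSERT INTO protein VALUES (?, ?)",
                         [(1, "BRCA1"), (2, "TP53")])

        # Stream rows into an XML tree and write it as a flat file.
        root = ET.Element("proteins")
        for row_id, name in conn.execute("SELECT id, name FROM protein ORDER BY id"):
            entry = ET.SubElement(root, "protein", id=str(row_id))
            entry.text = name

        ET.ElementTree(root).write("dump.xml", encoding="utf-8", xml_declaration=True)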

  4. Extracting breathing rate information from a wearable reflectance pulse oximeter sensor.

    Science.gov (United States)

    Johnston, W S; Mendelson, Y

    2004-01-01

    The integration of multiple vital physiological measurements could help combat medics and field commanders better predict a soldier's health condition and enhance their ability to perform remote triage procedures. In this paper we demonstrate the feasibility of extracting accurate breathing rate information from a photoplethysmographic signal recorded by a reflectance pulse oximeter sensor mounted on the forehead and subsequently processed by simple time-domain filtering and frequency-domain Fourier analysis.
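
    The frequency-domain step can be sketched as follows: band-limit the spectrum to a plausible respiratory range and read off the dominant FFT peak. The synthetic two-tone signal, sampling rate, and band edges are assumptions standing in for recorded sensor data.

        import numpy as np

        fs = 50.0                                   # sampling rate, Hz (assumed)
        t = np.arange(0, 60, 1 / fs)
        # Synthetic PPG-like signal: cardiac (1.2 Hz) plus respiratory (0.25 Hz).
        ppg = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 0.25 * t)

        spectrum = np.abs(np.fft.rfft(ppg))
        freqs = np.fft.rfftfreq(len(ppg), 1 / fs)
        band = (freqs > 0.1) & (freqs < 0.5)        # 6-30 breaths/min (assumed band)
        breath_hz = freqs[band][np.argmax(spectrum[band])]
        print(f"estimated breathing rate: {breath_hz * 60:.1f} breaths/min")  # ~15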

  5. Data Entities and Information System Matrix for Integrated Agriculture Information System (IAIS)

    Science.gov (United States)

    Budi Santoso, Halim; Delima, Rosa

    2018-03-01

    Integrated Agriculture Information System (IAIS) is a system developed to process data, information, and knowledge in the agriculture sector. It brings valuable information to farmers: (1) fertilizer prices; (2) agricultural techniques and practices; (3) pest management; (4) cultivation; (5) irrigation; (6) post-harvest processing; (7) innovation in agricultural processing. The system contains 9 subsystems. To bring integrated information to users and stakeholders, an integrated database approach is needed. The researchers therefore describe the data entities and the matrix relating them to the subsystems of the Integrated Agriculture Information System (IAIS). As a result, there are 47 data entities in a single, integrated database.

  6. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies... or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown... To improve the recall, while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of the pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity

  7. Transliteration normalization for Information Extraction and Machine Translation

    Directory of Open Access Journals (Sweden)

    Yuval Marton

    2014-12-01

    Full Text Available Foreign name transliterations typically include multiple spelling variants. These variants cause data sparseness and inconsistency problems, increase the Out-of-Vocabulary (OOV rate, and present challenges for Machine Translation, Information Extraction and other natural language processing (NLP tasks. This work aims to identify and cluster name spelling variants using a Statistical Machine Translation method: word alignment. The variants are identified by being aligned to the same “pivot” name in another language (the source-language in Machine Translation settings. Based on word-to-word translation and transliteration probabilities, as well as the string edit distance metric, names with similar spellings in the target language are clustered and then normalized to a canonical form. With this approach, tens of thousands of high-precision name transliteration spelling variants are extracted from sentence-aligned bilingual corpora in Arabic and English (in both languages. When these normalized name spelling variants are applied to Information Extraction tasks, improvements over strong baseline systems are observed. When applied to Machine Translation tasks, a large improvement potential is shown.
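
    A reduced sketch of the clustering-and-normalization step is shown below, using a plain string-similarity ratio in place of the combination of word-alignment probabilities and edit distance described above; the names and threshold are illustrative.

        from difflib import SequenceMatcher

        names = ["Mohammed", "Muhammad", "Mohamed", "Catherine", "Katherine"]

        def similar(a, b, threshold=0.75):
            """True if two spellings are close enough to be variants."""
            return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

        # Greedy single-pass clustering of spelling variants.
        clusters = []
        for name in names:
            for cluster in clusters:
                if similar(name, cluster[0]):
                    cluster.append(name)
                    break
            else:
                clusters.append([name])

        # Canonical form per cluster (here simply the first variant seen).
        for cluster in clusters:
            print(cluster[0], "<-", cluster)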

  8. Integration of auditory and visual speech information

    NARCIS (Netherlands)

    Hall, M.; Smeele, P.M.T.; Kuhl, P.K.

    1998-01-01

    The integration of auditory and visual speech is observed when modes specify different places of articulation. Influences of auditory variation on integration were examined using consonant identification, plus quality and similarity ratings. Auditory identification predicted auditory-visual

  9. Recognition techniques for extracting information from semistructured documents

    Science.gov (United States)

    Della Ventura, Anna; Gagliardi, Isabella; Zonta, Bruna

    2000-12-01

    Archives of optical documents are ever more widely employed, with demand driven also by new norms sanctioning the legal value of digital documents, provided they are stored on physically unalterable supports. On the supply side there is now a vast and technologically advanced market, where optical memories have solved the problem of the duration and permanence of data at costs comparable to those for magnetic memories. The remaining bottleneck in these systems is the indexing. The indexing of documents with a variable structure, while still not completely automated, can be machine-supported to a large degree, with evident advantages both in the organization of the work and in extracting information, providing data that are much more detailed and potentially significant for the user. We present here a system for the automatic registration of correspondence to and from a public office. The system is based on a general methodology for the extraction, indexing, archiving, and retrieval of significant information from semi-structured documents. This information, in our prototype application, is distributed among the database fields of sender, addressee, subject, date, and body of the document.

  10. Using text mining techniques to extract phenotypic information from the PhenoCHF corpus.

    Science.gov (United States)

    Alnazzawi, Noha; Thompson, Paul; Batista-Navarro, Riza; Ananiadou, Sophia

    2015-01-01

    Phenotypic information locked away in unstructured narrative text presents significant barriers to information accessibility, both for clinical practitioners and for computerised applications used for clinical research purposes. Text mining (TM) techniques have previously been applied successfully to extract different types of information from text in the biomedical domain. They have the potential to be extended to allow the extraction of information relating to phenotypes from free text. To stimulate the development of TM systems that are able to extract phenotypic information from text, we have created a new corpus (PhenoCHF) that is annotated by domain experts with several types of phenotypic information relating to congestive heart failure. To ensure that systems developed using the corpus are robust to multiple text types, it integrates text from heterogeneous sources, i.e., electronic health records (EHRs) and scientific articles from the literature. We have developed several different phenotype extraction methods to demonstrate the utility of the corpus, and tested these methods on a further corpus, i.e., ShARe/CLEF 2013. Evaluation of our automated methods showed that PhenoCHF can facilitate the training of reliable phenotype extraction systems, which are robust to variations in text type. These results have been reinforced by evaluating our trained systems on the ShARe/CLEF corpus, which contains clinical records of various types. Like other studies within the biomedical domain, we found that solutions based on conditional random fields produced the best results, when coupled with a rich feature set. PhenoCHF is the first annotated corpus aimed at encoding detailed phenotypic information. The unique heterogeneous composition of the corpus has been shown to be advantageous in the training of systems that can accurately extract phenotypic information from a range of different text types. Although the scope of our annotation is currently limited to a single
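
    As an illustration of the kind of approach the evaluation favoured, here is a minimal linear-chain CRF sketch for tagging phenotype mentions with BIO labels, assuming the third-party sklearn-crfsuite package is installed; the feature set and the toy training pair are illustrative, not the rich feature set used in the study:

```python
import sklearn_crfsuite  # assumed available: pip install sklearn-crfsuite

# Sketch: token-level features feed a linear-chain CRF that tags
# phenotype mentions with BIO labels.

def token_features(tokens, i):
    w = tokens[i]
    return {
        "lower": w.lower(),
        "is_title": w.istitle(),
        "suffix3": w[-3:],
        "prev": tokens[i - 1].lower() if i > 0 else "<s>",
        "next": tokens[i + 1].lower() if i < len(tokens) - 1 else "</s>",
    }

sentence = ["Patient", "has", "congestive", "heart", "failure", "."]
labels = ["O", "O", "B-PHEN", "I-PHEN", "I-PHEN", "O"]

X = [[token_features(sentence, i) for i in range(len(sentence))]]
y = [labels]

crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1,
                           max_iterations=50)
crf.fit(X, y)                 # toy training set: one annotated sentence
print(crf.predict(X)[0])
```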

  11. Seeds integrate biological information about conspecific and allospecific neighbours.

    Science.gov (United States)

    Yamawo, Akira; Mukai, Hiromi

    2017-06-28

    Numerous organisms integrate information from multiple sources and express adaptive behaviours, but how they do so at different developmental stages remains to be identified. Seeds, which are the embryonic stage of plants, need to make decisions about the timing of emergence in response to environmental cues related to survival. We investigated the timing of emergence of Plantago asiatica (Plantaginaceae) seed while manipulating the presence of Trifolium repens seed and the relatedness of neighbouring P. asiatica seed. The relatedness of neighbouring P. asiatica seed and the presence of seeds of T. repens did not on their own influence the timing of P. asiatica emergence. However, when encountering a T. repens seed, a P. asiatica seed emerged faster in the presence of a sibling seed than in the presence of a non-sibling seed. Water extracts of seeds gave the same result. We show that P. asiatica seeds integrate information about the relatedness of neighbouring P. asiatica seeds and the presence of seeds of a different species via water-soluble chemicals and adjust their emergence behaviour in response. These findings suggest the presence of kin-dependent interspecific interactions. © 2017 The Author(s).

  12. CSIR's new integrated electronic library information-system

    CSIR Research Space (South Africa)

    Michie, A

    1995-08-01

    Full Text Available The CSIR has developed a CDROM-based electronic library information system which provides the ability to reproduce and search for published information and colour brochures on the computer screen. The system integrates this information with online...

  13. Extraction of Urban Trees from Integrated Airborne Based Digital Image and LIDAR Point Cloud Datasets - Initial Results

    Science.gov (United States)

    Dogon-yaro, M. A.; Kumar, P.; Rahman, A. Abdul; Buyuksalih, G.

    2016-10-01

    Timely and accurate acquisition of information on the condition and structural changes of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building up strategies for sustainable development. The conventional techniques used for extracting tree features include ground surveying and interpretation of aerial photography. However, these techniques are associated with constraints, such as labour-intensive field work, high financial cost, and the influence of weather conditions and topographical cover, which can be overcome by means of integrated airborne-based LiDAR and very high resolution digital image datasets. This study presented a semi-automated approach for extracting urban trees from integrated airborne-based LiDAR and multispectral digital image datasets over Istanbul city, Turkey. The scheme includes detection and extraction of shadow-free vegetation features based on the spectral properties of the digital images, using shadow index and NDVI techniques, and automated extraction of 3D information about vegetation features from the integrated processing of the shadow-free vegetation image and the LiDAR point cloud datasets. The developed algorithms show promising results as an automated and cost-effective approach to estimating and delineating 3D information about urban trees. The research also proved that integrated datasets are a suitable technology and a viable source of information for city managers to use in urban tree management.
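
    A minimal sketch of the spectral step described above, assuming a four-band (R, G, B, NIR) image: pixels are kept when they pass an NDVI test and a simple brightness-based shadow test. The band layout and thresholds are illustrative assumptions; the study's actual shadow index and the LiDAR fusion step are not reproduced here.

```python
import numpy as np

# Sketch: keep pixels that look like vegetation (NDVI) and are not in
# shadow (crude brightness proxy standing in for a shadow index).

def vegetation_mask(img: np.ndarray,
                    ndvi_thresh: float = 0.3,
                    shadow_thresh: float = 0.15) -> np.ndarray:
    red, green, blue, nir = (img[..., i].astype(float) for i in range(4))
    ndvi = (nir - red) / (nir + red + 1e-9)      # vegetation index
    brightness = (red + green + blue) / 3.0      # shadow proxy
    return (ndvi > ndvi_thresh) & (brightness > shadow_thresh)

img = np.random.rand(64, 64, 4)                  # stand-in for a 4-band scene
mask = vegetation_mask(img)
print(f"vegetation pixels: {mask.sum()} of {mask.size}")
```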

  14. Autonomous Preference-Aware Information Services Integration for High Response in Integrated Faded Information Field Systems

    Science.gov (United States)

    Lu, Xiaodong; Mori, Kinji

    The market and users' requirements have been rapidly changing and diversified. Under these heterogeneous and dynamic situations, not only the system structure itself, but also the accessible information services would be changed constantly. To cope with the continuously changing conditions of service provision and utilization, Faded Information Field (FIF) has been proposed, which is an agent-based distributed information service system architecture. In the case of a mono-service request, the system is designed to improve users' access time and preserve load balancing through the information structure. However, as interdependent multi-service requests increase, adaptability and timeliness have to be assured by the system. In this paper, the relationship that exists among the correlated services and the users' preferences for separate and integrated services is clarified. Based on these factors, an autonomous preference-aware information services integration technology to provide one-stop service for users' multi-service requests is proposed. As compared to the conventional system, we show that the proposed technology is able to reduce the total access time.

  15. Depth extraction method with high accuracy in integral imaging based on moving array lenslet technique

    Science.gov (United States)

    Wang, Yao-yao; Zhang, Juan; Zhao, Xue-wei; Song, Li-pei; Zhang, Bo; Zhao, Xing

    2018-03-01

    In order to improve depth extraction accuracy, a method using the moving array lenslet technique (MALT) in the pickup stage is proposed, which can decrease the depth interval caused by pixelation. In this method, the lenslet array is moved along the horizontal and vertical directions simultaneously N times within one pitch to get N sets of elemental images. A computational integral imaging reconstruction method for MALT is used to obtain the slice images of the 3D scene, and the sum modulus difference (SMD) blur metric is applied to these slice images to obtain the depth information of the 3D scene. Simulation and optical experiments are carried out to verify the feasibility of this method.
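
    A minimal sketch of the depth-selection step, assuming reconstructed slice images are already available: a sum-modulus-difference (SMD) focus measure is computed per slice, and the sharpest slice indicates the depth. Slice generation from elemental images (and MALT itself) is outside the sketch; random arrays stand in for slices.

```python
import numpy as np

# Sketch: pick the reconstruction depth at which a slice image is sharpest,
# using a sum-modulus-difference focus measure.

def smd(img: np.ndarray) -> float:
    """Sum of absolute gray-level differences along both image axes."""
    dx = np.abs(np.diff(img, axis=0)).sum()
    dy = np.abs(np.diff(img, axis=1)).sum()
    return float(dx + dy)

depths = np.linspace(30.0, 90.0, 61)                 # candidate depths (mm)
slices = [np.random.rand(128, 128) for _ in depths]  # stand-in slice images

focus = [smd(s) for s in slices]
best = depths[int(np.argmax(focus))]                 # sharpest slice wins
print(f"estimated depth: {best:.1f} mm")
```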

  16. Integration of Information Technologies in Enterprise Application Development

    OpenAIRE

    Iulia SURUGIU

    2012-01-01

    Healthcare enterprises are disconnected. In the era of integrated information systems and Internet explosion, the necessity of information systems integration arises from business process evolution, on the one hand, and from information technology tendencies, on the other hand. In order to become more efficient and adaptive to change, healthcare organizations are tremendously preoccupied with business process automation, flexibility and complexity. The need of information systems integration ar...

  17. The NASA Integrated Information Technology Architecture

    Science.gov (United States)

    Baldridge, Tim

    1997-01-01

    of IT systems, 3) the Technical Architecture: a common, vendor-independent framework for design, integration and implementation of IT systems and 4) the Product Architecture: vendor-specific IT solutions. The Systems Architecture is effectively a description of the end-user "requirements". Generalized end-user requirements are discussed and subsequently organized into specific mission and project functions. The Technical Architecture depicts the framework, and relationship, of the specific IT components that enable the end-user functionality as described in the Systems Architecture. The primary components as described in the Technical Architecture are: 1) Applications: Basic Client Component, Object Creation Applications, Collaborative Applications, Object Analysis Applications, 2) Services: Messaging, Information Broker, Collaboration, Distributed Processing, and 3) Infrastructure: Network, Security, Directory, Certificate Management, Enterprise Management and File System. This Architecture also provides specific Implementation Recommendations, the most significant of which is the recognition of IT as core to NASA activities and defines a plan, which is aligned with the NASA strategic planning processes, for keeping the Architecture alive and useful.

  18. New microwave-integrated Soxhlet extraction. An advantageous tool for the extraction of lipids from food products.

    Science.gov (United States)

    Virot, Matthieu; Tomao, Valérie; Colnagui, Giulio; Visinoni, Franco; Chemat, Farid

    2007-12-07

    A new process of Soxhlet extraction assisted by microwave was designed and developed. The process is performed in four steps, which ensures complete, rapid and accurate extraction of the samples. A second-order central composite design (CCD) was used to investigate the performance of the new device. The results, provided by analysis of variance and a Pareto chart, indicated that the extraction time was the most important factor, followed by the leaching time. The response surface methodology allowed us to determine optimal conditions for olive oil extraction: 13 min of extraction time, 17 min of leaching time, and 720 W of irradiation power. The proposed process is suitable for the determination of lipids in food. Microwave-integrated Soxhlet (MIS) extraction has been compared with a conventional technique, Soxhlet extraction, for the extraction of oil from olives (Aglandau, Vaucluse, France). The oils extracted by MIS for 32 min were quantitatively (yield) and qualitatively (fatty acid composition) similar to those obtained by conventional Soxhlet extraction for 8 h. MIS is a green technology and appears to be a good alternative for the extraction of fats and oils from food products.

  19. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories.

    Science.gov (United States)

    Yang, Wei; Ai, Tinghua; Lu, Wei

    2018-04-19

    Crowdsourcing trajectory data is an important approach for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourcing vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and interpolate the optimized segments adaptively to ensure there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors using the area of the Voronoi cell and the length of the triangle edge. The road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) within the DT. Third, the boundary detection model is used to detect the road boundary from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting the road boundary from low-frequency GPS traces, multi-type road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information was proved to be of higher quality.

  20. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories

    Directory of Open Access Journals (Sweden)

    Wei Yang

    2018-04-01

    Full Text Available Crowdsourcing trajectory data is an important approach for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourcing vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and interpolate the optimized segments adaptively to ensure there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors using the area of the Voronoi cell and the length of the triangle edge. The road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) within the DT. Third, the boundary detection model is used to detect the road boundary from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting the road boundary from low-frequency GPS traces, multi-type road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information was proved to be of higher quality.
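
    A minimal sketch of the geometric core shared by both copies of this record, assuming scipy is available: tracking points are triangulated and unusually long Delaunay edges are flagged as boundary candidates. The synthetic points and the percentile threshold are illustrative; the paper additionally uses Voronoi cell area, movement direction, and seed-polygon region growing.

```python
import numpy as np
from scipy.spatial import Delaunay

# Sketch: triangulate GPS tracking points and use triangle edge length as
# a boundary descriptor - long edges tend to bridge the empty space
# beyond the road edge.

pts = np.random.rand(500, 2) * 100.0      # stand-in for tracking points (m)
tri = Delaunay(pts)

edges = set()
for a, b, c in tri.simplices:             # collect unique triangle edges
    for u, v in ((a, b), (b, c), (a, c)):
        edges.add((min(u, v), max(u, v)))

lengths = {e: np.linalg.norm(pts[e[0]] - pts[e[1]]) for e in edges}
thresh = np.percentile(list(lengths.values()), 95)
boundary_edges = [e for e, length in lengths.items() if length > thresh]
print(f"{len(boundary_edges)} candidate boundary edges of {len(edges)}")
```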

  1. Automated extraction of chemical structure information from digital raster images

    Directory of Open Access Journals (Sweden)

    Shedden Kerby A

    2009-02-01

    Full Text Available Abstract Background To search for chemical structures in research articles, diagrams or text representing molecules need to be translated to a standard chemical file format compatible with cheminformatic search engines. Nevertheless, chemical information contained in research articles is often referenced as analog diagrams of chemical structures embedded in digital raster images. To automate analog-to-digital conversion of chemical structure diagrams in scientific research articles, several software systems have been developed. But their algorithmic performance and utility in cheminformatic research have not been investigated. Results This paper aims to provide critical reviews for these systems and also report our recent development of ChemReader – a fully automated tool for extracting chemical structure diagrams in research articles and converting them into standard, searchable chemical file formats. Basic algorithms for recognizing lines and letters representing bonds and atoms in chemical structure diagrams can be independently run in sequence from a graphical user interface, and the algorithm parameters can be readily changed, to facilitate additional development specifically tailored to a chemical database annotation scheme. Compared with existing software programs such as OSRA, Kekule, and CLiDE, our results indicate that ChemReader outperforms other software systems on several sets of sample images from diverse sources in terms of the rate of correct outputs and the accuracy on extracting molecular substructure patterns. Conclusion The availability of ChemReader as a cheminformatic tool for extracting chemical structure information from digital raster images allows research and development groups to enrich their chemical structure databases by annotating the entries with published research articles. Based on its stable performance and high accuracy, ChemReader may be sufficiently accurate for annotating the chemical database with links

  2. Road Network Extraction from VHR Satellite Images Using Context Aware Object Feature Integration and Tensor Voting

    Directory of Open Access Journals (Sweden)

    Mehdi Maboudi

    2016-08-01

    Full Text Available Road networks are very important features in geospatial databases. Even though high-resolution optical satellite images have already been acquired for more than a decade, tools for automated extraction of road networks from these images are still rare. One consequence of this is the need for manual interaction which, in turn, is time and cost intensive. In this paper, a multi-stage approach is proposed which integrates structural, spectral, textural, as well as contextual information of objects to extract road networks from very high resolution satellite images. Highlights of the approach are a novel linearity index employed for the discrimination of elongated road segments from other objects and customized tensor voting which is utilized to fill missing parts of the network. Experiments are carried out with different datasets. Comparison of the achieved results with the results of seven state-of-the-art methods demonstrated the efficiency of the proposed approach.

  3. Information Technologies and Supply Chain Integration

    DEFF Research Database (Denmark)

    Lemoine, W; Mortensen, Ole

    The goal of the Supply Chain Management process is to create value for customers, stakeholders and all supply chain members, through the integration of different processes like manufacturing flow management, customer service and order fulfillment. However, many firms fail in the path of achieving...... integration. This study illustrates, from an empirical point of view, the problems associated to SC integration among European firms operating in global/international markets. The focus is on the relationship between two echelons in the supply chain: manufacturers and their transport and logistics service...... Our results show that the current business integration practices between manufacturers and TLSPs are primarily restricted to some sub-processes in three key SC processes: Customer Service Management, order fulfillment and backwards logistics. The use of IT tools to support the integration has......

  4. Understanding Information Systems Integration Deficiencies in Mergers and Acquisitions

    DEFF Research Database (Denmark)

    Henningsson, Stefan; Kettinger, William J.

    2017-01-01

    Information systems (IS) integration is a critical challenge for value-creating mergers and acquisitions. Appropriate design and implementation of IS integration is typically a precondition for enabling a majority of the anticipated business benefits of a combined organization. Often...

  5. Knowledge and information management for integrated water resource management

    Science.gov (United States)

    Watershed information systems that integrate data and analytical tools are critical enabling technologies to support Integrated Water Resource Management (IWRM) by converting data into information, and information into knowledge. Many factors bring people to the table to participate in an IWRM fra...

  6. Academic Integrity: Information Systems Education Perspective

    Science.gov (United States)

    McHaney, Roger; Cronan, Timothy Paul; Douglas, David E.

    2016-01-01

    Academic integrity receives a great deal of attention in institutions of higher education. Universities and colleges provide specific honor codes or have administrative units to promote good behaviors and resolve dishonesty allegations. Students, faculty, and staff have stakes in maintaining high levels of academic integrity to ensure their…

  7. Principles and core functions of integrated child health information systems.

    Science.gov (United States)

    Hinman, Alan R; Atkinson, Delton; Diehn, Tonya Norvell; Eichwald, John; Heberer, Jennifer; Hoyle, Therese; King, Pam; Kossack, Robert E; Williams, Donna C; Zimmerman, Amy

    2004-11-01

    Infants undergo a series of preventive and therapeutic health interventions and activities. Typically, each activity includes collection and submission of data to a dedicated information system. Subsequently, health care providers, families, and health programs must query each information system to determine the child's status in a given area. Efforts are underway to integrate information in these separate information systems. This requires specifying the core functions that integrated information systems must perform.

  8. Regional Logistics Information Resources Integration Patterns and Countermeasures

    Science.gov (United States)

    Wu, Hui; Shangguan, Xu-ming

    Effective integration of regional logistics information resources can provide collaborative services in information flow, business flow and logistics for regional logistics enterprises, and can also reduce operating costs and improve market responsiveness. First, this paper analyzes the practical significance of integrating regional logistics information. Second, it puts forward three feasible patterns for the integration of regional logistics information resources; these three models have their own strengths and scopes of application and implementation, and which model is selected will depend on the specific business and the regional distribution of enterprises. Last, it discusses the related countermeasures for the integration of regional logistics information resources: because this integration is a systems engineering effort, as it advances the countermeasures should pay close attention to the current needs and long-term development of regional enterprises.

  9. Integration issues of information engineering based I-CASE tools

    OpenAIRE

    Kurbel, Karl; Schnieder, Thomas

    1994-01-01

    Problems and requirements regarding integration of methods and tools across phases of the software-development life cycle are discussed. Information engineering (IE) methodology and I-CASE (integrated CASE) tools supporting IE claim to have an integrated view across major stages of enterprise-wide information-system development: information strategy planning, business area analysis, system design, and construction. In the main part of this paper, two comprehensive I-CASE tools, ADW (Applicati...

  10. Information Security Maturity as an Integral Part of ISMS based Risk Management Tools

    NARCIS (Netherlands)

    Fetler, Ben; Harpes, Carlo

    2016-01-01

    Measuring the continuous improvement of Information Security Management Systems (ISMS) is often neglected as most organizations do not know how to extract key-indicators that could be used for this purpose. The underlying work presents a six-level maturity model which can be fully integrated in a

  11. Information Extraction for Clinical Data Mining: A Mammography Case Study.

    Science.gov (United States)

    Nassif, Houssam; Woods, Ryan; Burnside, Elizabeth; Ayvaci, Mehmet; Shavlik, Jude; Page, David

    2009-01-01

    Breast cancer is the leading cause of cancer mortality in women between the ages of 15 and 54. During mammography screening, radiologists use a strict lexicon (BI-RADS) to describe and report their findings. Mammography records are then stored in a well-defined database format (NMD). Lately, researchers have applied data mining and machine learning techniques to these databases. They successfully built breast cancer classifiers that can help in early detection of malignancy. However, the validity of these models depends on the quality of the underlying databases. Unfortunately, most databases suffer from inconsistencies, missing data, inter-observer variability and inappropriate term usage. In addition, many databases are not compliant with the NMD format and/or solely consist of text reports. BI-RADS feature extraction from free text and consistency checks between recorded predictive variables and text reports are crucial to addressing this problem. We describe a general scheme for concept information retrieval from free text given a lexicon, and present a BI-RADS features extraction algorithm for clinical data mining. It consists of a syntax analyzer, a concept finder and a negation detector. The syntax analyzer preprocesses the input into individual sentences. The concept finder uses a semantic grammar based on the BI-RADS lexicon and the experts' input. It parses sentences detecting BI-RADS concepts. Once a concept is located, a lexical scanner checks for negation. Our method can handle multiple latent concepts within the text, filtering out ultrasound concepts. On our dataset, our algorithm achieves 97.7% precision, 95.5% recall and an F1-score of 0.97. It outperforms manual feature extraction at the 5% statistical significance level.
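
    A minimal sketch of the described three-stage pipeline (sentence splitting, concept finding, negation detection), with a toy lexicon and negation cues standing in for the full BI-RADS vocabulary and the semantic grammar:

```python
import re

# Sketch: split a report into sentences, find lexicon concepts, then check
# a small window before the concept for negation cues.

LEXICON = {"mass", "calcification", "architectural distortion", "asymmetry"}
NEGATION_CUES = ("no", "without", "absence of", "negative for")

def extract(report: str):
    findings = []
    for sentence in re.split(r"(?<=[.!?])\s+", report.lower()):
        for concept in LEXICON:
            if concept in sentence:
                prefix = sentence.split(concept)[0][-40:]  # negation window
                negated = any(cue in prefix for cue in NEGATION_CUES)
                findings.append((concept, "absent" if negated else "present"))
    return findings

report = ("There is a spiculated mass in the upper outer quadrant. "
          "No suspicious calcification is seen.")
print(extract(report))  # [('mass', 'present'), ('calcification', 'absent')]
```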

  12. INFORMATION EXTRACTION IN TOMB PIT USING HYPERSPECTRAL DATA

    Directory of Open Access Journals (Sweden)

    X. Yang

    2018-04-01

    Full Text Available Hyperspectral data is characterized by multiple continuous bands, large data volume, redundancy, and non-destructive acquisition. These characteristics make it possible to use hyperspectral data to study cultural relics. In this paper, hyperspectral imaging technology is adopted to recognize the bottom images of an ancient tomb located in Shanxi province. There are many black remains on the bottom surface of the tomb, which are suspected to be meaningful texts or paintings. Firstly, the hyperspectral data is preprocessed to obtain the reflectance of the region of interest. For convenience of computation and storage, the original reflectance value is multiplied by 10000. Secondly, this article uses three methods to extract the symbols at the bottom of the ancient tomb. Finally, we used morphology to connect the symbols and produced fifteen reference images. The results show that the extraction of information based on hyperspectral data can provide a better visual experience, which is beneficial to the study of ancient tombs by researchers, and provides some references for archaeological research findings.

  13. Information Extraction in Tomb Pit Using Hyperspectral Data

    Science.gov (United States)

    Yang, X.; Hou, M.; Lyu, S.; Ma, S.; Gao, Z.; Bai, S.; Gu, M.; Liu, Y.

    2018-04-01

    Hyperspectral data is characterized by multiple continuous bands, large data volume, redundancy, and non-destructive acquisition. These characteristics make it possible to use hyperspectral data to study cultural relics. In this paper, hyperspectral imaging technology is adopted to recognize the bottom images of an ancient tomb located in Shanxi province. There are many black remains on the bottom surface of the tomb, which are suspected to be meaningful texts or paintings. Firstly, the hyperspectral data is preprocessed to obtain the reflectance of the region of interest. For convenience of computation and storage, the original reflectance value is multiplied by 10000. Secondly, this article uses three methods to extract the symbols at the bottom of the ancient tomb. Finally, we used morphology to connect the symbols and produced fifteen reference images. The results show that the extraction of information based on hyperspectral data can provide a better visual experience, which is beneficial to the study of ancient tombs by researchers, and provides some references for archaeological research findings.
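
    A minimal sketch of the extraction and morphology steps described in both copies of this record, assuming scipy is available: pixels whose scaled reflectance in a diagnostic band falls inside a preset window are flagged as symbol residue, then morphological closing reconnects broken strokes. The band index and thresholds are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy import ndimage

# Sketch: spectral-window thresholding on one band of a reflectance cube
# (scaled by 10000, as in the paper), followed by morphological closing.

cube = np.random.randint(0, 10000, size=(200, 200, 120))  # reflectance x 10000
band = cube[:, :, 57]                                     # diagnostic band

symbol_mask = (band > 1500) & (band < 3500)               # spectral window
connected = ndimage.binary_closing(symbol_mask,
                                   structure=np.ones((3, 3)),
                                   iterations=2)          # reconnect strokes

labels, n = ndimage.label(connected)                      # group symbols
print(f"{n} connected symbol regions extracted")
```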

  14. Automated Extraction of Substance Use Information from Clinical Texts.

    Science.gov (United States)

    Wang, Yan; Chen, Elizabeth S; Pakhomov, Serguei; Arsoniadis, Elliot; Carter, Elizabeth W; Lindemann, Elizabeth; Sarkar, Indra Neil; Melton, Genevieve B

    2015-01-01

    Within clinical discourse, social history (SH) includes important information about substance use (alcohol, drug, and nicotine use) as key risk factors for disease, disability, and mortality. In this study, we developed and evaluated a natural language processing (NLP) system for automated detection of substance use statements and extraction of substance use attributes (e.g., temporal and status) based on Stanford Typed Dependencies. The developed NLP system leveraged linguistic resources and domain knowledge from a multi-site social history study, PropBank and the MiPACQ corpus. The system attained F-scores of 89.8, 84.6 and 89.4 respectively for alcohol, drug, and nicotine use statement detection, as well as average F-scores of 82.1, 90.3, 80.8, 88.7, 96.6, and 74.5 respectively for extraction of attributes. Our results suggest that NLP systems can achieve good performance when augmented with linguistic resources and domain knowledge and applied to a wide breadth of substance use free-text clinical notes.

  15. Information delivery manuals to integrate building product information into design

    DEFF Research Database (Denmark)

    Berard, Ole Bengt; Karlshøj, Jan

    2013-01-01

    Despite continuing BIM progress, professionals in the AEC industry often lack the information they need to perform their work. Although this problem could be alleviated by information systems similar to those in other industries, companies struggle to model processes and information needs in the manner necessary to develop information systems that support digital collaboration, workflows, and information exchange. Processes for information systems can be described from four perspectives: task sequence, information need, organizational interaction, and required logic for the specific task.... Traditional business process modeling languages often fail to completely cover all four perspectives. BuildingSMART has proposed Information Delivery Manuals (IDMs) to model and re-engineer processes that address the four perspectives through a collaborative methodology in order to standardize and implement...

  16. Domain-independent information extraction in unstructured text

    Energy Technology Data Exchange (ETDEWEB)

    Irwin, N.H. [Sandia National Labs., Albuquerque, NM (United States). Software Surety Dept.

    1996-09-01

    Extracting information from unstructured text has become an important research area in recent years due to the large amount of text now electronically available. This status report describes the findings and work done during the second year of a two-year Laboratory Directed Research and Development Project. Building on the first year's work of identifying important entities, this report details techniques used to group words into semantic categories and to output templates containing selective document content. Using word profiles and category clustering derived during a training run, the time-consuming knowledge-building task can be avoided. Though the output still lacks completeness when compared to systems with domain-specific knowledge bases, the results do look promising. The two approaches are compatible and could complement each other within the same system. Domain-independent approaches retain appeal, as a system that adapts and learns will soon outpace a system with any amount of a priori knowledge.

  17. Extracting and Using Photon Polarization Information in Radiative B Decays

    Energy Technology Data Exchange (ETDEWEB)

    Grossman, Yuval

    2000-05-09

    The authors discuss the uses of conversion electron pairs for extracting photon polarization information in weak radiative B decays. Both cases of leptons produced through a virtual and real photon are considered. Measurements of the angular correlation between the (K π) and (e+ e−) decay planes in B → K*(→ K π) γ(*)(→ e+ e−) decays can be used to determine the helicity amplitudes in the radiative B → K* γ decays. A large right-handed helicity amplitude in B-bar decays is a signal of new physics. The time-dependent CP asymmetry in the B0 decay angular correlation is shown to measure sin 2β and cos 2β with little hadronic uncertainty.

  18. Extraction of neutron spectral information from Bonner-Sphere data

    CERN Document Server

    Haney, J H; Zaidins, C S

    1999-01-01

    We have extended a least-squares method of extracting neutron spectral information from Bonner-sphere data which was previously developed by Zaidins et al. (Med. Phys. 5 (1978) 42). A pulse-height analysis with background stripping was employed, which provided a more accurate count rate for each sphere. Newer response curves by Mares and Schraube (Nucl. Instr. and Meth. A 366 (1994) 461) were included for the moderating spheres and the bare detector which comprise the Bonner spectrometer system. Finally, the neutron energy spectrum of interest was divided, using the philosophy of fuzzy logic, into three trapezoidal regimes corresponding to slow, moderate, and fast neutrons. Spectral data were taken using a PuBe source in two different environments and the analyzed data are presented for these cases as slow, moderate, and fast neutron fluences. (author)
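
    A minimal sketch of the unfolding step, assuming scipy is available: background-stripped count rates are inverted through a response matrix with non-negative least squares to give fluences in the three coarse regimes. The response matrix values below are made up for illustration; real ones would come from the cited response curves.

```python
import numpy as np
from scipy.optimize import nnls

# Sketch: solve R @ phi ~= counts for non-negative fluences in three
# coarse energy regimes (slow, moderate, fast).

# rows: bare detector + three moderating spheres; cols: slow, moderate, fast
R = np.array([[0.90, 0.30, 0.05],
              [0.40, 0.70, 0.20],
              [0.15, 0.60, 0.55],
              [0.05, 0.25, 0.80]])

counts = np.array([120.0, 150.0, 160.0, 140.0])  # background-stripped rates

phi, residual = nnls(R, counts)                  # non-negative least squares
for name, f in zip(("slow", "moderate", "fast"), phi):
    print(f"{name:8s} fluence: {f:7.1f}")
print(f"fit residual: {residual:.2f}")
```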

  19. ONTOGRABBING: Extracting Information from Texts Using Generative Ontologies

    DEFF Research Database (Denmark)

    Nilsson, Jørgen Fischer; Szymczak, Bartlomiej Antoni; Jensen, P.A.

    2009-01-01

    We describe principles for extracting information from texts using a so-called generative ontology in combination with syntactic analysis. Generative ontologies are introduced as semantic domains for natural language phrases. Generative ontologies extend ordinary finite ontologies with rules...... for producing recursively shaped terms representing the ontological content (ontological semantics) of NL noun phrases and other phrases. We focus here on achieving a robust, often only partial, ontology-driven parsing of and ascription of semantics to a sentence in the text corpus. The aim of the ontological...... analysis is primarily to identify paraphrases, thereby achieving a search functionality beyond mere keyword search with synsets. We further envisage use of the generative ontology as a phrase-based rather than word-based browser into text corpora....

  20. Assessment of Integrated Information System (IIS) in organization ...

    African Journals Online (AJOL)

    Assessment of Integrated Information System (IIS) in organization. ... to enable the Information System (IS) managers, as well as top management to understand the ... since organisational and strategic aspects in IIS should also be considered.

  1. Environment, safety, and health information technology systems integration.

    Energy Technology Data Exchange (ETDEWEB)

    Hendrickson, David A.; Bayer, Gregory W.

    2006-02-01

    The ES&H Information Systems department, motivated by the numerous isolated information technology systems under its control, undertook a significant integration effort. This effort was planned and executed over the course of several years and parts of it still continue today. The effect was to help move the ES&H Information Systems department toward integration with the corporate Information Solutions and Services center.

  2. Information extraction and knowledge graph construction from geoscience literature

    Science.gov (United States)

    Wang, Chengbin; Ma, Xiaogang; Chen, Jianguo; Chen, Jingwen

    2018-03-01

    Geoscience literature published online is an important part of open data, and brings both challenges and opportunities for data analysis. Compared with studies of numerical geoscience data, there are limited works on information extraction and knowledge discovery from textual geoscience data. This paper presents a workflow and a few empirical case studies for that topic, with a focus on documents written in Chinese. First, we set up a hybrid corpus combining the generic and geology terms from geology dictionaries to train Chinese word segmentation rules of the Conditional Random Fields model. Second, we used the word segmentation rules to parse documents into individual words, and removed the stop-words from the segmentation results to get a corpus constituted of content-words. Third, we used a statistical method to analyze the semantic links between content-words, and we selected the chord and bigram graphs to visualize the content-words and their links as nodes and edges in a knowledge graph, respectively. The resulting graph presents a clear overview of key information in an unstructured document. This study proves the usefulness of the designed workflow, and shows the potential of leveraging natural language processing and knowledge graph technologies for geoscience.
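
    A minimal sketch of the graph-building step, assuming segmentation and stop-word removal have already produced content-word lists: adjacent pairs are counted and frequent pairs become knowledge-graph edges. The English tokens stand in for segmented Chinese geology text (Python 3.10+ for itertools.pairwise).

```python
from collections import Counter
from itertools import pairwise  # Python 3.10+

# Sketch: count adjacent content-word pairs across documents and keep the
# strongest links as edges of a bigram knowledge graph.

documents = [
    ["porphyry", "copper", "deposit", "alteration", "zoning"],
    ["copper", "deposit", "hydrothermal", "alteration"],
    ["porphyry", "copper", "mineralization", "alteration", "zoning"],
]

bigrams = Counter()
for tokens in documents:
    bigrams.update(pairwise(tokens))     # adjacent content-word pairs

# Nodes are content-words; edges are the most frequent co-occurrence links.
edges = [(u, v, n) for (u, v), n in bigrams.items() if n >= 2]
print(sorted(edges, key=lambda e: -e[2]))
```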

  3. Legal Issues for an Integrated Information Center.

    Science.gov (United States)

    Rees, Warren; And Others

    1991-01-01

    The ability to collect, store, retrieve, and combine information in computerized databases has magnified the potential for misuse of information. Laws have begun to deal with these new threats by expanding rights of privacy, copyright, misrepresentation, products liability, and defamation. Laws regarding computerized databases are certain to…

  4. Integrated Reporting and Assurance of Sustainability Information: An Experimental Study on Professional Investors’ Information Processing

    NARCIS (Netherlands)

    Reimsbach, D.; Hahn, R.; Gürtürk, A.

    2018-01-01

    Sustainability-related non-financial information is increasingly deemed value relevant. Against this background, two recent trends in non-financial reporting are frequently discussed: integrated reporting and assurance of sustainability information. Using an established framework of information

  5. Integration of Information Technologies in Enterprise Application Development

    Directory of Open Access Journals (Sweden)

    Iulia SURUGIU

    2012-05-01

    Full Text Available Healthcare enterprises are disconnected. In the era of integrated information systems and Internet explosion, the necessity of information systems integration arises from business process evolution, on the one hand, and from information technology tendencies, on the other hand. In order to become more efficient and adaptive to change, healthcare organizations are tremendously preoccupied with business process automation, flexibility and complexity. The need for information systems integration arises from these goals, explaining, at the same time, the special interest in EAI. Extensible software integration architectures and business orientation of process modeling and information systems functionalities, the same as open-connectivity, accessibility and virtualization, lead to the most suitable integration solutions: SOA and BPM architectural styles in a cloud computing environment.

  6. Integrating Information Networks for Collective Planetary Stewardship

    Science.gov (United States)

    Tiwari, A.

    2016-12-01

    Responsible behaviour resulting from climate literacy in the global environmental movement is limited to policy and planning institutions in the Global South, while remaining absent for end-users. Thus, planetary stewardship exists only at earth system boundaries, where pressures sink to the local scale while ethics remains afloat. Existing citizen participation is restricted to policy spheres, appearing synonymous with enforcement in social psychology. A much-cited reason is that existing information mechanisms operate mostly through linear exchanges between institutions and users, therefore reinforcing only hierarchical relationships. This study discloses such relationships, which contribute to broad networking gaps, through an information demand assessment of stakeholders in a dozen development projects based in South Asia. Two parameters widely used for this purpose are: a. Feedback: end-user feedback to improve consumption literacy of climate-sensitive resources (through consumption displays, billing, advisory services, ecolabelling, sensors); and b. Institutional Policy: rewarding and punishing to enforce desired behaviour (subsidies, taxation). The research asked: 1. Who gets the information (Equity in Information Distribution)?, since existing information publishing mechanisms are designed by and for analysts; 2. How does information translate to climate action (Transparency of Execution)? Findings suggested that climate goals manifested in economic policy, rather than environmental policy, have clear potential short-term benefits and costs, and coincide with people's economic goals. Also, grassroots roles for responsible behaviour are empowered by the presence of end-user information. A barrier-free climate communication process and decision making is ensured among a multiplicity of stakeholders with often conflicting perspectives. The research finds significance where collaboration among information networks can better translate regional policies into local action for climate adaptation and

  7. Data Assimilation to Extract Soil Moisture Information from SMAP Observations

    Directory of Open Access Journals (Sweden)

    Jana Kolassa

    2017-11-01

    Full Text Available This study compares different methods to extract soil moisture information through the assimilation of Soil Moisture Active Passive (SMAP) observations. Neural network (NN) and physically-based SMAP soil moisture retrievals were assimilated into the National Aeronautics and Space Administration (NASA) Catchment model over the contiguous United States for April 2015 to March 2017. By construction, the NN retrievals are consistent with the global climatology of the Catchment model soil moisture. Assimilating the NN retrievals without further bias correction improved the surface and root zone correlations against in situ measurements from 14 SMAP core validation sites (CVS) by 0.12 and 0.16, respectively, over the model-only skill, and reduced the surface and root zone unbiased root-mean-square error (ubRMSE) by 0.005 m³ m⁻³ and 0.001 m³ m⁻³, respectively. The assimilation reduced the average absolute surface bias against the CVS measurements by 0.009 m³ m⁻³, but increased the root zone bias by 0.014 m³ m⁻³. Assimilating the NN retrievals after a localized bias correction yielded slightly lower surface correlation and ubRMSE improvements, but generally the skill differences were small. The assimilation of the physically-based SMAP Level-2 passive soil moisture retrievals using a global bias correction yielded similar skill improvements, as did the direct assimilation of locally bias-corrected SMAP brightness temperatures within the SMAP Level-4 soil moisture algorithm. The results show that global bias correction methods may be able to extract more independent information from SMAP observations compared to local bias correction methods, but without accurate quality control and observation error characterization they are also more vulnerable to adverse effects from retrieval errors related to uncertainties in the retrieval inputs and algorithm. Furthermore, the results show that using global bias correction approaches without a
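
    A minimal sketch contrasting the two rescaling strategies discussed above, with synthetic data standing in for SMAP retrievals and the Catchment model: a global correction fits one mean/variance mapping over the whole domain, while a local correction fits one per grid cell.

```python
import numpy as np

# Sketch: global vs local mean/variance rescaling of retrievals toward a
# model climatology, a common simple form of bias correction.

rng = np.random.default_rng(0)
model = rng.normal(0.25, 0.05, size=(1000, 50))        # cells x time steps
retrieval = 0.8 * model + 0.08 + rng.normal(0, 0.02, model.shape)

def rescale(x, ref, axis=None):
    """Match the mean and standard deviation of x to those of ref."""
    mu_x, sd_x = x.mean(axis=axis, keepdims=True), x.std(axis=axis, keepdims=True)
    mu_r, sd_r = ref.mean(axis=axis, keepdims=True), ref.std(axis=axis, keepdims=True)
    return (x - mu_x) * (sd_r / sd_x) + mu_r

global_bc = rescale(retrieval, model)           # one mapping for all cells
local_bc = rescale(retrieval, model, axis=1)    # one mapping per cell
print(np.abs(global_bc - model).mean(), np.abs(local_bc - model).mean())
```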

  8. INEL Waste and Environmental Information Integration Project approach and concepts

    International Nuclear Information System (INIS)

    Dean, L.A.; Fairbourn, P.J.; Randall, V.C.; Riedesel, A.M.

    1994-06-01

    The Idaho National Engineering Laboratory (INEL) Waste and Environmental Information Integration Project (IWEIIP) was established in December 1993 to address issues related to INEL waste and environmental information, including: data quality; data redundancy; data accessibility; and data integration. This effort includes existing information, new development, and acquisition activities. Existing information may not be a database record; it may be an entire document (electronic, scanned, or hard-copy), a video clip, or a file cabinet of information. The IWEIIP will implement an effective integrated information framework to manage INEL waste and environmental information as an asset. This will improve data quality, resolve data redundancy, and increase data accessibility, thereby providing more effective utilization of the dollars spent on waste and environmental information

  9. The Dilution Effect and Information Integration in Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Jared M Hotaling

    Full Text Available In cognitive science there is a seeming paradox: On the one hand, studies of human judgment and decision making have repeatedly shown that people systematically violate optimal behavior when integrating information from multiple sources. On the other hand, optimal models, often Bayesian, have been successful at accounting for information integration in fields such as categorization, memory, and perception. This apparent conflict could be due, in part, to different materials and designs that lead to differences in the nature of processing. Stimuli that require controlled integration of information, such as the quantitative or linguistic information (commonly found in judgment studies), may lead to suboptimal performance. In contrast, perceptual stimuli may lend themselves to automatic processing, resulting in integration that is closer to optimal. We tested this hypothesis with an experiment in which participants categorized faces based on resemblance to a family patriarch. The amount of evidence contained in the top and bottom halves of each test face was independently manipulated. These data allow us to investigate a canonical example of sub-optimal information integration from the judgment and decision making literature, the dilution effect. Splitting the top and bottom halves of a face, a manipulation meant to encourage controlled integration of information, produced farther from optimal behavior and larger dilution effects. The Multi-component Information Accumulation model, a hybrid optimal/averaging model of information integration, successfully accounts for key accuracy, response time, and dilution effects.

  10. The Dilution Effect and Information Integration in Perceptual Decision Making.

    Science.gov (United States)

    Hotaling, Jared M; Cohen, Andrew L; Shiffrin, Richard M; Busemeyer, Jerome R

    2015-01-01

    In cognitive science there is a seeming paradox: On the one hand, studies of human judgment and decision making have repeatedly shown that people systematically violate optimal behavior when integrating information from multiple sources. On the other hand, optimal models, often Bayesian, have been successful at accounting for information integration in fields such as categorization, memory, and perception. This apparent conflict could be due, in part, to different materials and designs that lead to differences in the nature of processing. Stimuli that require controlled integration of information, such as the quantitative or linguistic information (commonly found in judgment studies), may lead to suboptimal performance. In contrast, perceptual stimuli may lend themselves to automatic processing, resulting in integration that is closer to optimal. We tested this hypothesis with an experiment in which participants categorized faces based on resemblance to a family patriarch. The amount of evidence contained in the top and bottom halves of each test face was independently manipulated. These data allow us to investigate a canonical example of sub-optimal information integration from the judgment and decision making literature, the dilution effect. Splitting the top and bottom halves of a face, a manipulation meant to encourage controlled integration of information, produced farther from optimal behavior and larger dilution effects. The Multi-component Information Accumulation model, a hybrid optimal/averaging model of information integration, successfully accounts for key accuracy, response time, and dilution effects.
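
    A minimal numerical sketch of the contrast at issue in both copies of this record: optimal Bayesian integration multiplies likelihood ratios, so a non-diagnostic cue leaves strong evidence intact, whereas an averaging rule dilutes it. The likelihood-ratio values are illustrative, not fitted model parameters.

```python
# Sketch of the dilution effect: combine a strong cue (LR = 9) with a
# neutral cue (LR = 1) under Bayesian vs averaging integration rules.

lr_top, lr_bottom = 9.0, 1.0           # strong top-half cue, neutral bottom half

bayes = lr_top * lr_bottom             # optimal combination: still 9.0
averaged = (lr_top + lr_bottom) / 2.0  # averaging rule: diluted to 5.0

prior_odds = 1.0                       # equal prior odds for the two categories
print(f"Bayesian posterior odds:  {prior_odds * bayes:.1f}")
print(f"Averaging posterior odds: {prior_odds * averaged:.1f}")
```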

  11. From remote sensing data via information extraction to 3D geovisualization - Development of a workflow

    International Nuclear Information System (INIS)

    Tiede, D.

    2010-01-01

    With the increased availability of high (spatial) resolution remote sensing imagery since the late nineties, the need to develop operative workflows for the automated extraction, provision and communication of information from such data has grown. Monitoring requirements, aimed at the implementation of environmental or conservation targets, management of (environmental) resources, and regional planning, as well as international initiatives, especially the joint initiative of the European Commission and ESA (European Space Agency) for Global Monitoring for Environment and Security (GMES), also play a major part. This thesis addresses the development of an integrated workflow for the automated provision of information derived from remote sensing data. Considering the applied data and fields of application, this work aims to design the workflow as generically as possible. The following research questions are discussed: What are the requirements of a workflow architecture that seamlessly links the individual workflow elements in a timely manner and effectively secures the accuracy of the extracted information? How can the workflow retain its efficiency when large volumes of data are processed? How can the workflow be improved with regard to automated object-based image analysis (OBIA)? Which recent developments could be of use? What are the limitations, or which workarounds could be applied, in order to generate relevant results? How can relevant information be prepared in a target-oriented way and communicated effectively? How can the more recently developed, freely available virtual globes be used for the delivery of conditioned information under consideration of the third dimension as an additional, explicit carrier of information? Based on case studies comprising different data sets and fields of application, it is demonstrated how methods to extract and process information, as well as to effectively communicate results, can be improved and successfully combined within one workflow. It is shown that (1

  12. Classification Method in Integrated Information Network Using Vector Image Comparison

    Directory of Open Access Journals (Sweden)

    Zhou Yuan

    2014-05-01

    Full Text Available A Wireless Integrated Information Network (WMN) consists of integrated information nodes that can acquire data, such as images and voice, from their surroundings. Transmitting this information requires large resources, which decreases the service time of the network. In this paper we present a Classification Approach based on Vector Image Comparison (VIC) for WMN that improves the service time of the network. Methods for sub-region selection and conversion are also proposed.

  13. Information Security Management - Part Of The Integrated Management System

    Science.gov (United States)

    Manea, Constantin Adrian

    2015-07-01

    The international management standards allow an integrated approach, combining aspects of particular importance to the activity of any organization, from quality management systems or environmental management systems to information security systems or business continuity management systems. Although there is no national or international regulation, nor a defined standard, for the Integrated Management System, the need to implement an integrated system occurs within the organization, which sees the opportunity to integrate the management components into a cohesive system, in agreement with its publicly stated purpose and mission. The issues relating to information security in the organization, viewed from the perspective of the management system, raise serious questions for any organization in the current context of electronic information, which is why we consider it not only appropriate but necessary to promote and implement an Integrated Management System covering Quality - Environment - Health and Operational Security - Information Security.

  14. Geography and Geographical Information Science: Interdisciplinary Integrators

    Science.gov (United States)

    Ellul, Claire

    2015-01-01

    To understand how Geography and Geographical Information Science (GIS) can contribute to Interdisciplinary Research (IDR), it is relevant to articulate the differences between the different types of such research. "Multidisciplinary" researchers work in a "parallel play" mode, completing work in their disciplinary work streams…

  15. A Kansas Integrated Commercialization Information Network (KICIN).

    Science.gov (United States)

    Ambler, C.; And Others

    A consortium of Kansas economic development service providers is building a web of virtual satellite offices that will demonstrate the delivery of economic development services in all areas of Kansas. These "offices" will use the Internet and a novel information delivery system to reach small and medium-sized businesses and individuals…

  16. Teacher Readiness to Integrate Information Technology into ...

    African Journals Online (AJOL)

    ... of simple percentage and frequency calculation. The results revealed that the majority of the teachers have a low level of knowledge about IT. In the same vein, the majority of teachers in the schools in this study did not have adequate IT skills. However, the teachers have a positive attitude toward the use of information technology.

  17. Integrated information theory of consciousness: an updated account.

    Science.gov (United States)

    Tononi, G

    2012-12-01

    This article presents an updated account of the integrated information theory of consciousness (IIT) and some of its implications. IIT stems from thought experiments that lead to phenomenological axioms (existence, compositionality, information, integration, exclusion) and corresponding ontological postulates. The information axiom asserts that every experience is specific - it is what it is by differing in its particular way from a large repertoire of alternatives. The integration axiom asserts that each experience is unified - it cannot be reduced to independent components. The exclusion axiom asserts that every experience is definite - it is limited to particular things and not others and flows at a particular speed and resolution. IIT formalizes these intuitions with postulates. The information postulate states that only "differences that make a difference" from the intrinsic perspective of a system matter: a mechanism generates cause-effect information if its present state has selective past causes and selective future effects within a system. The integration postulate states that only information that is irreducible matters: mechanisms generate integrated information only to the extent that the information they generate cannot be partitioned into that generated within independent components. The exclusion postulate states that only maxima of integrated information matter: a mechanism specifies only one maximally irreducible set of past causes and future effects - a concept. A complex is a set of elements specifying a maximally irreducible constellation of concepts, where the maximum is evaluated over elements and at the optimal spatiotemporal scale. Its concepts specify a maximally integrated conceptual information structure or quale, which is identical with an experience. Finally, changes in information integration upon exposure to the environment reflect a system's ability to match the causal structure of the world. After introducing an updated definition of

  18. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    Science.gov (United States)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to a lot of the confusion between the two). But the skills needed for both, co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise, although similar, serve different purposes. Data Analytics takes a practitioner's approach to applying expertise and skills to solve issues and gain subject knowledge. Data Science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that cover the science domain and data preparation, reduction, and analysis techniques, from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. Thus, from a to

  19. Testing the reliability of information extracted from ancient zircon

    Science.gov (United States)

    Kielman, Ross; Whitehouse, Martin; Nemchin, Alexander

    2015-04-01

    Studies combining zircon U-Pb chronology, trace element distribution as well as O and Hf isotope systematics are a powerful way to gain understanding of the processes shaping Earth's evolution, especially in detrital populations where constraints from the original host are missing. Such studies of the Hadean detrital zircon population abundant in sedimentary rocks in Western Australia have involved analysis of an unusually large number of individual grains, but have also highlighted potential problems with the approach, only apparent when multiple analyses are obtained from individual grains. A common feature of the Hadean as well as many early Archaean zircon populations is their apparent inhomogeneity, which reduces confidence in conclusions based on studies combining chemistry and isotopic characteristics of zircon. In order to test the reliability of information extracted from early Earth zircon, we report results from one of the first in-depth multi-method studies of zircon from a relatively simple early Archaean magmatic rock, used as an analogue to ancient detrital zircon. The approach involves making multiple SIMS analyses in individual grains in order to be comparable to the most advanced studies of detrital zircon populations. The investigated sample is a relatively undeformed, non-migmatitic ca. 3.8 Ga tonalite collected a few km south of the Isua Greenstone Belt, southwest Greenland. Extracted zircon grains can be combined into three different groups based on the behavior of their U-Pb systems: (i) grains that show internally consistent and concordant ages and define an average age of 3805±15 Ma, taken to be the age of the rock, (ii) grains that are distributed close to the concordia line, but with significant variability between multiple analyses, suggesting an ancient Pb loss and (iii) grains that have multiple analyses distributed along a discordia pointing towards a zero intercept, indicating geologically recent Pb-loss. This overall behavior has

  20. The integrated approach methodology for operator information evaluation

    International Nuclear Information System (INIS)

    Stroube, K.; Modarres, M.; Roush, M.; Hunt, N.; Pearce, R.

    1986-01-01

    The Integrated Approach has developed a complete method for evaluating the relative importance of operator information improvements. By use of decision trees, the impact of information on the success probability of a function or system can be evaluated. This approach couples goal trees and human success likelihoods to estimate the anticipated consequences of a given information system

  1. Ontology Based Resolution of Semantic Conflicts in Information Integration

    Institute of Scientific and Technical Information of China (English)

    LU Han; LI Qing-zhong

    2004-01-01

    Semantic conflict is the conflict caused by using different ways in heterogeneous systems to express the same entity in reality. This prevents information integration from accomplishing semantic coherence. Since ontology helps to solve semantic problems, this area has become a hot topic in information integration. In this paper, we introduce semantic conflict into the information integration of heterogeneous applications. We discuss the origins and categories of the conflict, and present an ontology-based schema mapping approach to eliminate semantic conflicts.

  2. Integrated Information Technology Policy Analysis Research, CSUSB

    Science.gov (United States)

    2010-10-01

    ...science fields in order to combine efforts to better understand multiple network systems, including technical, biological and social networks... The Flowing Valued Information (FVI) project has been discussed at the Network Science Workshops linked from the Center website, and the FVI reports and

  3. CALmsu contactor for solvent extraction with integrated flowrate meters

    International Nuclear Information System (INIS)

    Siddiqui, I.A.; Shah, B.V.; Theyyunni, T.K.

    1994-01-01

    Mixer-settlers are widely used as contactors in solvent extraction processes. In the nuclear industry, solvent extraction techniques are used for the separation and purification of a range of materials. A major difficulty in the nuclear industry arises from the constraints placed on equipment design and operation by the presence of radioactive materials in process solutions. The development of the CALmsu contactor was necessitated by the requirements of the operating environment in radiochemical plants. This contactor is a mixer-settler designed to use a CALMIX (combined air lifting and mixing device) static mixer. The CALMIX comprises two air lifts which raise the liquid phases to a highly turbulent mixing zone situated above the lifts. Its principle and construction are simple, and it is compact in size. It is a passive device and needs no maintenance. It has proved to be efficient during extensive testing. The simple and efficient CALmsu contactor internals are specially engineered for use with the CALMIX mixer. It has been extensively tested in a pilot plant for the extraction and stripping of uranium, the recovery of uranium from thorium by the THOREX process, and the treatment of degraded solvents. A model for the design of CALmsu contactors has been evolved, and based on this model, software for the engineering design of CALMIX and CALmsu contactors with throughputs between 50 and 3000 lph has been developed. (author)

  4. CALmsu contactor for solvent extraction with integrated flowrate meters

    Energy Technology Data Exchange (ETDEWEB)

    Siddiqui, I A; Shah, B V; Theyyunni, T K [Process Engineering and Systems Development Division, Bhabha Atomic Research Centre, Mumbai (India)

    1994-06-01

    Mixer-settlers are widely used as contactors in solvent extraction processes. In the nuclear industry, solvent extraction techniques are used for the separation and purification of a range of materials. A major difficulty in the nuclear industry arises from the constraints placed on equipment design and operation by the presence of radioactive materials in process solutions. The development of the CALmsu contactor was necessitated by the requirements of the operating environment in radiochemical plants. This contactor is a mixer-settler designed to use a CALMIX (combined air lifting and mixing device) static mixer. The CALMIX comprises two air lifts which raise the liquid phases to a highly turbulent mixing zone situated above the lifts. Its principle and construction are simple, and it is compact in size. It is a passive device and needs no maintenance. It has proved to be efficient during extensive testing. The simple and efficient CALmsu contactor internals are specially engineered for use with the CALMIX mixer. It has been extensively tested in a pilot plant for the extraction and stripping of uranium, the recovery of uranium from thorium by the THOREX process, and the treatment of degraded solvents. A model for the design of CALmsu contactors has been evolved, and based on this model, software for the engineering design of CALMIX and CALmsu contactors with throughputs between 50 and 3000 lph has been developed. (author). 8 refs., 1 fig.

  5. Geographic information processing in the Integrated Measuring and Information System (IMIS). An overview; Geographische Informationsverarbeitung im integrierten Mess- und Informationssystem (IMIS). Ein Ueberblick

    Energy Technology Data Exchange (ETDEWEB)

    Burbeck, S. [Bundesamt fuer Strahlenschutz (BfS), Freiburg (Germany)

    2014-01-20

    Like most public administrations, the Federal Office for Radiation Protection faces various tasks and requirements in which geographic information plays an important role. All the more this is true for the Department of Emergency Protection, with its Integrated Measuring and Information System (IMIS) and its task of providing information to the public. A crucial part of geographic information extraction and provision is cartographic representation. At BfS, the different requirements are to be met by a common software architecture based on web services.

  6. QUANTITATIVE CHARACTERISTICS OF COMPLEMENTARY INTEGRATED HEALTH CARE SYSTEM AND INTEGRATED MEDICATION MANAGEMENT INFORMATION SYSTEM

    Directory of Open Access Journals (Sweden)

    L. Yu. Babintseva

    2015-05-01

    ...important elements of state regulation of the pharmaceutical sector of health care. For the first time, the creation of two information systems - an integrated medication management information system and an integrated health care system - in an integrated medical information area, operating based on the principle of complementarity, was justified. Global and technological coefficients of these systems' functioning were introduced.

  7. Using integrated information systems in supply chain management

    Science.gov (United States)

    Gonzálvez-Gallego, Nicolás; Molina-Castillo, Francisco-Jose; Soto-Acosta, Pedro; Varajao, Joao; Trigo, Antonio

    2015-02-01

    The aim of this paper is to empirically test not only the direct effects of information and communication technology (ICT) capabilities and integrated information systems (IS) on firm performance, but also the moderating role of IS integration along the supply chain in the relationship between external ICT capabilities and business performance. Data collected from 102 large Iberian firms from Spain and Portugal are used to test the research model. Hierarchical multiple regression analysis is employed to test the direct effects and the moderating relationships proposed. Results show that external and internal ICT capabilities are important drivers of firm performance, while merely having integrated IS does not lead to better firm performance. In addition, a moderating effect of IS integration in the relationship between ICT capabilities and business performance is found, although this integration only contributes to firm performance when it is directed at connecting with suppliers or customers rather than at integrating the whole supply chain.

  8. RRB / SSI Interface Checkwriting Integrated Computer Operation Extract File (CHICO)

    Data.gov (United States)

    Social Security Administration — This monthly file provides SSA with information about benefit payments made to railroad retirement beneficiaries. SSA uses this data to verify Supplemental Security...

  9. Information Systems Integration and Enterprise Application Integration (EAI) Adoption: A Case from Financial Services

    Science.gov (United States)

    Lam, Wing

    2007-01-01

    Increasingly, organizations find that they need to integrate a large number of information systems in order to support enterprise-wide business initiatives such as e-business, supply chain management and customer relationship management. To date, organizations have largely tended to address information systems (IS) integration in an ad-hoc manner.…

  10. Extraction of CT dose information from DICOM metadata: automated Matlab-based approach.

    Science.gov (United States)

    Dave, Jaydev K; Gingold, Eric L

    2013-01-01

    The purpose of this study was to extract exposure parameters and dose-relevant indexes of CT examinations from information embedded in DICOM metadata. DICOM dose report files were identified and retrieved from a PACS. An automated software program was used to extract, from these files, the information in the structured elements of the DICOM metadata relevant to exposure. Extracting information from DICOM metadata eliminated potential errors inherent in techniques based on optical character recognition, yielding 100% accuracy.
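
    The record does not include the authors' Matlab code, but the general idea can be illustrated with a short Python sketch using the pydicom library: recursively walk the structured-report content sequence of a dose report and collect numeric measurements. The file name and the code meanings filtered at the end are assumptions for illustration, not taken from the study.

```python
# A minimal sketch (not the authors' Matlab program) of reading exposure-related
# values from a DICOM Radiation Dose Structured Report with pydicom.
import pydicom

def walk_content(items, found):
    """Recursively walk an SR content sequence, collecting numeric measurements."""
    for item in items:
        name = ""
        if "ConceptNameCodeSequence" in item:
            name = item.ConceptNameCodeSequence[0].CodeMeaning
        if item.get("ValueType") == "NUM" and "MeasuredValueSequence" in item:
            mv = item.MeasuredValueSequence[0]
            found.append((name, float(mv.NumericValue)))
        if "ContentSequence" in item:   # nested containers
            walk_content(item.ContentSequence, found)
    return found

ds = pydicom.dcmread("dose_report.dcm")          # hypothetical file name
measurements = walk_content(ds.ContentSequence, [])
for name, value in measurements:
    if name in ("Mean CTDIvol", "DLP"):          # assumed dose-relevant indexes
        print(f"{name}: {value}")
```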

  11. Integrating knowledge based functionality in commercial hospital information systems.

    Science.gov (United States)

    Müller, M L; Ganslandt, T; Eich, H P; Lang, K; Ohmann, C; Prokosch, H U

    2000-01-01

    Successful integration of knowledge-based functions in the electronic patient record depends on direct and context-sensitive accessibility and availability to clinicians and must suit their workflow. In this paper we describe an exemplary integration of an existing standalone scoring system for acute abdominal pain into two different commercial hospital information systems using Java/CORBA technology.

  12. Integrating SAP to Information Systems Curriculum: Design and Delivery

    Science.gov (United States)

    Wang, Ming

    2011-01-01

    Information Systems (IS) education is being transformed from segmented applications toward integrated, enterprise-wide system software: Enterprise Resource Planning (ERP). ERP is a platform that integrates all business functions, with a centralized data repository shared by all business operations in the enterprise. This tremendous…

  13. Integrating an Information Literacy Quiz into the Learning Management System

    Science.gov (United States)

    Lowe, M. Sara; Booth, Char; Tagge, Natalie; Stone, Sean

    2014-01-01

    The Claremont Colleges Library Instruction Services Department developed a quiz that could be integrated into the consortial learning management software to accompany a local online, open-source information literacy tutorial. The quiz is integrated into individual course pages, allowing students to receive a grade for completion and improving…

  14. A Framework for Understanding Post-Merger Information Systems Integration

    DEFF Research Database (Denmark)

    Alaranta, Maria; Kautz, Karlheinz

    2012-01-01

    This paper develops a theoretical framework for the integration of information systems (IS) after a merger or an acquisition. The framework integrates three perspectives: a structuralist, an individualist, and an interactive process perspective to analyze and understand such integrations. … The framework is applied to a longitudinal case study of a manufacturing company that grew through an acquisition. The management decided to integrate the production control IS via tailoring a new system that blends together features of existing IS. The application of the framework in the case study confirms … several known impediments to IS integrations. It also identifies a number of new inhibitors, as well as known and new facilitators that can bring post-merger IS integration to a success. Our findings provide relevant insights to researching and managing post-merger IS integrations. They emphasize…

  15. Medicaid Analytic eXtract (MAX) General Information

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Analytic eXtract (MAX) data is a set of person-level data files on Medicaid eligibility, service utilization, and payments. The MAX data are created to...

  16. Standards to support information systems integration in anatomic pathology.

    Science.gov (United States)

    Daniel, Christel; García Rojo, Marcial; Bourquard, Karima; Henin, Dominique; Schrader, Thomas; Della Mea, Vincenzo; Gilbertson, John; Beckwith, Bruce A

    2009-11-01

    Integrating anatomic pathology information (text and images) into electronic health care records is a key challenge for enhancing clinical information exchange between anatomic pathologists and clinicians. The aim of the Integrating the Healthcare Enterprise (IHE) international initiative is precisely to ensure the interoperability of clinical information systems by using existing widespread industry standards such as Digital Imaging and Communication in Medicine (DICOM) and Health Level Seven (HL7). The objective was to define standards-based informatics transactions to integrate anatomic pathology information into the Healthcare Enterprise. We used the methodology of the IHE initiative. Working groups from IHE, HL7, and DICOM, with special interest in anatomic pathology, defined consensual technical solutions to provide end-users with improved access to consistent information across multiple information systems. The IHE anatomic pathology technical framework describes a first integration profile, "Anatomic Pathology Workflow," dedicated to the diagnostic process, including basic image acquisition and reporting solutions. This integration profile relies on 10 transactions based on HL7 or DICOM standards. A common specimen model was defined to consistently identify and describe specimens in both HL7 and DICOM transactions. The IHE anatomic pathology working group has defined standards-based informatics transactions to support the basic diagnostic workflow in anatomic pathology laboratories. In further stages, the technical framework will be completed to manage whole-slide images and semantically rich structured reports in the diagnostic workflow, and to integrate systems used for patient care with those used for research activities (such as tissue bank databases or tissue microarrayers).

  17. EFFICIENCY INDICATORS INFORMATION MANAGEMENT IN INTEGRATED SECURITY SYSTEMS

    Directory of Open Access Journals (Sweden)

    N. S. Rodionova

    2014-01-01

    Full Text Available Summary. The introduction of information technology to improve the efficiency of security activities makes it necessary to consider a number of negative factors that arise as a consequence of using these technologies as a key element of modern security systems. One of the most notable factors is the exposure of the information processes in protection systems to security threats. This largely relates to integrated security systems (ISS), the class of protection systems with the highest level of informatization of security functions. The significant damage that protected objects could potentially incur as a result of abnormal ISS operation makes it highly relevant to assess the factors that reduce ISS efficiency and to justify ways and methods of improving it. Given the nature of threats and the blocking and distortion of information in an ISS, the parameters of interest are: the volume of undistorted information in the working environment, as a characteristic of data integrity; and the time of access to information, as a characteristic of its availability. This in turn leads to the use of these parameters as performance characteristics of information processes in the ISS - the completeness and timeliness of information processing. The article proposes performance indicators of information processes in integrated security systems in terms of optimal control procedures to protect information from unauthorized access. The considered set of parameters allows a comprehensive security analysis of integrated security systems and supports recommendations to improve the management of information security procedures in them.

  18. The Integrated Information System for Natural Disaster Mitigation

    Directory of Open Access Journals (Sweden)

    Junxiu Wu

    2007-08-01

    Full Text Available Supported by the World Bank, the Integrated Information System for Natural Disaster Mitigation (ISNDM), including the operational service system and network telecommunication system, has been in development for three years in the Center of Disaster Reduction, Chinese Academy of Sciences, based on the platform of the GIS software Arcview. It has five main modules: disaster background information, socio-economic information, disaster-induced factors database, disaster scenarios database, and disaster assessment. ISNDM has several significant functions, which include information collection, information processing, data storage, and information distribution. It is a simple but comprehensive demonstration system for our national center for natural disaster reduction.

  19. Developing an Approach to Prioritize River Restoration using Data Extracted from Flood Risk Information System Databases.

    Science.gov (United States)

    Vimal, S.; Tarboton, D. G.; Band, L. E.; Duncan, J. M.; Lovette, J. P.; Corzo, G.; Miles, B.

    2015-12-01

    Prioritizing river restoration requires information on river geometry. In many states in the US, detailed river geometry has been collected for floodplain mapping and is available in Flood Risk Information Systems (FRIS). In particular, North Carolina has, for its 100 counties, developed a database of numerous HEC-RAS models which are available through its Flood Risk Information System (FRIS). These models, which include over 260 variables, were developed and updated by numerous contractors. They contain detailed surveyed or LiDAR-derived cross-sections and modeled flood extents for different extreme event return periods. In this work, data from over 4700 HEC-RAS models were integrated and upscaled to utilize detailed cross-section information and 100-year modelled flood extent information, enabling river restoration prioritization for the entire state of North Carolina. We developed procedures to extract geomorphic properties such as entrenchment ratio, incision ratio, etc. from these models. Entrenchment ratio quantifies the vertical containment of rivers, and thereby their vulnerability to flooding; incision ratio quantifies the depth per unit width. A map of entrenchment ratio for the whole state was derived by linking these model results to a geodatabase. A ranking of highly entrenched counties, enabling prioritization for flood allowance and mitigation, was obtained. The results were shared through HydroShare, and web maps were developed for their visualization using the Google Maps Engine API.
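
    The abstract does not give the exact formulas, but the two indexes can be sketched in a few lines of Python. The Rosgen-style definition of entrenchment ratio (flood-prone width over bankfull width) is assumed here, and the cross-section values are hypothetical stand-ins for attributes extracted from HEC-RAS models.

```python
# A minimal sketch of the geomorphic indexes described above, under assumed
# definitions; not the study's exact extraction procedure.

def entrenchment_ratio(flood_prone_width, bankfull_width):
    """Vertical containment: lower values indicate a more entrenched channel."""
    return flood_prone_width / bankfull_width

def incision_ratio(bankfull_depth, bankfull_width):
    """Depth per unit width, as described in the abstract."""
    return bankfull_depth / bankfull_width

# Hypothetical cross-sections: (station id, flood-prone width, bankfull width,
# bankfull depth), all in metres.
sections = [("XS-1", 120.0, 30.0, 2.5), ("XS-2", 45.0, 28.0, 4.0)]
for xs_id, w_flood, w_bank, d_bank in sections:
    print(xs_id,
          round(entrenchment_ratio(w_flood, w_bank), 2),   # XS-2 is more entrenched
          round(incision_ratio(d_bank, w_bank), 3))
```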

  20. The impulse cutoff an entropy functional measure on trajectories of Markov diffusion process integrating in information path functional

    OpenAIRE

    Lerner, Vladimir S.

    2012-01-01

    The impulses cutting the entropy functional (EF) measure on the trajectories of a Markov diffusion process integrate an information path functional (IPF) composed of the discrete information Bits extracted from the observed random process. Each cut brings memory of the cut entropy, which provides both a reduction of the process entropy and a discrete unit of the cut entropy, a Bit. Consequently, information is memorized entropy, cut from random observations of interacting processes. The origin of information ...

  1. Broad knowledge of information technologies: a prerequisite for the effective management of the integrated information system

    Energy Technology Data Exchange (ETDEWEB)

    Landau, H.B.

    1980-09-01

    There is a trend towards the bringing together of various information technologies into integrated information systems. The managers of these total systems therefore must be familiar with each of the component technologies and how they may be combined into a total information system. To accomplish this, the effective manager should first define the overall system as an integrated flow of information with each step identified; then, the alternate technologies applicable to each step may be selected. Methods of becoming technologically aware are suggested and examples of integrated systems are discussed.

  2. Development of Integrated Information System for Travel Bureau Company

    Science.gov (United States)

    Karma, I. G. M.; Susanti, J.

    2018-01-01

    Regarding the effectiveness of decision-making by the management of a travel bureau company, and especially by managers, information frequently arrives delayed or incomplete. Although already computer-assisted, each existing application handles only one particular activity and is not integrated with the others. This research is intended to produce an integrated information system that handles the overall operational activities of the company. Applying the object-oriented system development approach, the system is built with the Visual Basic .NET programming language and the MySQL database package. The result is a system that consists of 4 (four) separate program packages: the Reservation System, AR System, AP System and Accounting System. Based on the output, we can conclude that this system is able to produce integrated information related to reservations, operations and finance, providing up-to-date information to support operational activities and the decision-making process by related parties.

  3. Development of an integrated medical supply information system

    Science.gov (United States)

    Xu, Eric; Wermus, Marek; Blythe Bauman, Deborah

    2011-08-01

    The integrated medical supply inventory control system introduced in this study is a hybrid system that is shaped by the nature of medical supply, usage and storage capacity limitations of health care facilities. The system links demand, service provided at the clinic, health care service provider's information, inventory storage data and decision support tools into an integrated information system. ABC analysis method, economic order quantity model, two-bin method and safety stock concept are applied as decision support models to tackle inventory management issues at health care facilities. In the decision support module, each medical item and storage location has been scrutinised to determine the best-fit inventory control policy. The pilot case study demonstrates that the integrated medical supply information system holds several advantages for inventory managers, since it entails benefits of deploying enterprise information systems to manage medical supply and better patient services.
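
    Two of the decision-support models named in the abstract lend themselves to a compact illustration. The sketch below shows the classic economic order quantity and a normal-demand safety stock; the parameter values are illustrative, not from the study, and the ABC and two-bin methods are omitted.

```python
# A minimal sketch of the EOQ model and safety stock concept; assumed textbook
# forms, with illustrative parameters.
import math

def eoq(annual_demand, order_cost, holding_cost):
    """EOQ = sqrt(2*D*S/H): the order size minimizing ordering + holding cost."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

def safety_stock(z, demand_std, lead_time):
    """Buffer stock for normally distributed demand over the lead time."""
    return z * demand_std * math.sqrt(lead_time)

print(round(eoq(annual_demand=1200, order_cost=50.0, holding_cost=2.0)))  # units/order
print(round(safety_stock(z=1.65, demand_std=8.0, lead_time=4), 1))        # ~95% service
```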

  4. Information Extraction with Character-level Neural Networks and Free Noisy Supervision

    OpenAIRE

    Meerkamp, Philipp; Zhou, Zhengyi

    2016-01-01

    We present an architecture for information extraction from text that augments an existing parser with a character-level neural network. The network is trained using a measure of consistency of extracted data with existing databases as a form of noisy supervision. Our architecture combines the ability of constraint-based information extraction systems to easily incorporate domain knowledge and constraints with the ability of deep neural networks to leverage large amounts of data to learn compl...

  5. Semantics-based information extraction for detecting economic events

    NARCIS (Netherlands)

    A.C. Hogenboom (Alexander); F. Frasincar (Flavius); K. Schouten (Kim); O. van der Meer

    2013-01-01

    textabstractAs today's financial markets are sensitive to breaking news on economic events, accurate and timely automatic identification of events in news items is crucial. Unstructured news items originating from many heterogeneous sources have to be mined in order to extract knowledge useful for

  6. Tagline: Information Extraction for Semi-Structured Text Elements in Medical Progress Notes

    Science.gov (United States)

    Finch, Dezon Kile

    2012-01-01

    Text analysis has become an important research activity in the Department of Veterans Affairs (VA). Statistical text mining and natural language processing have been shown to be very effective for extracting useful information from medical documents. However, neither of these techniques is effective at extracting the information stored in…

  7. An Effective Approach to Biomedical Information Extraction with Limited Training Data

    Science.gov (United States)

    Jonnalagadda, Siddhartha

    2011-01-01

    In the current millennium, extensive use of computers and the internet caused an exponential increase in information. Few research areas are as important as information extraction, which primarily involves extracting concepts and the relations between them from free text. Limitations in the size of training data, lack of lexicons and lack of…

  8. Metaproteomics: extracting and mining proteome information to characterize metabolic activities in microbial communities.

    Science.gov (United States)

    Abraham, Paul E; Giannone, Richard J; Xiong, Weili; Hettich, Robert L

    2014-06-17

    Contemporary microbial ecology studies usually employ one or more "omics" approaches to investigate the structure and function of microbial communities. Among these, metaproteomics aims to characterize the metabolic activities of the microbial membership, providing a direct link between the genetic potential and functional metabolism. The successful deployment of metaproteomics research depends on the integration of high-quality experimental and bioinformatic techniques for uncovering the metabolic activities of a microbial community in a way that is complementary to other "meta-omic" approaches. The essential, quality-defining informatics steps in metaproteomics investigations are: (1) construction of the metagenome, (2) functional annotation of predicted protein-coding genes, (3) protein database searching, (4) protein inference, and (5) extraction of metabolic information. In this article, we provide an overview of current bioinformatic approaches and software implementations in metaproteome studies in order to highlight the key considerations needed for successful implementation of this powerful community-biology tool. Copyright © 2014 John Wiley & Sons, Inc.

  9. A Quality-Driven Methodology for Information Systems Integration

    Directory of Open Access Journals (Sweden)

    Iyad Zikra

    2017-10-01

    Full Text Available Information systems integration is an essential instrument for organizations to gain advantage in today's growing and fast-changing business and technology landscapes. Integration solutions generate added value by combining the functionality and services of heterogeneous and diverse systems. Existing integration environments tend to rely heavily on technical, platform-dependent skills. Consequently, the solutions that they enable are not optimally aligned with the envisioned business goals of the organization. Furthermore, the gap between the goals and the solutions complicates the task of evaluating the quality of integration solutions. To address these challenges, we propose a quality-driven, model-driven methodology for designing and developing integration solutions. The methodology spans organizational and systems design details, providing a holistic view of the integration solution and its underlying business goals. A multi-view meta-model provides the basis for the integration design. Quality factors that affect various aspects of the integration solution guide and inform the progress of the methodology. An example business case is presented to demonstrate the application of the methodology.

  10. A rapid extraction of landslide disaster information research based on GF-1 image

    Science.gov (United States)

    Wang, Sai; Xu, Suning; Peng, Ling; Wang, Zhiyi; Wang, Na

    2015-08-01

    In recent years, landslide disasters have occurred frequently because of seismic activity. They bring great harm to people's lives and have attracted high attention from the state and extensive concern from society. In the field of geological disasters, landslide information extraction based on remote sensing has been controversial, but high resolution remote sensing imagery can effectively improve the accuracy of information extraction with its rich texture and geometry information. Therefore, it is feasible to extract information on earthquake-triggered landslides with serious surface damage and large scale. Taking Wenchuan county as the study area, this paper uses a multi-scale segmentation method to extract landslide image objects from domestic GF-1 images and DEM data, using the estimation of scale parameter tool to determine the optimal segmentation scale. After comprehensively analyzing the characteristics of landslides in high-resolution imagery and selecting the spectral, texture, geometric and landform features of the image, extraction rules can be established to extract landslide disaster information. The extraction results show 20 landslides with a total area of 521279.31. Compared with visual interpretation results, the extraction accuracy is 72.22%. This study indicates that it is efficient and feasible to extract earthquake landslide disaster information based on high resolution remote sensing, and it provides important technical support for post-disaster emergency investigation and disaster assessment.
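
    The rule-based step can be illustrated with a toy per-pixel version: combine a spectral criterion (bare, disturbed surfaces have low NDVI) with terrain criteria from the DEM. The band arrays and thresholds below are hypothetical, and the study itself worked on segmented image objects rather than raw pixels.

```python
# A toy sketch of rule-based landslide-candidate masking; thresholds and data
# are assumptions, standing in for GF-1 bands and a DEM-derived slope grid.
import numpy as np

def landslide_mask(red, nir, slope_deg, ndvi_max=0.2, slope_min=15.0, slope_max=50.0):
    """Flag low-vegetation pixels on moderately steep slopes as candidates."""
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
    return (ndvi < ndvi_max) & (slope_deg > slope_min) & (slope_deg < slope_max)

red   = np.array([[0.30, 0.10, 0.12], [0.28, 0.09, 0.11], [0.31, 0.30, 0.10]])
nir   = np.array([[0.32, 0.40, 0.45], [0.33, 0.41, 0.44], [0.35, 0.33, 0.42]])
slope = np.full((3, 3), 25.0)
print(landslide_mask(red, nir, slope).astype(int))  # 1 = candidate pixel
```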

  11. Advanced field-solver techniques for RC extraction of integrated circuits

    CERN Document Server

    Yu, Wenjian

    2014-01-01

    Resistance and capacitance (RC) extraction is an essential step in modeling the interconnection wires and substrate coupling effect in nanometer-technology integrated circuits (IC). The field-solver techniques for RC extraction guarantee the accuracy of modeling, and are becoming increasingly important in meeting the demand for accurate modeling and simulation of VLSI designs. Advanced Field-Solver Techniques for RC Extraction of Integrated Circuits presents a systematic introduction to, and treatment of, the key field-solver methods for RC extraction of VLSI interconnects and substrate coupling in mixed-signal ICs. Various field-solver techniques are explained in detail, with real-world examples to illustrate the advantages and disadvantages of each algorithm. This book will benefit graduate students and researchers in the field of electrical and computer engineering, as well as engineers working in the IC design and design automation industries. Dr. Wenjian Yu is an Associate Professor at the Department of ...

  12. Towards an information extraction and knowledge formation framework based on Shannon entropy

    Directory of Open Access Journals (Sweden)

    Iliescu Dragoș

    2017-01-01

    Full Text Available The subject of information quantity is approached in this paper, considering the specific domain of nonconforming product management as the information source. This work represents a case study. Raw data were gathered from a heavy industrial works company, and information extraction and knowledge formation are considered herein. The method used for estimating information quantity is based on the Shannon entropy formula. The information and entropy spectra are decomposed and analysed for the extraction of specific information and the formation of knowledge. The results of the entropy analysis point out the information that needs to be acquired by the organisation involved, this being presented as a specific knowledge type.
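
    The Shannon estimate itself is a one-liner. Below is a minimal Python sketch applying H = -sum(p_i * log2 p_i) to hypothetical counts of nonconformity categories; the category labels are invented for illustration.

```python
# A minimal sketch of the Shannon entropy estimate over an empirical category
# distribution; the nonconformity categories are assumptions.
import math
from collections import Counter

def shannon_entropy(counts):
    """H = -sum(p_i * log2(p_i)) over the empirical distribution."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values() if c > 0)

records = ["dimensional", "surface", "dimensional", "material", "surface", "dimensional"]
print(f"H = {shannon_entropy(Counter(records)):.3f} bits")  # ~1.459 bits here
```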

  13. Moral Judgment as Information Processing: An Integrative Review

    Directory of Open Access Journals (Sweden)

    Steve eGuglielmo

    2015-10-01

    Full Text Available This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two fundamental questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework, critically evaluates them on empirical and theoretical grounds, outlines a general integrative model grounded in information processing, and offers conceptual and methodological suggestions for future research. The information processing perspective provides a useful theoretical framework for organizing extant and future work in the rapidly growing field of moral judgment.

  14. Using remote sensing to inform integrated coastal zone management

    CSIR Research Space (South Africa)

    Roberts, W

    2010-06-01

    Full Text Available Presentation: Using Remote Sensing to Inform Integrated Coastal Zone Management, GISSA Western Cape Regional Meeting, Wesley Roberts & Melanie Luck-Vogel, 2 June 2010, CSIR NRE Ecosystems Earth Observation Group. The slides cover what integrated coastal zone management is and a quick theory of change vector analysis (CVA), in which the change between two image dates in a two-band space has magnitude M = sqrt((x_b - x_a)^2 + (y_b - y_a)^2) and a direction whose quadrant separates change types (quadrant 1 (++) accretion; quadrant 3 (--) erosion), followed by CVA results and conclusions for change in an image time series...
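
    The CVA computation reconstructed from the slides is straightforward to express per pixel. The numpy sketch below is a toy version under the two-band formulation above; the arrays are hypothetical stand-ins for real imagery.

```python
# A toy sketch of change vector analysis between two image dates in a
# two-band feature space; data are assumptions.
import numpy as np

def cva(band1_a, band2_a, band1_b, band2_b):
    d1, d2 = band1_b - band1_a, band2_b - band2_a
    magnitude = np.sqrt(d1**2 + d2**2)
    direction = np.degrees(np.arctan2(d2, d1))  # quadrant separates change types
    return magnitude, direction

b1_a = np.array([[0.2, 0.4]]); b2_a = np.array([[0.3, 0.1]])
b1_b = np.array([[0.5, 0.3]]); b2_b = np.array([[0.6, 0.0]])
mag, ang = cva(b1_a, b2_a, b1_b, b2_b)
print(mag.round(2), ang.round(1))
```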

  15. DESIGN OF INFORMATION MANAGEMENT SYSTEM OF VERTICALLY INTEGRATED AGRICULTURAL HOLDINGS

    Directory of Open Access Journals (Sweden)

    Александр Витальевич ШМАТКО

    2015-05-01

    Full Text Available The paper deals with an approach to the design and development of information systems for managing and optimizing the organizational structure of vertically integrated agricultural holdings. A review of the problems of building and improving the organizational structure of a vertically integrated agricultural holding is given. A method of constructing a discrete model of the management structure of an agricultural holding, which minimizes the costs associated with attracting applicants to work, is proposed.

  16. Information-integration category learning and the human uncertainty response.

    Science.gov (United States)

    Paul, Erick J; Boomer, Joseph; Smith, J David; Ashby, F Gregory

    2011-04-01

    The human response to uncertainty has been well studied in tasks requiring attention and declarative memory systems. However, uncertainty monitoring and control have not been studied in multi-dimensional, information-integration categorization tasks that rely on non-declarative procedural memory. Three experiments are described that investigated the human uncertainty response in such tasks. Experiment 1 showed that following standard categorization training, uncertainty responding was similar in information-integration tasks and rule-based tasks requiring declarative memory. In Experiment 2, however, uncertainty responding in untrained information-integration tasks impaired the ability of many participants to master those tasks. Finally, Experiment 3 showed that the deficit observed in Experiment 2 was not because of the uncertainty response option per se, but rather because the uncertainty response provided participants a mechanism via which to eliminate stimuli that were inconsistent with a simple declarative response strategy. These results are considered in the light of recent models of category learning and metacognition.

  17. Integrative analysis of gene expression and DNA methylation using unsupervised feature extraction for detecting candidate cancer biomarkers.

    Science.gov (United States)

    Moon, Myungjin; Nakai, Kenta

    2018-04-01

    Currently, cancer biomarker discovery is one of the important research topics worldwide. In particular, detecting significant genes related to cancer is an important task for early diagnosis and treatment of cancer. Conventional studies mostly focus on genes that are differentially expressed in different states of cancer; however, noise in gene expression datasets and insufficient information in limited datasets impede precise analysis of novel candidate biomarkers. In this study, we propose an integrative analysis of gene expression and DNA methylation using normalization and unsupervised feature extractions to identify candidate biomarkers of cancer using renal cell carcinoma RNA-seq datasets. Gene expression and DNA methylation datasets are normalized by Box-Cox transformation and integrated into a one-dimensional dataset that retains the major characteristics of the original datasets by unsupervised feature extraction methods, and differentially expressed genes are selected from the integrated dataset. Use of the integrated dataset demonstrated improved performance as compared with conventional approaches that utilize gene expression or DNA methylation datasets alone. Validation based on the literature showed that a considerable number of top-ranked genes from the integrated dataset have known relationships with cancer, implying that novel candidate biomarkers can also be acquired from the proposed analysis method. Furthermore, we expect that the proposed method can be expanded for applications involving various types of multi-omics datasets.
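
    The integration idea can be sketched compactly: Box-Cox-normalize the expression and methylation matrices, combine them per gene, and reduce to one dimension with an unsupervised projection. In the sketch below, PCA stands in for the unsupervised feature extraction used in the study, and all data are synthetic.

```python
# A simplified sketch of Box-Cox normalization plus unsupervised reduction of
# two omics matrices; PCA is a stand-in, data and sizes are assumptions.
import numpy as np
from scipy.stats import boxcox
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
genes, samples = 100, 20
expr = rng.lognormal(3.0, 1.0, (genes, samples))    # RNA-seq-like, positive
meth = rng.beta(2.0, 5.0, (genes, samples)) + 1e-6  # methylation-like, positive

def normalize(matrix):
    # Box-Cox each sample (column), then z-score so data types are comparable
    cols = [boxcox(matrix[:, j])[0] for j in range(matrix.shape[1])]
    out = np.column_stack(cols)
    return (out - out.mean()) / out.std()

combined = np.hstack([normalize(expr), normalize(meth)])   # genes x (2*samples)
score = PCA(n_components=1).fit_transform(combined)[:, 0]  # one value per gene
print(np.argsort(-np.abs(score))[:10])  # extreme genes as candidate biomarkers
```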

  18. Information Extraction and Interpretation Analysis of Mineral Potential Targets Based on ETM+ Data and GIS technology: A Case Study of Copper and Gold Mineralization in Burma

    International Nuclear Information System (INIS)

    Wenhui, Du; Yongqing, Chen; Nana, Guo; Yinglong, Hao; Pengfei, Zhao; Gongwen, Wang

    2014-01-01

    Mineralization-alteration and structure information extraction play important roles in mineral resource prospecting and assessment using remote sensing data and Geographical Information System (GIS) technology. Taking copper and gold mines in Burma as an example, the authors adopt band ratios, threshold segmentation and principal component analysis (PCA) to extract hydroxyl alteration information from ETM+ remote sensing images. Digital elevation model (DEM) data (30 m spatial resolution) and ETM+ data were used to extract the linear and circular faults that are associated with copper and gold mineralization. Combining geological data with the above information, the weights of evidence method and the C-A fractal model were used to integrate the information and identify ore-forming favourable zones in this area. Research results show that the high-grade potential targets coincide with the known copper and gold deposits, and the integrated information can be used in further exploration for mineral resource decision-making
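
    A condensed numpy sketch of the alteration-extraction steps named above (band ratio, PCA, threshold segmentation) is given below. The band choice (SWIR1/SWIR2, i.e. TM-style bands 5 and 7) follows common hydroxyl-alteration practice rather than the paper's exact recipe, and the arrays are synthetic.

```python
# A condensed sketch of band ratio + PCA + threshold segmentation for hydroxyl
# alteration; bands, thresholds and data are assumptions.
import numpy as np

rng = np.random.default_rng(1)
b5 = rng.uniform(0.2, 0.8, (50, 50))   # SWIR1-like band
b7 = rng.uniform(0.1, 0.6, (50, 50))   # SWIR2-like band

ratio = b5 / np.clip(b7, 1e-6, None)   # hydroxyl minerals absorb in SWIR2

# PCA over the two bands: project pixels onto the covariance eigenvectors;
# the minor component carries the contrast between the bands.
pixels = np.column_stack([b5.ravel(), b7.ravel()])
centered = pixels - pixels.mean(axis=0)
_, vecs = np.linalg.eigh(np.cov(centered.T))
pc = (centered @ vecs[:, 0]).reshape(b5.shape)

# Threshold segmentation: flag pixels beyond 2 standard deviations
ratio_anom = ratio > ratio.mean() + 2 * ratio.std()
pc_anom = np.abs(pc) > 2 * pc.std()
print(int((ratio_anom | pc_anom).sum()), "candidate alteration pixels")
```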

  19. Integrating hospital information systems in healthcare institutions: a mediation architecture.

    Science.gov (United States)

    El Azami, Ikram; Cherkaoui Malki, Mohammed Ouçamah; Tahon, Christian

    2012-10-01

    Many studies have examined the integration of information systems into healthcare institutions, leading to several standards in the healthcare domain (CORBAmed: Common Object Request Broker Architecture in Medicine; HL7: Health Level Seven International; DICOM: Digital Imaging and Communications in Medicine; and IHE: Integrating the Healthcare Enterprise). Due to the existence of a wide diversity of heterogeneous systems, three essential factors are necessary to fully integrate a system: data, functions and workflow. However, most of the previous studies have dealt with only one or two of these factors and this makes the system integration unsatisfactory. In this paper, we propose a flexible, scalable architecture for Hospital Information Systems (HIS). Our main purpose is to provide a practical solution to insure HIS interoperability so that healthcare institutions can communicate without being obliged to change their local information systems and without altering the tasks of the healthcare professionals. Our architecture is a mediation architecture with 3 levels: 1) a database level, 2) a middleware level and 3) a user interface level. The mediation is based on two central components: the Mediator and the Adapter. Using the XML format allows us to establish a structured, secured exchange of healthcare data. The notion of medical ontology is introduced to solve semantic conflicts and to unify the language used for the exchange. Our mediation architecture provides an effective, promising model that promotes the integration of hospital information systems that are autonomous, heterogeneous, semantically interoperable and platform-independent.
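
    The Mediator/Adapter pairing described above is a familiar pattern, and a skeletal Python version makes the division of labour concrete: Adapters translate each local system's records into a shared XML message, and the Mediator routes the messages between systems. Class, field and system names below are illustrative, not taken from the paper.

```python
# A skeletal sketch of the mediation idea (Adapter + Mediator over XML);
# all names and the message schema are assumptions.
import xml.etree.ElementTree as ET

class Adapter:
    """Wraps one local information system; converts its records to common XML."""
    def __init__(self, system_name):
        self.system_name = system_name

    def to_xml(self, record):
        msg = ET.Element("patientRecord", source=self.system_name)
        for field, value in record.items():
            ET.SubElement(msg, field).text = str(value)
        return msg

class Mediator:
    """Central broker: receives common XML and forwards it to subscribers."""
    def __init__(self):
        self.subscribers = []

    def register(self, callback):
        self.subscribers.append(callback)

    def dispatch(self, msg):
        payload = ET.tostring(msg, encoding="unicode")
        for callback in self.subscribers:
            callback(payload)

mediator = Mediator()
mediator.register(lambda xml: print("lab system received:", xml))
adapter = Adapter("admission-system")
mediator.dispatch(adapter.to_xml({"patientId": "12345", "ward": "B2"}))
```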

  20. Integrated system of production information processing for surface mines

    Energy Technology Data Exchange (ETDEWEB)

    Li, K.; Wang, S.; Zeng, Z.; Wei, J.; Ren, Z. [China University of Mining and Technology, Xuzhou (China). Dept of Mining Engineering

    2000-09-01

    Based on the concepts of geological statistics, mathematical programming, conditional simulation and system engineering, and on the features and duties of each main department in surface mine production, an integrated system for surface mine production information was studied systematically and developed using data warehousing, CAD, object-oriented and system integration technology, which systematizes and automates information management, data processing, optimization computing and plotting. In this paper, its overall objective, system design, structure and functions, and some key techniques are described. 2 refs., 3 figs.

  1. Visualization and Integrated Data Mining of Disparate Information

    Energy Technology Data Exchange (ETDEWEB)

    Saffer, Jeffrey D.(OMNIVIZ, INC); Albright, Cory L.(BATTELLE (PACIFIC NW LAB)); Calapristi, Augustin J.(BATTELLE (PACIFIC NW LAB)); Chen, Guang (OMNIVIZ, INC); Crow, Vernon L.(BATTELLE (PACIFIC NW LAB)); Decker, Scott D.(BATTELLE (PACIFIC NW LAB)); Groch, Kevin M.(BATTELLE (PACIFIC NW LAB)); Havre, Susan L.(BATTELLE (PACIFIC NW LAB)); Malard, Joel (BATTELLE (PACIFIC NW LAB)); Martin, Tonya J.(BATTELLE (PACIFIC NW LAB)); Miller, Nancy E.(BATTELLE (PACIFIC NW LAB)); Monroe, Philip J.(OMNIVIZ, INC); Nowell, Lucy T.(BATTELLE (PACIFIC NW LAB)); Payne, Deborah A.(BATTELLE (PACIFIC NW LAB)); Reyes Spindola, Jorge F.(BATTELLE (PACIFIC NW LAB)); Scarberry, Randall E.(OMNIVIZ, INC); Sofia, Heidi J.(BATTELLE (PACIFIC NW LAB)); Stillwell, Lisa C.(OMNIVIZ, INC); Thomas, Gregory S.(BATTELLE (PACIFIC NW LAB)); Thurston, Sarah J.(OMNIVIZ, INC); Williams, Leigh K.(BATTELLE (PACIFIC NW LAB)); Zabriskie, Sean J.(OMNIVIZ, INC); MG Hicks

    2001-05-11

    The volumes and diversity of information in the discovery, development, and business processes within the chemical and life sciences industries require new approaches for analysis. Traditional list- or spreadsheet-based methods are easily overwhelmed by large amounts of data. Furthermore, generating strong hypotheses and, just as importantly, ruling out weak ones, requires integration across different experimental and informational sources. We have developed a framework for this integration, including common conceptual data models for multiple data types and linked visualizations that provide an overview of the entire data set, a measure of how each data record is related to every other record, and an assessment of the associations within the data set.

  2. Extracting local information from crowds through betting markets

    Science.gov (United States)

    Weijs, Steven

    2015-04-01

    In this research, a set-up is considered in which users can bet against a forecasting agency to challenge its probabilistic forecasts. From an information theory standpoint, a reward structure is considered that either provides the forecasting agency with better information, paying the successful providers of information for their winning bets, or funds excellent forecasting agencies through users who think they know better. Especially for local forecasts, the approach may help to diagnose model biases and to identify local predictive information that can be incorporated into the models. The challenges and opportunities for implementing such a system in practice are also discussed.
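
    One natural information-theoretic reward for such a set-up is the logarithmic score: a bettor who stakes according to their own probabilities against the agency's odds earns, in expectation, the KL divergence of their forecast from the agency's relative to the truth. The sketch below shows this payoff rule as one plausible choice, not necessarily the paper's exact structure; the forecasts are invented.

```python
# A toy sketch of a log-score payoff for betting against an agency's forecast;
# the reward rule and numbers are assumptions.
import math

def log_score_payoff(p_user, p_agency, outcome):
    """Payoff in bits when `outcome` occurs: positive iff the user assigned it
    more probability than the agency did."""
    return math.log2(p_user[outcome] / p_agency[outcome])

agency = {"rain": 0.2, "dry": 0.8}   # agency's probabilistic forecast
user   = {"rain": 0.5, "dry": 0.5}   # a local who thinks rain is likelier
print(round(log_score_payoff(user, agency, "rain"), 3))  # +1.322 bits if it rains
print(round(log_score_payoff(user, agency, "dry"), 3))   # -0.678 bits if it stays dry
```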

  3. International seminar on integrated information systems. Book of extended synopses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-04-01

    The information available to the IAEA under comprehensive safeguards agreement with an Additional protocol is intended to provide for as complete a picture as practicable of a State's current or planned nuclear programme. The central components of the strengthened safeguards system are: increased IAEA access to and evaluation of information about States' nuclear and nuclear-related activities and increased physical access to relevant locations for verification of the exclusively peaceful content of a States' nuclear programme. Strengthening measures implemented under the existing legal authority of the Agency have contributed to increased information and physical access. Thus the role of integrated information systems for safeguards relevant data acquisition became more significant.

  4. International seminar on integrated information systems. Book of extended synopses

    International Nuclear Information System (INIS)

    2000-04-01

    The information available to the IAEA under comprehensive safeguards agreement with an Additional protocol is intended to provide for as complete a picture as practicable of a State's current or planned nuclear programme. The central components of the strengthened safeguards system are: increased IAEA access to and evaluation of information about States' nuclear and nuclear-related activities and increased physical access to relevant locations for verification of the exclusively peaceful content of a States' nuclear programme. Strengthening measures implemented under the existing legal authority of the Agency have contributed to increased information and physical access. Thus the role of integrated information systems for safeguards relevant data acquisition became more significant

  5. Integrated Information Systems Across the Weather-Climate Continuum

    Science.gov (United States)

    Pulwarty, R. S.; Higgins, W.; Nierenberg, C.; Trtanj, J.

    2015-12-01

    The increasing demand for well-organized (integrated) end-to-end research-based information has been highlighted in several National Academy studies, in IPCC Reports (such as the SREX and Fifth Assessment) and by public and private constituents. Such information constitutes a significant component of the "environmental intelligence" needed to address myriad societal needs for early warning and resilience across the weather-climate continuum. The next generation of climate research in service to the nation requires an even more visible, authoritative and robust commitment to scientific integration in support of adaptive information systems that address emergent risks and inform longer-term resilience strategies. A proven mechanism for resourcing such requirements is to demonstrate vision, purpose, support, connection to constituencies, and prototypes of desired capabilities. In this presentation we will discuss efforts at NOAA and elsewhere that: (i) improve information on how changes in extremes of key phenomena such as drought, floods, and heat stress impact management decisions for resource planning and disaster risk reduction; and (ii) develop regional integrated information systems to address these emergent challenges, integrating observations, monitoring and prediction, impacts assessments and scenarios, preparedness and adaptation, and coordination and capacity-building. Such systems, as illustrated by efforts such as NIDIS, have strengthened integration across the foundational research enterprise (through, for instance, RISAs and Modeling Analysis Predictions and Projections) by increasing agility for responding to emergent risks. The recently-initiated Climate Services Information System, in support of the WMO Global Framework for Climate Services, draws on the above models and will be introduced during the presentation.

  6. Integrated information in discrete dynamical systems: motivation and theoretical framework.

    Directory of Open Access Journals (Sweden)

    David Balduzzi

    2008-06-01

    Full Text Available This paper introduces a time- and state-dependent measure of integrated information, phi, which captures the repertoire of causal states available to a system as a whole. Specifically, phi quantifies how much information is generated (uncertainty is reduced) when a system enters a particular state through causal interactions among its elements, above and beyond the information generated independently by its parts. Such mathematical characterization is motivated by the observation that integrated information captures two key phenomenological properties of consciousness: (i) there is a large repertoire of conscious experiences so that, when one particular experience occurs, it generates a large amount of information by ruling out all the others; and (ii) this information is integrated, in that each experience appears as a whole that cannot be decomposed into independent parts. This paper extends previous work on stationary systems and applies integrated information to discrete networks as a function of their dynamics and causal architecture. An analysis of basic examples indicates the following: (i) phi varies depending on the state entered by a network, being higher if active and inactive elements are balanced and lower if the network is inactive or hyperactive. (ii) phi varies for systems with identical or similar surface dynamics depending on the underlying causal architecture, being low for systems that merely copy or replay activity states. (iii) phi varies as a function of network architecture. High phi values can be obtained by architectures that conjoin functional specialization with functional integration. Strictly modular and homogeneous systems cannot generate high phi because the former lack integration, whereas the latter lack information. Feedforward and lattice architectures are capable of generating high phi but are inefficient. (iv) In Hopfield networks, phi is low for attractor states and neutral states, but increases if the networks

  7. Integrated information in discrete dynamical systems: motivation and theoretical framework.

    Science.gov (United States)

    Balduzzi, David; Tononi, Giulio

    2008-06-13

    This paper introduces a time- and state-dependent measure of integrated information, phi, which captures the repertoire of causal states available to a system as a whole. Specifically, phi quantifies how much information is generated (uncertainty is reduced) when a system enters a particular state through causal interactions among its elements, above and beyond the information generated independently by its parts. Such mathematical characterization is motivated by the observation that integrated information captures two key phenomenological properties of consciousness: (i) there is a large repertoire of conscious experiences so that, when one particular experience occurs, it generates a large amount of information by ruling out all the others; and (ii) this information is integrated, in that each experience appears as a whole that cannot be decomposed into independent parts. This paper extends previous work on stationary systems and applies integrated information to discrete networks as a function of their dynamics and causal architecture. An analysis of basic examples indicates the following: (i) phi varies depending on the state entered by a network, being higher if active and inactive elements are balanced and lower if the network is inactive or hyperactive. (ii) phi varies for systems with identical or similar surface dynamics depending on the underlying causal architecture, being low for systems that merely copy or replay activity states. (iii) phi varies as a function of network architecture. High phi values can be obtained by architectures that conjoin functional specialization with functional integration. Strictly modular and homogeneous systems cannot generate high phi because the former lack integration, whereas the latter lack information. Feedforward and lattice architectures are capable of generating high phi but are inefficient. (iv) In Hopfield networks, phi is low for attractor states and neutral states, but increases if the networks are optimized
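
    The "whole beyond its parts" computation can be made concrete with a drastically simplified toy, shown below for a deterministic two-node network in which each node copies the other. It fixes a single partition and treats cut connections as uniform noise, so it illustrates the flavour of the measure rather than reproducing the paper's full formalism; the network and state are invented.

```python
# A drastically simplified toy version of the phi idea; not the authors' full
# measure (fixed partition, uniform noise, deterministic toy network).
import itertools, math

def step(state):
    # Each node copies the other node's previous value: the whole system's
    # past constrains its present more than the parts' pasts do alone.
    a, b = state
    return (b, a)

STATES = list(itertools.product([0, 1], repeat=2))

def whole_repertoire(x1):
    """p(X0 | x1): uniform over the prior states that lead to x1."""
    compatible = [s for s in STATES if step(s) == x1]
    return {s: (1 / len(compatible) if s in compatible else 0.0) for s in STATES}

def part_repertoire(x1, node):
    """p over the node's own prior value given its present value, with the
    other node's prior treated as uniform noise (the partition cuts the link)."""
    probs = {0: 0.0, 1: 0.0}
    for s in STATES:                       # average over all priors uniformly
        if step(s)[node] == x1[node]:
            probs[s[node]] += 1 / len(STATES)
    total = sum(probs.values())
    return {v: p / total for v, p in probs.items()}

def kl(p, q):
    return sum(pi * math.log2(pi / q[s]) for s, pi in p.items() if pi > 0)

x1 = (1, 0)
whole = whole_repertoire(x1)
pa, pb = part_repertoire(x1, 0), part_repertoire(x1, 1)
product = {s: pa[s[0]] * pb[s[1]] for s in STATES}
print(round(kl(whole, product), 3), "bits beyond the parts")  # 2.0 for this toy
```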

  8. Spoken Language Understanding Systems for Extracting Semantic Information from Speech

    CERN Document Server

    Tur, Gokhan

    2011-01-01

    Spoken language understanding (SLU) is an emerging field in between speech and language processing, investigating human/ machine and human/ human communication by leveraging technologies from signal processing, pattern recognition, machine learning and artificial intelligence. SLU systems are designed to extract the meaning from speech utterances and its applications are vast, from voice search in mobile devices to meeting summarization, attracting interest from both commercial and academic sectors. Both human/machine and human/human communications can benefit from the application of SLU, usin

  9. An information integration system for structured documents, Web, and databases

    OpenAIRE

    Morishima, Atsuyuki

    1998-01-01

    Rapid advances in computer network technology have changed the style of computer utilization. Distributed computing resources over world-wide computer networks are available from our local computers; they include powerful computers and a variety of information sources. This change raises more advanced requirements, and the integration of distributed information sources is one of them. In addition to conventional databases, structured documents have been widely used, and have increasing...

  10. Sifting Through Chaos: Extracting Information from Unstructured Legal Opinions.

    Science.gov (United States)

    Oliveira, Bruno Miguel; Guimarães, Rui Vasconcellos; Antunes, Luís; Rodrigues, Pedro Pereira

    2018-01-01

    Abiding by the law is, in some cases, a delicate balance between the rights of different players. Re-using health records is such a case. While the law grants reuse rights to public administration documents, in which health records produced in public health institutions are included, it also grants privacy to personal records. To safeguard the correct usage of data, public hospitals in Portugal employ jurists who are responsible for allowing or withholding access rights to health records. To help decision making, these jurists can consult the legal opinions issued by the national committee on public administration documents usage. While these legal opinions are of undeniable value, due to their doctrinal contribution, they are only available in a format best suited for printing, forcing individual consultation of each document, with no option whatsoever for clustered search, filtering or indexing, which are standard operations nowadays in a document management system. When having to decide on tens of data requests a day, it becomes unfeasible to consult the hundreds of legal opinions already available. With the objective of creating a modern document management system, we devised an open, platform-agnostic system that extracts and compiles the legal opinions, extracts their contents and produces metadata, allowing for fast searching and filtering of said legal opinions.
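
    The pipeline the abstract describes (compile documents, extract contents, produce metadata, then search and filter) can be sketched minimally as an inverted index over extracted text. Everything here, including field names, document identifiers and the tokenizer, is an illustrative assumption, not the authors' implementation.

        import re
        from dataclasses import dataclass

        @dataclass
        class LegalOpinion:
            doc_id: str
            year: int
            text: str

        def build_index(opinions):
            # Inverted index: keyword -> set of documents containing it.
            index = {}
            for op in opinions:
                for token in set(re.findall(r"\w+", op.text.lower())):
                    index.setdefault(token, set()).add(op.doc_id)
            return index

        def search(index, opinions, keyword, year=None):
            # Keyword search with an optional metadata filter.
            hits = index.get(keyword.lower(), set())
            return [op for op in opinions
                    if op.doc_id in hits and (year is None or op.year == year)]

        docs = [LegalOpinion("OP-2014-001", 2014, "Access to health records granted."),
                LegalOpinion("OP-2016-042", 2016, "Request withheld on privacy grounds.")]
        idx = build_index(docs)
        print([op.doc_id for op in search(idx, docs, "privacy")])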

  11. The integration of Information and Communication Technology into nursing.

    Science.gov (United States)

    Lupiáñez-Villanueva, Francisco; Hardey, Michael; Torrent, Joan; Ficapal, Pilar

    2011-02-01

    To identify and characterise different profiles of nurses' utilization of Information and Communication Technology (ICT) and the Internet, and to identify factors that can enhance or inhibit the use of these technologies within nursing. An online survey of the 13,588 members of the Nurses Association of Barcelona who had a registered email account in 2006 was carried out. Factor analysis, cluster analysis and a binomial logit model were undertaken. Although most of the nurses (76.70%) use the Internet in their daily work, multivariate statistical analysis revealed two profiles of ICT adoption. The first profile (4.58%) represents those nurses who value ICT and the Internet so much that they form an integral part of their practice. This group is thus referred to as 'integrated nurses'. The second profile (95.42%) represents those nurses who place less emphasis on ICT and the Internet and are consequently labelled 'non-integrated nurses'. From the statistical modelling, it was observed that undertaking research activities, an emphasis on international information, and a belief that health information available on the Internet is 'very relevant' play a positive and significant role in the probability of being an integrated nurse. The emerging world of the 'integrated nurse' cannot be adequately understood without examining how nurses make use of ICT and the Internet within nursing practice and the way this is shaped by institutional, technical and professional opportunities and constraints. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  12. Integrated Information Centers within Academic Environments: Introduction and Overview.

    Science.gov (United States)

    Lunin, Luis F., Ed.; D'Elia, George, Ed.

    1991-01-01

    Introduces eight articles on the Integrated Information Center (IIC) Project, which investigated significant behavioral, technological, organizational, financial, and legal factors involved in the management of IICs. Four articles address design and management issues of general interest, and four focus on specific design considerations and a…

  13. Information extraction from FN plots of tungsten microemitters

    Energy Technology Data Exchange (ETDEWEB)

    Mussa, Khalil O. [Department of Physics, Mu'tah University, Al-Karak (Jordan); Mousa, Marwan S., E-mail: mmousa@mutah.edu.jo [Department of Physics, Mu'tah University, Al-Karak (Jordan); Fischer, Andreas, E-mail: andreas.fischer@physik.tu-chemnitz.de [Institut für Physik, Technische Universität Chemnitz, Chemnitz (Germany)

    2013-09-15

    Tungsten-based microemitter tips have been prepared both clean and coated with dielectric materials. For clean tungsten tips, apex radii have been varied from 25 to 500 nm. These tips were manufactured by electrochemically etching a 0.1 mm diameter high-purity (99.95%) tungsten wire at the meniscus of a two-molar NaOH solution. The composite microemitters considered here consist of a tungsten core coated with different dielectric materials—such as magnesium oxide (MgO), sodium hydroxide (NaOH), tetracyanoethylene (TCNE), and zinc oxide (ZnO). It is worth noting here that the rather unconventional NaOH coating has shown several interesting properties. Various properties of these emitters were measured, including current–voltage (IV) characteristics and the physical shape of the tips. A conventional field emission microscope (FEM) with a tip (cathode)–screen (anode) separation standardized at 10 mm was used to electrically characterize the electron emitters. The system was evacuated down to a base pressure of ∼10⁻⁸ mbar when baked at up to ∼180°C overnight. This allowed measurements of typical field electron emission (FE) characteristics, namely the IV characteristics and the emission images on a conductive phosphor screen (the anode). Mechanical characterization has been performed with an FEI scanning electron microscope (SEM). Within this work, the mentioned experimental results are connected to the theory for analyzing Fowler–Nordheim (FN) plots. We compared and evaluated the data extracted from clean tungsten tips of different radii and determined deviations between the results of the different extraction methods applied. In particular, we derived the apex radii of several clean and coated tungsten tips by both SEM imaging and analyzing FN plots. The aim of this analysis is to support the ongoing discussion on recently developed improvements of the theory for analyzing FN plots related to metal field electron emitters, which in
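
    The analysis step turns on the fact that the elementary FN law I = a·V²·exp(−b/V) becomes a straight line in FN coordinates, ln(I/V²) versus 1/V, whose slope encodes the field conversion factor and hence the tip geometry. The sketch below fits synthetic data and inverts the slope; the work function, FN constant and geometry factor are standard textbook values assumed for illustration, not parameters taken from this paper.

        import numpy as np

        # Synthetic I-V data following the elementary FN law I = a*V^2*exp(-b/V)
        V = np.linspace(800.0, 2000.0, 30)          # applied voltage [V]
        a, b_true = 1e-12, 2.4e4
        I = a * V**2 * np.exp(-b_true / V)

        # FN coordinates: ln(I/V^2) vs 1/V is linear with slope -b
        slope, intercept = np.polyfit(1.0 / V, np.log(I / V**2), 1)

        # slope = -B * phi^1.5 / beta, with local field F = beta * V
        phi = 4.5          # work function of tungsten [eV], assumed
        B = 6.83e9         # standard FN constant [eV^-1.5 V/m]
        beta = -B * phi**1.5 / slope                # field conversion factor [1/m]

        # A simple tip model F = V / (k*r), k ~ 5, gives an apex-radius estimate
        k = 5.0
        r_apex = 1.0 / (k * beta)
        print(f"beta = {beta:.3e} 1/m, apex radius ~ {r_apex * 1e9:.0f} nm")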

  14. Information extraction from FN plots of tungsten microemitters

    International Nuclear Information System (INIS)

    Mussa, Khalil O.; Mousa, Marwan S.; Fischer, Andreas

    2013-01-01

    Tungsten-based microemitter tips have been prepared both clean and coated with dielectric materials. For clean tungsten tips, apex radii have been varied from 25 to 500 nm. These tips were manufactured by electrochemically etching a 0.1 mm diameter high-purity (99.95%) tungsten wire at the meniscus of a two-molar NaOH solution. The composite microemitters considered here consist of a tungsten core coated with different dielectric materials—such as magnesium oxide (MgO), sodium hydroxide (NaOH), tetracyanoethylene (TCNE), and zinc oxide (ZnO). It is worth noting here that the rather unconventional NaOH coating has shown several interesting properties. Various properties of these emitters were measured, including current–voltage (IV) characteristics and the physical shape of the tips. A conventional field emission microscope (FEM) with a tip (cathode)–screen (anode) separation standardized at 10 mm was used to electrically characterize the electron emitters. The system was evacuated down to a base pressure of ∼10⁻⁸ mbar when baked at up to ∼180°C overnight. This allowed measurements of typical field electron emission (FE) characteristics, namely the IV characteristics and the emission images on a conductive phosphor screen (the anode). Mechanical characterization has been performed with an FEI scanning electron microscope (SEM). Within this work, the mentioned experimental results are connected to the theory for analyzing Fowler–Nordheim (FN) plots. We compared and evaluated the data extracted from clean tungsten tips of different radii and determined deviations between the results of the different extraction methods applied. In particular, we derived the apex radii of several clean and coated tungsten tips by both SEM imaging and analyzing FN plots. The aim of this analysis is to support the ongoing discussion on recently developed improvements of the theory for analyzing FN plots related to metal field electron emitters, which in

  15. Empowerment of Cancer Survivors Through Information Technology: An Integrative Review.

    Science.gov (United States)

    Groen, Wim G; Kuijpers, Wilma; Oldenburg, Hester Sa; Wouters, Michel Wjm; Aaronson, Neil K; van Harten, Wim H

    2015-11-27

    Patient empowerment may be an effective approach to strengthen the role of cancer survivors and to reduce the burden on health care. However, it is not well conceptualized, notably in oncology. Furthermore, it is unclear to what extent information technology (IT) services can contribute to the empowerment of cancer survivors. We aim to define the conceptual components of patient empowerment of chronic disease patients, especially cancer survivors, and to explore the contribution of existing and new IT services to promote empowerment. Electronic databases were searched to identify theoretical and empirical articles regarding empowerment. We extracted and synthesized conceptual components of patient empowerment (ie, attributes, antecedents, and consequences) according to the integrative review methodology. We identified recent IT services for cancer survivors by examining systematic reviews and a proposed inventory of new services, and we related their features and effects to the identified components of empowerment. Based on 26 articles, we identified five main attributes of patient empowerment: (1) being autonomous and respected, (2) having knowledge, (3) having psychosocial and behavioral skills, (4) perceiving support from community, family, and friends, and (5) perceiving oneself to be useful. The latter two were specific to the cancer setting. Systematic reviews of IT services and our additional inventory helped us identify five main categories: (1) educational services, including electronic survivorship care plan services, (2) patient-to-patient services, (3) electronic patient-reported outcome (ePRO) services, (4) multicomponent services, and (5) portal services. The potential impact on empowerment included knowledge enhancement and, to a lesser extent, enhanced autonomy and skills. Newly developed services offer promising and exciting opportunities to empower cancer survivors, for instance, by providing tailored advice for supportive or follow-up care based on

  16. Optimal Information Extraction of Laser Scanning Dataset by Scale-Adaptive Reduction

    Science.gov (United States)

    Zang, Y.; Yang, B.

    2018-04-01

    3D laser technology is widely used to collect the surface information of objects. For various applications, we need to extract a point cloud of good perceptual quality from the scanned points. Most existing methods extract important points based on a fixed scale; however, the geometric features of a 3D object come from various geometric scales. We propose a multi-scale construction method based on radial basis functions. For each scale, important points are extracted from the point cloud based on their importance. We apply the Just-Noticeable-Difference perception metric to measure the degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments are undertaken to evaluate the effectiveness of the proposed method, suggesting a reliable solution for the optimal information extraction of objects.
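
    One way to read the recipe above (per-scale importance, then selective retention) is sketched below with a common stand-in importance measure: PCA surface variation over a k-neighborhood, with the neighborhood size playing the role of scale. The authors' radial-basis-function construction and JND metric are not reproduced here; this is an illustrative sketch under those stated assumptions.

        import numpy as np
        from scipy.spatial import cKDTree

        def surface_variation(points, k=16):
            # Importance per point: smallest eigenvalue of the local
            # covariance divided by the sum of eigenvalues -- large where
            # the local geometry deviates from a plane (edges, ridges).
            tree = cKDTree(points)
            _, idx = tree.query(points, k=k)
            imp = np.empty(len(points))
            for i, nbrs in enumerate(idx):
                w = np.linalg.eigvalsh(np.cov(points[nbrs].T))
                imp[i] = w[0] / max(w.sum(), 1e-12)
            return imp

        def reduce_cloud(points, keep_fraction=0.25, k=16):
            # Keep the most important fraction at the scale implied by k.
            imp = surface_variation(points, k)
            keep = np.argsort(imp)[-int(len(points) * keep_fraction):]
            return points[keep]

        # A flat patch with a ridge: ridge points should survive reduction.
        rng = np.random.default_rng(0)
        xy = rng.uniform(-1, 1, (2000, 2))
        z = np.where(np.abs(xy[:, 0]) < 0.05, 0.2, 0.0) + rng.normal(0, 0.003, 2000)
        print(reduce_cloud(np.column_stack([xy, z])).shape)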

  17. OPTIMAL INFORMATION EXTRACTION OF LASER SCANNING DATASET BY SCALE-ADAPTIVE REDUCTION

    Directory of Open Access Journals (Sweden)

    Y. Zang

    2018-04-01

    Full Text Available 3D laser technology is widely used to collect the surface information of objects. For various applications, we need to extract a point cloud of good perceptual quality from the scanned points. Most existing methods extract important points based on a fixed scale; however, the geometric features of a 3D object come from various geometric scales. We propose a multi-scale construction method based on radial basis functions. For each scale, important points are extracted from the point cloud based on their importance. We apply the Just-Noticeable-Difference perception metric to measure the degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments are undertaken to evaluate the effectiveness of the proposed method, suggesting a reliable solution for the optimal information extraction of objects.

  18. Note: Fully integrated 3.2 Gbps quantum random number generator with real-time extraction

    International Nuclear Information System (INIS)

    Zhang, Xiao-Guang; Nie, You-Qi; Liang, Hao; Zhang, Jun; Pan, Jian-Wei; Zhou, Hongyi; Ma, Xiongfeng

    2016-01-01

    We present a real-time and fully integrated quantum random number generator (QRNG) based on measuring laser phase fluctuations. The QRNG scheme based on laser phase fluctuations is known for its capability of generating ultra-high-speed random numbers. However, the speed bottleneck of a practical QRNG lies in the limited speed of randomness extraction. To close the gap between the fast randomness generation and the slow post-processing, we propose a pipeline extraction algorithm based on Toeplitz matrix hashing and implement it in a high-speed field-programmable gate array. Further, all the QRNG components are integrated into a module, including a compact and actively stabilized interferometer, high-speed data acquisition, and real-time data post-processing and transmission. The final generation rate of the QRNG module with real-time extraction reaches 3.2 Gbps.
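
    Toeplitz-matrix hashing, the extraction step named above, compresses n raw bits into m < n nearly uniform bits; because a Toeplitz matrix is constant along its diagonals, only n + m - 1 seed bits specify it, which is what makes a streaming hardware pipeline practical. Below is a minimal software sketch of the hash itself; the sizes and the simulated input are illustrative, and the paper's FPGA implementation is not reproduced.

        import numpy as np

        def toeplitz_extract(raw_bits, m, seed_bits):
            # Multiply n raw bits by an m x n binary Toeplitz matrix over
            # GF(2). Entry (i, j) depends only on the diagonal i - j, so
            # the whole matrix is indexed out of n + m - 1 seed bits.
            n = len(raw_bits)
            assert len(seed_bits) == n + m - 1
            diag = np.subtract.outer(np.arange(m), np.arange(n)) + n - 1
            T = seed_bits[diag]
            return (T.astype(np.int64) @ raw_bits.astype(np.int64)) % 2

        rng = np.random.default_rng(1)
        n, m = 1024, 768  # output/input ratio is set by the estimated min-entropy
        raw = rng.integers(0, 2, n)           # stand-in for digitized phase noise
        seed = rng.integers(0, 2, n + m - 1)
        print(toeplitz_extract(raw, m, seed)[:16])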

  19. Note: Fully integrated 3.2 Gbps quantum random number generator with real-time extraction

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xiao-Guang; Nie, You-Qi; Liang, Hao; Zhang, Jun, E-mail: zhangjun@ustc.edu.cn; Pan, Jian-Wei [Hefei National Laboratory for Physical Sciences at the Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei, Anhui 230026 (China); CAS Center for Excellence and Synergetic Innovation Center in Quantum Information and Quantum Physics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Zhou, Hongyi; Ma, Xiongfeng [Center for Quantum Information, Institute for Interdisciplinary Information Sciences, Tsinghua University, Beijing 100084 (China)

    2016-07-15

    We present a real-time and fully integrated quantum random number generator (QRNG) based on measuring laser phase fluctuations. The QRNG scheme based on laser phase fluctuations is known for its capability of generating ultra-high-speed random numbers. However, the speed bottleneck of a practical QRNG lies in the limited speed of randomness extraction. To close the gap between the fast randomness generation and the slow post-processing, we propose a pipeline extraction algorithm based on Toeplitz matrix hashing and implement it in a high-speed field-programmable gate array. Further, all the QRNG components are integrated into a module, including a compact and actively stabilized interferometer, high-speed data acquisition, and real-time data post-processing and transmission. The final generation rate of the QRNG module with real-time extraction reaches 3.2 Gbps.

  20. Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory

    Science.gov (United States)

    Kitazono, Jun; Kanai, Ryota; Oizumi, Masafumi

    2018-03-01

    The ability to integrate information in the brain is considered to be an essential property for cognition and consciousness. Integrated Information Theory (IIT) hypothesizes that the amount of integrated information (Φ) in the brain is related to the level of consciousness. IIT proposes that, to quantify information integration in a system as a whole, integrated information should be measured across the partition of the system at which the information loss caused by partitioning is minimized, called the Minimum Information Partition (MIP). The computational cost of exhaustively searching for the MIP grows exponentially with system size, making it difficult to apply IIT to real neural data. It has been previously shown that if a measure of Φ satisfies a mathematical property, submodularity, the MIP can be found in polynomial time by an optimization algorithm. However, although the first version of Φ is submodular, the later versions are not. In this study, we empirically explore to what extent the algorithm can be applied to the non-submodular measures of Φ by evaluating the accuracy of the algorithm on simulated data and real neural data. We find that the algorithm identifies the MIP in a nearly perfect manner even for the non-submodular measures. Our results show that the algorithm allows us to measure Φ in large systems within a practical amount of time.
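
    The exhaustive search whose cost motivates the paper can be made concrete in a few lines: enumerate every bipartition, score each with the information loss it causes, and keep the minimum. The sketch below does exactly that with a stand-in loss function (the real measures of Φ are computed from the system's dynamics); fixing one element in the first part avoids counting each bipartition twice.

        from itertools import combinations

        def minimum_information_partition(elements, loss):
            # Brute-force MIP search over bipartitions: exponential in the
            # number of elements, which is the cost the submodularity-based
            # (Queyranne-style) algorithm avoids.
            elements = list(elements)
            best = (float("inf"), None)
            rest = elements[1:]
            for r in range(len(rest) + 1):
                for combo in combinations(rest, r):
                    part_a = [elements[0], *combo]
                    part_b = [e for e in elements if e not in part_a]
                    if not part_b:
                        continue
                    value = loss(part_a, part_b)
                    if value < best[0]:
                        best = (value, (part_a, part_b))
            return best

        # Toy loss: cutting between the coupled pair {0, 1} costs 1.0 bit,
        # any other cut costs 0.1 -- so the MIP should be {0, 1} | {2, 3}.
        def toy_loss(a, b):
            return 1.0 if ({0, 1} & set(a)) and ({0, 1} & set(b)) else 0.1

        print(minimum_information_partition([0, 1, 2, 3], toy_loss))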

  1. An integration of Emergency Department Information and Ambulance Systems.

    Science.gov (United States)

    Al-Harbi, Nada; El-Masri, Samir; Saddik, Basema

    2012-01-01

    In this paper we propose an Emergency Department Information System that will be integrated with the ambulance system to improve the communication, enhance the quality of provided emergency services and facilitate information sharing. The proposed system utilizes new advanced technologies such as mobile web services that overcome the problems of interoperability between different systems, HL7 and GPS. The system is unique in that it allows ambulance officers to locate the nearest specialized hospital and allows access to the patient's electronic health record as well as providing the hospital with required information to prepare for the incoming patient.

  2. Integration of Information and Scientific Literacy: Promoting Literacy in Undergraduates

    Science.gov (United States)

    Wolbach, Kevin C.; Purzycki, Catherine B.; Bowman, Leslie A.; Agbada, Eva; Mostrom, Alison M.

    2010-01-01

    The Association of College and Research Libraries recommends incorporating information literacy (IL) skills across university and college curricula, for the goal of developing information literate graduates. Congruent with this goal, the Departments of Biological Sciences and Information Science developed an integrated IL and scientific literacy (SL) exercise for use in a first-year biology course. Students were provided the opportunity to access, retrieve, analyze, and evaluate primary scientific literature. By the completion of this project, student responses improved concerning knowledge and relevance of IL and SL skills. This project exposes students to IL and SL early in their undergraduate experience, preparing them for future academic advancement. PMID:21123700

  3. Information extraction from FN plots of tungsten microemitters.

    Science.gov (United States)

    Mussa, Khalil O; Mousa, Marwan S; Fischer, Andreas

    2013-09-01

    Tungsten-based microemitter tips have been prepared both clean and coated with dielectric materials. For clean tungsten tips, apex radii have been varied from 25 to 500 nm. These tips were manufactured by electrochemically etching a 0.1 mm diameter high-purity (99.95%) tungsten wire at the meniscus of a two-molar NaOH solution. The composite microemitters considered here consist of a tungsten core coated with different dielectric materials, such as magnesium oxide (MgO), sodium hydroxide (NaOH), tetracyanoethylene (TCNE), and zinc oxide (ZnO). It is worth noting here that the rather unconventional NaOH coating has shown several interesting properties. Various properties of these emitters were measured, including current-voltage (IV) characteristics and the physical shape of the tips. A conventional field emission microscope (FEM) with a tip (cathode)-screen (anode) separation standardized at 10 mm was used to electrically characterize the electron emitters. The system was evacuated down to a base pressure of ∼10⁻⁸ mbar when baked at up to ∼180 °C overnight. This allowed measurements of typical field electron emission (FE) characteristics, namely the IV characteristics and the emission images on a conductive phosphor screen (the anode). Mechanical characterization has been performed with an FEI scanning electron microscope (SEM). Within this work, the mentioned experimental results are connected to the theory for analyzing Fowler-Nordheim (FN) plots. We compared and evaluated the data extracted from clean tungsten tips of different radii and determined deviations between the results of the different extraction methods applied. In particular, we derived the apex radii of several clean and coated tungsten tips by both SEM imaging and analyzing FN plots. The aim of this analysis is to support the ongoing discussion on recently developed improvements of the theory for analyzing FN plots related to metal field electron emitters, which in particular

  4. Study on methods and techniques of aeroradiometric weak information extraction for sandstone-hosted uranium deposits based on GIS

    International Nuclear Information System (INIS)

    Han Shaoyang; Ke Dan; Hou Huiqun

    2005-01-01

    Weak information extraction is one of the important research topics in current sandstone-type uranium prospecting in China. This paper introduces the connotation of aeroradiometric weak information extraction, discusses the formation theory of aeroradiometric weak information, and establishes some effective mathematical models for weak information extraction. The models are implemented on a GIS software platform, and application tests of weak information extraction have been completed in known uranium mineralized areas. Research results prove that the prospective areas of sandstone-type uranium deposits can be rapidly delineated by extracting aeroradiometric weak information. (authors)

  5. Extraction of Graph Information Based on Image Contents and the Use of Ontology

    Science.gov (United States)

    Kanjanawattana, Sarunya; Kimura, Masaomi

    2016-01-01

    A graph is an effective form of data representation used to summarize complex information. Explicit information such as the relationship between the X- and Y-axes can be easily extracted from a graph by applying human intelligence. However, implicit knowledge such as information obtained from other related concepts in an ontology also resides in…

  6. On the effects of multimodal information integration in multitasking.

    Science.gov (United States)

    Stock, Ann-Kathrin; Gohil, Krutika; Huster, René J; Beste, Christian

    2017-07-07

    There have recently been considerable advances in our understanding of the neuronal mechanisms underlying multitasking, but the role of multimodal integration for this faculty has remained rather unclear. We examined this issue by comparing different modality combinations in a multitasking (stop-change) paradigm. In-depth neurophysiological analyses of event-related potentials (ERPs) were conducted to complement the obtained behavioral data. Specifically, we applied signal decomposition using second order blind identification (SOBI) to the multi-subject ERP data and source localization. We found that both general multimodal information integration and modality-specific aspects (potentially related to task difficulty) modulate behavioral performance and associated neurophysiological correlates. Simultaneous multimodal input generally increased early attentional processing of visual stimuli (i.e. P1 and N1 amplitudes) as well as measures of cognitive effort and conflict (i.e. central P3 amplitudes). Yet, tactile-visual input caused larger impairments in multitasking than audio-visual input. General aspects of multimodal information integration modulated the activity in the premotor cortex (BA 6) as well as different visual association areas concerned with the integration of visual information with input from other modalities (BA 19, BA 21, BA 37). On top of this, differences in the specific combination of modalities also affected performance and measures of conflict/effort originating in prefrontal regions (BA 6).

  7. An Integrated Information Retrieval Support System for Campus Network

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper presents a new integrated information retrieval support system (IIRSS) which can help Web search engines retrieve cross-lingual information from heterogeneous resources stored in multiple databases on an intranet. The IIRSS, with a three-layer architecture, can cooperate with other application servers running on the intranet. By using intelligent agents to collect information and to create indexes on-the-fly, using an access control strategy to confine users to browsing the documents accessible to them through a single portal, and using a new cross-lingual translation tool to help the search engine retrieve documents, the new system provides controllable information access with different authorizations, personalized services, and real-time information retrieval.

  8. Extracting information of fixational eye movements through pupil tracking

    Science.gov (United States)

    Xiao, JiangWei; Qiu, Jian; Luo, Kaiqin; Peng, Li; Han, Peng

    2018-01-01

    Human eyes are never completely static, even when fixating on a stationary point. These irregular, small movements, which consist of micro-tremors, micro-saccades and drifts, prevent the fading of the images that enter our eyes. The importance of researching fixational eye movements has recently been demonstrated experimentally. However, the characteristics of fixational eye movements and their roles in the visual process have not been explained clearly, because until now these signals could hardly be extracted completely. In this paper, we developed a new eye movement detection device with a high-speed camera. The device includes a beam splitter mirror, an infrared light source and a high-speed digital video camera with a frame rate of 200 Hz. To avoid the influence of head shaking, we made the device wearable by fixing the camera on a safety helmet. Using this device, pupil tracking experiments were conducted. By localizing the pupil center and applying spectrum analysis, the envelope frequency spectra of micro-saccades, micro-tremors and drifts are clearly revealed. The experimental results show that the device is feasible and effective, and that it can be applied in further characteristic analyses.
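
    The spectrum-analysis step can be sketched as follows: given the per-frame pupil-center displacement at the 200 Hz frame rate, a one-sided FFT exposes the slow drift band and the fast tremor band (the 100 Hz Nyquist limit covers commonly reported tremor frequencies). The trace below is synthetic, and its component frequencies and amplitudes are illustrative assumptions, not measurements from the paper.

        import numpy as np

        fs = 200.0                       # camera frame rate [Hz]
        t = np.arange(0, 10, 1 / fs)     # 10 s of pupil-center positions

        # Synthetic horizontal displacement: slow drift plus ~80 Hz tremor
        # plus sensor noise (all values illustrative).
        x = (2.0 * np.sin(2 * np.pi * 0.3 * t)
             + 0.05 * np.sin(2 * np.pi * 80.0 * t)
             + 0.01 * np.random.default_rng(0).normal(size=t.size))

        # One-sided amplitude spectrum of the detrended trace
        x = x - x.mean()
        spectrum = np.abs(np.fft.rfft(x)) * 2 / x.size
        freqs = np.fft.rfftfreq(x.size, 1 / fs)

        peaks = sorted(freqs[np.argsort(spectrum)[-2:]])
        print(peaks)   # expect ~0.3 Hz (drift) and ~80 Hz (tremor)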

  9. Extracting Social Networks and Contact Information From Email and the Web

    National Research Council Canada - National Science Library

    Culotta, Aron; Bekkerman, Ron; McCallum, Andrew

    2005-01-01

    ...-suited for such information extraction tasks. By recursively calling itself on new people discovered on the Web, the system builds a social network with multiple degrees of separation from the user...

  10. Integrated project management information systems: the French nuclear industry experience

    International Nuclear Information System (INIS)

    Jacquin, J.-C.; Caupin, G.-M.

    1990-01-01

    The article discusses the desirability of integrated project management systems within the French nuclear power industry. Change in demand for nuclear generation facilities over the last two decades has necessitated a change of policy concerning organization, cost and planning within the industry. Large corporate systems can benefit from integrating equipment and bulk materials tracking. Project management for the nuclear industry will, in future, need to incorporate computer aided design tools and project management information systems data bases as well as equipment and planning data. (UK)

  11. Integrated project management information systems: the French nuclear industry experience

    Energy Technology Data Exchange (ETDEWEB)

    Jacquin, J.-C.; Caupin, G.-M.

    1990-03-01

    The article discusses the desirability of integrated project management systems within the French nuclear power industry. Change in demand for nuclear generation facilities over the last two decades has necessitated a change of policy concerning organization, cost and planning within the industry. Large corporate systems can benefit from integrating equipment and bulk materials tracking. Project management for the nuclear industry will, in future, need to incorporate computer aided design tools and project management information systems data bases as well as equipment and planning data. (UK).

  12. Lithium NLP: A System for Rich Information Extraction from Noisy User Generated Text on Social Media

    OpenAIRE

    Bhargava, Preeti; Spasojevic, Nemanja; Hu, Guoning

    2017-01-01

    In this paper, we describe the Lithium Natural Language Processing (NLP) system - a resource-constrained, high-throughput and language-agnostic system for information extraction from noisy user generated text on social media. Lithium NLP extracts a rich set of information including entities, topics, hashtags and sentiment from text. We discuss several real world applications of the system currently incorporated in Lithium products. We also compare our system with existing commercial and acad...

  13. Information Management Processes for Extraction of Student Dropout Indicators in Courses in Distance Mode

    Directory of Open Access Journals (Sweden)

    Renata Maria Abrantes Baracho

    2016-04-01

    Full Text Available This research addresses the use of information management processes to extract student dropout indicators in distance mode courses. Distance education in Brazil aims to facilitate access to information. The MEC (Ministry of Education) announced, in the second semester of 2013, that the main obstacles faced by institutions offering courses in this mode were students dropping out and the resistance of both educators and students to this mode. The research used a mixed methodology, qualitative and quantitative, to obtain student dropout indicators. The factors found and validated in this research were: lack of interest from students, insufficient training in the use of the virtual learning environment for students, structural problems in the schools chosen to offer the course, students without e-mail, incoherent answers to course activities, and students' lack of knowledge when using the computer tool. The scenario considered was a course offered in distance mode called Aluno Integrado (Integrated Student).

  14. Mediator infrastructure for information integration and semantic data integration environment for biomedical research.

    Science.gov (United States)

    Grethe, Jeffrey S; Ross, Edward; Little, David; Sanders, Brian; Gupta, Amarnath; Astakhov, Vadim

    2009-01-01

    This paper presents current progress in the development of a semantic data integration environment which is a part of the Biomedical Informatics Research Network (BIRN; http://www.nbirn.net) project. BIRN is sponsored by the National Center for Research Resources (NCRR), a component of the National Institutes of Health (NIH). A goal is the development of a cyberinfrastructure for biomedical research that supports advanced data acquisition, data storage, data management, data integration, data mining, data visualization, and other computing and information processing services over the Internet. Each participating institution maintains storage of its experimental or computationally derived data. A mediator-based data integration system performs semantic integration over the databases to enable researchers to perform analyses based on larger and broader datasets than would be available from any single institution's data. This paper describes a recent revision of the system architecture, implementation, and capabilities of the semantically based data integration environment for BIRN.

  15. Testing can counteract proactive interference by integrating competing information

    Science.gov (United States)

    Wahlheim, Christopher N.

    2015-01-01

    Testing initially learned information before presenting new information has been shown to counteract the deleterious effects of proactive interference by segregating competing sources of information. The present experiments were conducted to demonstrate that testing can also have its effects in part by integrating competing information. Variations of classic A–B, A–D paired-associate learning paradigms were employed that included two lists of word pairs and a cued-recall test. Repeated pairs appeared in both lists (A–B, A–B), control pairs appeared in List 2 only (A–B, C–D), and changed pairs appeared with the same cue in both lists but with different responses (A–B, A–D). The critical manipulation was whether pairs were tested or restudied in an interpolated phase that occurred between Lists 1 and 2. On a final cued-recall test, participants recalled List 2 responses and then indicated when they recollected that responses had earlier changed between lists. The change recollection measure indexed the extent to which competing responses were integrated during List 2. Change was recollected more often for tested than for restudied pairs. Proactive facilitation was obtained in cued recall when change was recollected, whereas proactive interference was obtained when change was not recollected. These results provide evidence that testing counteracted proactive interference in part by making List 1 responses more accessible during List 2, thus promoting integration and increasing later recollection of change. These results have theoretical implications because they show that testing can counteract proactive interference by integrating or segregating competing information. PMID:25120241

  16. Testing can counteract proactive interference by integrating competing information.

    Science.gov (United States)

    Wahlheim, Christopher N

    2015-01-01

    Testing initially learned information before presenting new information has been shown to counteract the deleterious effects of proactive interference by segregating competing sources of information. The present experiments were conducted to demonstrate that testing can also have its effects in part by integrating competing information. Variations of classic A-B, A-D paired-associate learning paradigms were employed that included two lists of word pairs and a cued-recall test. Repeated pairs appeared in both lists (A-B, A-B), control pairs appeared in List 2 only (A-B, C-D), and changed pairs appeared with the same cue in both lists but with different responses (A-B, A-D). The critical manipulation was whether pairs were tested or restudied in an interpolated phase that occurred between Lists 1 and 2. On a final cued-recall test, participants recalled List 2 responses and then indicated when they recollected that responses had earlier changed between lists. The change recollection measure indexed the extent to which competing responses were integrated during List 2. Change was recollected more often for tested than for restudied pairs. Proactive facilitation was obtained in cued recall when change was recollected, whereas proactive interference was obtained when change was not recollected. These results provide evidence that testing counteracted proactive interference in part by making List 1 responses more accessible during List 2, thus promoting integration and increasing later recollection of change. These results have theoretical implications because they show that testing can counteract proactive interference by integrating or segregating competing information.

  17. Information Extraction of High Resolution Remote Sensing Images Based on the Calculation of Optimal Segmentation Parameters

    Science.gov (United States)

    Zhu, Hongchun; Cai, Lijie; Liu, Haiying; Huang, Wei

    2016-01-01

    Multi-scale image segmentation and the selection of optimal segmentation parameters are the key processes in the object-oriented information extraction of high-resolution remote sensing images. The accuracy of remote sensing thematic information depends on this extraction. On the basis of WorldView-2 high-resolution data, this study developed an optimal-segmentation-parameters method for object-oriented image segmentation and high-resolution image information extraction through the following processes. Firstly, the best combination of bands and weights was determined for the information extraction of the high-resolution remote sensing image. An improved weighted mean-variance method was proposed and used to calculate the optimal segmentation scale. Thereafter, the best shape factor and compactness factor parameters were computed with the use of control variables and the combination of heterogeneity and homogeneity indexes. Different types of image segmentation parameters were obtained according to the surface features. The high-resolution remote sensing images were multi-scale segmented with the optimal segmentation parameters. A hierarchical network structure was established by setting the information extraction rules to achieve object-oriented information extraction. This study presents an effective and practical method that can explain expert judgment through reproducible quantitative measurements. Furthermore, the results of this procedure may be incorporated into a classification scheme. PMID:27362762

  18. Review of Statistical Learning Methods in Integrated Omics Studies (An Integrated Information Science).

    Science.gov (United States)

    Zeng, Irene Sui Lan; Lumley, Thomas

    2018-01-01

    Integrated omics is becoming a new channel for investigating the complex molecular system in modern biological science and sets a foundation for systematic learning for precision medicine. The statistical/machine learning methods that have emerged in the past decade for integrated omics are not only innovative but also multidisciplinary, with integrated knowledge in biology, medicine, statistics, machine learning, and artificial intelligence. Here, we review the nontrivial classes of learning methods from the statistical aspect and streamline these learning methods within the statistical learning framework. The intriguing findings from the review are that the methods used are generalizable to other disciplines with complex systematic structure, and that integrated omics is part of an integrated information science which has collated and integrated different types of information for inference and decision making. We review the statistical learning methods of exploratory and supervised learning from 42 publications. We also discuss the strengths and limitations of the extended principal component analysis, cluster analysis, network analysis, and regression methods. Statistical techniques such as penalization for sparsity induction when there are fewer observations than features, and the use of a Bayesian approach when there is prior knowledge to be integrated, are also included in the commentary. For the completeness of the review, a table of currently available software and packages from 23 publications for omics is summarized in the appendix.

  19. Overview of ImageCLEF 2017: information extraction from images

    OpenAIRE

    Ionescu, Bogdan; Müller, Henning; Villegas, Mauricio; Arenas, Helbert; Boato, Giulia; Dang Nguyen, Duc Tien; Dicente Cid, Yashin; Eickhoff, Carsten; Seco de Herrera, Alba G.; Gurrin, Cathal; Islam, Bayzidul; Kovalev, Vassili; Liauchuk, Vitali; Mothe, Josiane; Piras, Luca

    2017-01-01

    This paper presents an overview of the ImageCLEF 2017 evaluation campaign, an event that was organized as part of the CLEF (Conference and Labs of the Evaluation Forum) labs 2017. ImageCLEF is an ongoing initiative (started in 2003) that promotes the evaluation of technologies for annotation, indexing and retrieval for providing information access to collections of images in various usage scenarios and domains. In 2017, the 15th edition of ImageCLEF, three main tasks were proposed and one pil...

  20. Entropy-based analysis and bioinformatics-inspired integration of global economic information transfer.

    Directory of Open Access Journals (Sweden)

    Jinkyu Kim

    Full Text Available The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches to global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significance, and then use a weighted sum approach to aggregate the information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with real-world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.
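
    The building block is the pairwise transfer entropy TE(X→Y) = Σ p(y_{t+1}, y_t, x_t) · log2[ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ], estimated below with a simple plug-in (histogram) estimator over quantile-discretized series; the paper's pipeline then tests each TE for significance and aggregates the multi-variable TEs with a weighted sum. The bin count and the synthetic series are illustrative assumptions.

        import numpy as np
        from collections import Counter

        def transfer_entropy(x, y, bins=4):
            # Plug-in TE(X -> Y) with one-step histories, after quantile
            # discretization of both series into `bins` symbols.
            def disc(v):
                edges = np.quantile(v, np.linspace(0, 1, bins + 1)[1:-1])
                return np.searchsorted(edges, v)
            qx, qy = disc(x), disc(y)
            y1, y0, x0 = qy[1:], qy[:-1], qx[:-1]
            n = len(y1)
            c_yyx = Counter(zip(y1, y0, x0))
            c_yx, c_yy, c_y = Counter(zip(y0, x0)), Counter(zip(y1, y0)), Counter(y0)
            te = 0.0
            for (a, b, c), k in c_yyx.items():
                te += k / n * np.log2(k * c_y[b] / (c_yx[b, c] * c_yy[a, b]))
            return te

        # X drives Y with a one-step lag, so TE(X->Y) should exceed TE(Y->X).
        rng = np.random.default_rng(0)
        x = rng.normal(size=5000)
        y = 0.8 * np.roll(x, 1) + 0.2 * rng.normal(size=5000)
        print(transfer_entropy(x, y), transfer_entropy(y, x))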

  1. Entropy-based analysis and bioinformatics-inspired integration of global economic information transfer.

    Science.gov (United States)

    Kim, Jinkyu; Kim, Gunn; An, Sungbae; Kwon, Young-Kyun; Yoon, Sungroh

    2013-01-01

    The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches to global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significance, and then use a weighted sum approach to aggregate the information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with real-world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.

  2. Statistical techniques to extract information during SMAP soil moisture assimilation

    Science.gov (United States)

    Kolassa, J.; Reichle, R. H.; Liu, Q.; Alemohammad, S. H.; Gentine, P.

    2017-12-01

    Statistical techniques permit the retrieval of soil moisture estimates in a model climatology while retaining the spatial and temporal signatures of the satellite observations. As a consequence, the need for bias correction prior to an assimilation of these estimates is reduced, which could result in a more effective use of the independent information provided by the satellite observations. In this study, a statistical neural network (NN) retrieval algorithm is calibrated using SMAP brightness temperature observations and modeled soil moisture estimates (similar to those used to calibrate the SMAP Level 4 DA system). Daily values of surface soil moisture are estimated using the NN and then assimilated into the NASA Catchment model. The skill of the assimilation estimates is assessed based on a comprehensive comparison to in situ measurements from the SMAP core and sparse network sites as well as the International Soil Moisture Network. The NN retrieval assimilation is found to significantly improve the model skill, particularly in areas where the model does not represent processes related to agricultural practices. Additionally, the NN method is compared to assimilation experiments using traditional bias correction techniques. The NN retrieval assimilation is found to more effectively use the independent information provided by SMAP resulting in larger model skill improvements than assimilation experiments using traditional bias correction techniques.
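
    The retrieval step described here, training a network to map SMAP brightness temperatures into the model's soil moisture climatology so that later assimilation needs little or no bias correction, can be sketched with a small regressor on synthetic data. The input features, network size and the made-up relation below are illustrative assumptions, not the study's configuration.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        # Synthetic training pairs: H/V-pol brightness temperatures [K]
        # against model soil moisture [m3/m3]; the relation is made up.
        tb = rng.uniform(180.0, 280.0, size=(5000, 2))
        sm = 0.5 - 0.0015 * tb.mean(axis=1) + 0.01 * rng.normal(size=5000)

        nn = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000,
                          random_state=0)
        nn.fit(tb, sm)

        # Retrievals inherit the model climatology by construction, which
        # is what reduces the bias correction needed before assimilation.
        print(nn.predict([[220.0, 240.0]]))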

  3. Identifying influential factors on integrated marketing planning using information technology

    Directory of Open Access Journals (Sweden)

    Karim Hamdi

    2014-07-01

    Full Text Available This paper presents an empirical investigation to identify important factors influencing integrated marketing planning using information technology. The proposed study designs a questionnaire for measuring integrated marketing planning, which consists of three categories: structural factors, behavioral factors and background factors. There are 40 questions associated with the proposed study, on a Likert scale. Cronbach alphas have been calculated for structural factors, behavioral factors and background factors as 0.89, 0.86 and 0.83, respectively. Using statistical tests, the study has confirmed the effects of the three factors on integrated marketing. In addition, the Friedman test has revealed that structural factors were the most important, followed by background factors and behavioral factors.

  4. Research on Crowdsourcing Emergency Information Extraction of Based on Events' Frame

    Science.gov (United States)

    Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi

    2018-01-01

    At present, common information extraction methods cannot accurately extract structured emergency event information, general information retrieval tools cannot completely identify emergency geographic information, and these approaches provide no accurate assessment of the extracted results. This paper therefore proposes an emergency information extraction technique based on an event frame, to solve the problem of emergency information collection. It mainly includes an emergency information extraction model (EIEM), a complete address recognition method (CARM) and an accuracy evaluation model of emergency information (AEMEI). EIEM extracts emergency information in a structured way and compensates for the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and the shortest-path algorithm, allowing toponym pieces to be joined into a full address. AEMEI analyzes the results for an emergency event and summarizes the advantages and disadvantages of the event framework. Experiments show that event frame technology can solve the problem of emergency information extraction and provides reference cases for other applications. When a disaster is about to occur, the relevant departments can query data on emergencies that occurred in the past and make arrangements ahead of schedule for defense and disaster reduction. The technology decreases the number of casualties and the property damage in the country and the world. This is of great significance to the state and society.

  5. IMIS (Integrated Measurement and Information System) - the German integrated radioactivity information and decision support system

    International Nuclear Information System (INIS)

    Weiss, W.; Leeb, H.

    1993-01-01

    IMIS is being set up as part of the German Government's National Response Plan for dealing with the consequences of a large scale radioactive contamination of the environment. The IMIS system has three operational action levels. Level 3 covers the collection of radiological data from state-of-the-art monitoring networks and measurement laboratories. Level 2 involves computerised data processing and quality control, based on standardised procedures for the collection and presentation of measurements. This level also includes the use of transport and dose assessment models. Level 1 includes evaluation of the data, management of the consequences of a given situation, legal enforcement of protective measures and provision of information to the public. In its final form the IMIS system will consist of a total of 75 RISC computers linked together by an efficient packet-switched Wide Area Network. Owing to various demands of the individual users, three different types of RISC computers are used. The system software includes ULTRIX, TCP/IP and X windows. The relational database management system ORACLE is used together with the query language SQL-Plus. Statistical analyses are carried out with the standard product SAS. The geographical information system TERRA provides all the tools necessary for a detailed geographic presentation of the data. (author)

  6. Information integration for a sky survey by data warehousing

    Science.gov (United States)

    Luo, A.; Zhang, Y.; Zhao, Y.

    The virtualization service of the data system for the sky survey LAMOST is very important for astronomers. The service needs to integrate information from data collections, catalogs and references, and to support simple federation of a set of distributed files and associated metadata. Data warehousing has been in existence for several years and has demonstrated superiority over traditional relational database management systems by providing novel indexing schemes that support efficient on-line analytical processing (OLAP) of large databases. Relational database systems such as Oracle now support the warehouse capability, including extensions to the SQL language to support OLAP operations, and a number of metadata management tools have been created. The information integration of LAMOST by applying data warehousing is intended to effectively provide data and knowledge on-line.

  7. Integrating Environmental and Information Systems Management: An Enterprise Architecture Approach

    Science.gov (United States)

    Noran, Ovidiu

    Environmental responsibility is fast becoming an important aspect of strategic management as the reality of climate change settles in and relevant regulations are expected to tighten significantly in the near future. Many businesses react to this challenge by implementing environmental reporting and management systems. However, the environmental initiative is often not properly integrated in the overall business strategy and its information system (IS) and as a result the management does not have timely access to (appropriately aggregated) environmental information. This chapter argues for the benefit of integrating the environmental management (EM) project into the ongoing enterprise architecture (EA) initiative present in all successful companies. This is done by demonstrating how a reference architecture framework and a meta-methodology using EA artefacts can be used to co-design the EM system, the organisation and its IS in order to achieve a much needed synergy.

  8. Integration of genomic information with biological networks using Cytoscape.

    Science.gov (United States)

    Bauer-Mehren, Anna

    2013-01-01

    Cytoscape is an open-source software for visualizing, analyzing, and modeling biological networks. This chapter explains how to use Cytoscape to analyze the functional effect of sequence variations in the context of biological networks such as protein-protein interaction networks and signaling pathways. The chapter is divided into five parts: (1) obtaining information about the functional effect of sequence variation in a Cytoscape-readable format, (2) loading and displaying different types of biological networks in Cytoscape, (3) integrating the genomic information (SNPs and mutations) with the biological networks, (4) analyzing the effect of the genomic perturbation on the network structure using Cytoscape built-in functions, and (5) briefly outlining how the integrated data can help in building mathematical network models for analyzing the effect of the sequence variation on the dynamics of the biological system. Each part is illustrated by step-by-step instructions on an example use case and visualized by many screenshots and figures.

  9. Information resources assessment of a healthcare integrated delivery system.

    Science.gov (United States)

    Gadd, C. S.; Friedman, C. P.; Douglas, G.; Miller, D. J.

    1999-01-01

    While clinical healthcare systems may have lagged behind computer applications in other fields in the shift from mainframes to client-server architectures, the rapid deployment of newer applications is closing that gap. Organizations considering the transition to client-server must identify and position themselves to provide the resources necessary to implement and support the infrastructure requirements of client-server architectures and to manage the accelerated complexity at the desktop, including hardware and software deployment, training, and maintenance needs. This paper describes an information resources assessment of the recently aligned Pennsylvania regional Veterans Administration Stars and Stripes Health Network (VISN4), in anticipation of the shift from a predominantly mainframe to a client-server information systems architecture in its well-established VistA clinical information system. The multimethod assessment study is described here to demonstrate this approach and its value to regional healthcare networks undergoing organizational integration and/or significant information technology transformations. PMID:10566414

  10. Project Integration Architecture: A Practical Demonstration of Information Propagation

    Science.gov (United States)

    Jones, William Henry

    2005-01-01

    One of the goals of the Project Integration Architecture (PIA) effort is to provide the ability to propagate information between disparate applications. With this ability, applications may then be formed into an application graph constituting a super-application. Such a super-application would then provide all of the analysis appropriate to a given technical system. This paper reports on a small demonstration of this concept in which a Computer Aided Design (CAD) application was connected to an inlet analysis code and geometry information automatically propagated from one to the other. The majority of the work reported involved not the technology of information propagation, but rather the conversion of propagated information into a form usable by the receiving application.

  11. Information resources assessment of a healthcare integrated delivery system.

    Science.gov (United States)

    Gadd, C S; Friedman, C P; Douglas, G; Miller, D J

    1999-01-01

    While clinical healthcare systems may have lagged behind computer applications in other fields in the shift from mainframes to client-server architectures, the rapid deployment of newer applications is closing that gap. Organizations considering the transition to client-server must identify and position themselves to provide the resources necessary to implement and support the infrastructure requirements of client-server architectures and to manage the accelerated complexity at the desktop, including hardware and software deployment, training, and maintenance needs. This paper describes an information resources assessment of the recently aligned Pennsylvania regional Veterans Administration Stars and Stripes Health Network (VISN4), in anticipation of the shift from a predominantly mainframe to a client-server information systems architecture in its well-established VistA clinical information system. The multimethod assessment study is described here to demonstrate this approach and its value to regional healthcare networks undergoing organizational integration and/or significant information technology transformations.

  12. [Extraction of management information from the national quality assurance program].

    Science.gov (United States)

    Stausberg, Jürgen; Bartels, Claus; Bobrowski, Christoph

    2007-07-15

    Starting with clinically motivated projects, the national quality assurance program has established a legally obligatory framework. Annual feedback of results is an important means of quality control. The annual reports cover quality-related information with high granularity; a synopsis for corporate management is missing, however. Therefore, the results of the University Clinics in Greifswald, Germany, have been analyzed and aggregated to support hospital management. Strengths were identified by ranking the results within the state for each quality indicator, weaknesses by comparison with national reference values. The assessment was aggregated per clinical discipline and per category (indication, process, and outcome). A composite of quality indicators has been demanded multiple times, but a coherent concept is still missing. The method presented establishes a plausible summary of the strengths and weaknesses of a hospital from the point of view of the national quality assurance program. Nevertheless, further adaptation of the program is needed to better assist corporate management.

  13. Information Integration in Risky Choice: Identification and Stability

    OpenAIRE

    Stewart, Neil

    2011-01-01

    How is information integrated across the attributes of an option when making risky choices? In most descriptive models of decision under risk, information about risk and reward is combined multiplicatively (e.g., expected value; expected utility theory, Bernoulli, 1738/1954; subjective expected utility theory, Savage, 1954; Edwards, 1955; prospect theory, Kahneman and Tversky, 1979; rank-dependent utility, Quiggin, 1993; decision field theory, Busemeyer and To...
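
    Although the abstract is cut off, the multiplicative risk-reward combination it describes is easy to make concrete. The sketch below computes an expected value and a prospect-theory-style value for a simple gamble; the function names and parameter values are illustrative assumptions, not taken from the record.

    ```python
    def expected_value(probs, outcomes):
        # Expected value: sum of p_i * x_i, the simplest multiplicative rule
        return sum(p * x for p, x in zip(probs, outcomes))

    def prospect_value(probs, outcomes, alpha=0.88, gamma=0.61):
        # Prospect-theory style value: decision weight w(p) times value v(x);
        # risk and reward are still combined multiplicatively per outcome.
        def w(p):  # inverse-S probability weighting (illustrative parameters)
            return p**gamma / ((p**gamma + (1 - p)**gamma) ** (1 / gamma))
        def v(x):  # concave value function for gains
            return x**alpha
        return sum(w(p) * v(x) for p, x in zip(probs, outcomes))

    # A gamble: 30% chance of winning 100, otherwise nothing
    print(expected_value([0.3, 0.7], [100, 0]))  # 30.0
    print(prospect_value([0.3, 0.7], [100, 0]))  # weighted subjective value
    ```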

  14. Creating integrated information management system for small and medium business

    Directory of Open Access Journals (Sweden)

    Deinega Valentina Nikolaevna

    2014-09-01

    Full Text Available Enterprises, regardless of their size and form of ownership, need to create an integrated information system if they are focused on long and successful operation. This is dictated by the fact that such a system, firstly, combines the financial data, secondly, standardizes manufacturing processes and, thirdly, solves the problem of standardizing information within a single framework. The key point in decision making is the definition of the business strategy and the reflection of this strategy in goals and objectives. ERP systems help to maintain competitiveness and leadership in the market.

  15. Moral judgment as information processing: an integrative review.

    Science.gov (United States)

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.

  16. Methodology of Adaptive Integrated Accounting System in Information Environment

    Directory of Open Access Journals (Sweden)

    Bochulya Tetyana V.

    2013-12-01

    Full Text Available The goal of the article lies in the logical and methodological justification of the formation of an integrated accounting system, based on the realities of the co-ordinated transformation of society and the economy, and in the development of new knowledge about the formation and adjustment of the accounting system in its a priori new information competence, with expanded functionality supporting a justified idea of the existence and development of business. Taking the developments of the best representatives of the leading scientific community as a basis, the article offers a new vision of the organisation of the accounting system, based on a modern projection of information competence and the harmonisation of the main processes of information service, adapting the system to the multi-vector inquiries of consumers of information. Pursuant to the results of the conducted study, the article makes an effort to change the established opinion about the information and professional competences of the accounting system and to attach a new qualitative significance to them. The article proposes calculating the quality of the information system on the basis of key indicators of its information service. It lays the foundation for a prospective study of the problems of building the accounting system in such a projection that the realities of internal and external processes are maximally co-ordinated, based on the idea of their information development.

  17. Moral judgment as information processing: an integrative review

    Science.gov (United States)

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others’ behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment. PMID:26579022

  18. Integration of information and communication technologies in special relativity teaching

    International Nuclear Information System (INIS)

    Barbier, Remi; Fleck, Sonia; Perries, Stephane; Ray, Cedric

    2005-01-01

    Integration of information and communication technologies (ICTs) in special relativity teaching may bring multiple and complementary methods for introducing either difficult or abstract counterintuitive concepts. This paper describes multimedia content developed at Lyon University to enhance the learning process of undergraduate students. Two categories of animated scenarios have been identified: real experiments and thought experiments. Both typical examples of these scenarios and their impacts on the teaching process are discussed

  19. Diablo Canyon plant information management system and integrated communication system

    International Nuclear Information System (INIS)

    Stanley, J.W.; Groff, C.

    1990-01-01

    The implementation of a comprehensive maintenance system called the plant information management system (PIMS) at the Diablo Canyon plant, together with its associated integrated communication system (ICS), is widely regarded as the most comprehensive undertaking of its kind in the nuclear industry. This paper provides an overview of the program at Diablo Canyon, an evaluation of system benefits, and highlights the future course of PIMS

  20. Diablo Canyon plant information management system and integrated communication system

    Energy Technology Data Exchange (ETDEWEB)

    Stanley, J.W.; Groff, C.

    1990-06-01

    The implementation of a comprehensive maintenance system called the plant information management system (PIMS) at the Diablo Canyon plant, together with its associated integrated communication system (ICS), is widely regarded as the most comprehensive undertaking of its kind in the nuclear industry. This paper provides an overview of the program at Diablo Canyon, an evaluation of system benefits, and highlights the future course of PIMS.

  1. The Influence of Information Technology on Integration of Logistics Processess

    OpenAIRE

    Ivica Jujnovic

    2011-01-01

    Business globalization increases the physical distance between the location of raw material production and the location where products and services are consumed, which increases logistics expenses and overall business costs. Achieving greater efficiency and competitiveness requires the adoption of numerous recent trends in logistics. This includes a process approach to business that uses information technology in the integration of logistics processes, especially technologies such as exchange of electronic data, ba...

  2. Architectural Building of a Public Key Infrastructure for an Integrated Information Space

    Directory of Open Access Journals (Sweden)

    Vadim Ivanovich Korolev

    2015-10-01

    Full Text Available The article considers the application of a public key cryptographic system to provide information security and to employ a digital signature. It performs an analysis of trust models in the formation of certificates and their use. The article describes the relationships between the trust model and the public key infrastructure architecture. It contains conclusions in respect of the options for building a public key infrastructure for an integrated information space.

  3. Earth science information: Planning for the integration and use of global change information

    Science.gov (United States)

    Lousma, Jack R.

    1992-01-01

    Activities and accomplishments of the first six months of the Consortium for International Earth Science Information Network (CIESIN's) 1992 technical program have focused on four main missions: (1) the development and implementation of plans for initiation of the Socioeconomic Data and Applications Center (SEDAC) as part of the EOSDIS Program; (2) the pursuit and development of a broad-based global change information cooperative by providing systems analysis and integration between natural science and social science data bases held by numerous federal agencies and other sources; (3) the fostering of scientific research into the human dimensions of global change and providing integration between natural science and social science data and information; and (4) the serving of CIESIN as a gateway for global change data and information distribution through development of the Global Change Research Information Office and other comprehensive knowledge sharing systems.

  4. Benefits and problems in implementation for integrated medical information system

    International Nuclear Information System (INIS)

    Park, Chang Seo; Kim, Kee Deog; Park, Hyok; Jeong, Ho Gul

    2005-01-01

    Once the decision has been made to adopt an integrated medical information system (IMIS), there are a number of issues to overcome. Users need to be aware of the impact the change will make on end users and be prepared to address issues that arise before they become problems. The purpose of this study is to investigate the benefits and unexpected problems encountered in the implementation of IMIS and to determine a useful framework for IMIS. The Yonsei University Dental Hospital is steadily constructing an IMIS. The vendor's PACS software, Piview STAR, supports transactions between workstations that conform to Integrating the Healthcare Enterprise (IHE), with security functions. It is necessary to develop an excellent framework that serves the patient, the health care provider and information system vendors in an expert, efficient, and cost-effective manner. The problems encountered with IMIS implementation were high initial investments, delay of EMR enforcement, underdevelopment of digital radiographic appliances and software, and insufficient educational training for users. The clinical environment of dental IMIS differs somewhat from the medical situation. The best way to overcome these differences is to establish a gold standard of dental IMIS integration, which estimates the cost payback. The IHE and its technical framework are good for the patient, the health care provider and all information systems vendors.

  5. Information Science and integrative Science. A systemic approach to information units

    Directory of Open Access Journals (Sweden)

    Rita Dolores Santaella Ruiz

    2006-01-01

    Full Text Available Structured in two parts, "Documentation as an integrating science" and "A systematic approach to documentary units", this work understands Documentation from an integrating perspective, derived from the kinship implied by a shared modus operandi across information systems through the use of communication technologies. Drawing on the General Theory of Systems, the present work interprets this multidisciplinary science as a system formed by technical subsystems, elements and individuals.

  6. Extraction of implicit information in English advertising texts with phonetic and lexical-morphological means

    Directory of Open Access Journals (Sweden)

    Traikovskaya Natalya Petrovna

    2015-12-01

    Full Text Available The article deals with phonetic and lexical-morphological language means participating in the process of extracting implicit information in English-language advertising texts for men and women. The functioning of phonetic means of the English language is not the basis for the implication of information in advertising texts. Lexical and morphological means play the role of markers of relevant information, acting as activators of implicit information in the texts of advertising.

  7. Post-processing of Deep Web Information Extraction Based on Domain Ontology

    Directory of Open Access Journals (Sweden)

    PENG, T.

    2013-11-01

    Full Text Available Many methods are utilized to extract and process query results in the deep Web, relying on the different structures of Web pages and the various design modes of databases. However, some semantic meanings and relations are ignored. In this paper, we present an approach for post-processing deep Web query results based on domain ontology, which can utilize those semantic meanings and relations. A block identification model (BIM) based on node similarity is defined to extract data blocks that are relevant to a specific domain after reducing noisy nodes. A feature vector of domain books is obtained by a result set extraction model (RSEM) based on the vector space model (VSM). RSEM, in combination with BIM, builds the domain ontology on books, which not only removes the limit of Web page structures when extracting data information, but also makes use of the semantic meanings of domain ontology. After extracting basic information of Web pages, a ranking algorithm is adopted to offer an ordered list of data records to users. Experimental results show that BIM and RSEM extract data blocks and build domain ontology accurately. In addition, relevant data records and basic information are extracted and ranked. The precision and recall results show that our proposed method is feasible and efficient.
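
    The record sketches BIM and RSEM only at a high level; conventionally, the vector space model step represents each candidate block as a term vector and scores it against a domain feature vector by cosine similarity. A minimal illustration under that assumption (tokenizer, threshold and data are hypothetical):

    ```python
    import math
    from collections import Counter

    def tf_vector(text):
        # Term-frequency vector of a text block (toy whitespace tokenizer)
        return Counter(text.lower().split())

    def cosine(u, v):
        # Cosine similarity between two sparse term-frequency vectors
        dot = sum(u[t] * v[t] for t in u if t in v)
        norm = math.sqrt(sum(c * c for c in u.values())) * \
               math.sqrt(sum(c * c for c in v.values()))
        return dot / norm if norm else 0.0

    domain = tf_vector("book title author publisher isbn price edition")
    blocks = ["book title data mining author han price 45",
              "copyright 2013 all rights reserved site map"]
    # Keep blocks whose similarity to the domain vector passes a threshold
    relevant = [b for b in blocks if cosine(tf_vector(b), domain) > 0.2]
    print(relevant)  # only the book data block survives
    ```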

  8. Fabricating and Characterizing the Microfluidic Solid Phase Extraction Module Coupling with Integrated ESI Emitters

    Directory of Open Access Journals (Sweden)

    Hangbin Tang

    2018-05-01

    Full Text Available Microfluidic chips coupled with mass spectrometry (MS) will be of great significance to the development of relevant instruments involving chemical and bio-chemical analysis, drug detection, food and environmental applications and so on. In our previous works, we proposed two types of microfluidic electrospray ionization (ESI) chip coupling with MS: the two-phase flow focusing (FF) ESI microfluidic chip and the corner-integrated ESI emitter, respectively. However, integrating a pretreatment module with these ESI emitters is still a challenging problem. In this paper, we concentrated on integrating a solid phase micro-extraction (SPME) module with our previously proposed on-chip ESI emitters; the fabrication processes of such an SPME module are fully compatible with our previously proposed ESI emitters based on multi-layer soft lithography. We optimized the structure of the integrated chip and characterized its performance using standard samples. Furthermore, we verified its abilities of salt removal, extraction of multiple analytes and separation through on-chip elution, using mimicked biological urine spiked with different drugs. The results indicated that our proposed module integrated with ESI emitters is practical and effective for real biological sample pretreatment and MS detection.

  9. Optimization of the German integrated information and measurement system (IMIS)

    International Nuclear Information System (INIS)

    Wirth, E.; Weiss, W.

    2002-01-01

    The Chernobyl accident led to a widespread contamination of the environment in most European countries. In Germany, as in all other countries, it took some time to evaluate the radiological situation, time which is extremely valuable in the early phases of an accident when decisions on countermeasures like sheltering, iodine prophylaxis or evacuation have to be taken. For better emergency preparedness, the Integrated Information and Measurement System (IMIS) has been developed and established in Germany. In case of a widespread contamination of the environment, the system will provide the decision makers with all information necessary to evaluate the radiological situation and to decide on countermeasures. Presently this system is being upgraded through the adoption of the European decision support system RODOS and the improvement of the national information exchange. For this purpose the web based information system ELAN has been developed. The national systems have to be integrated into the European and international communication systems. In this presentation the IMIS system is briefly described and the new features and modules of the system are discussed in greater detail

  10. An integrated process for the extraction of fuel and chemicals from marine macroalgal biomass

    Science.gov (United States)

    Trivedi, Nitin; Baghel, Ravi S.; Bothwell, John; Gupta, Vishal; Reddy, C. R. K.; Lali, Arvind M.; Jha, Bhavanath

    2016-07-01

    We describe an integrated process that can be applied to biomass of the green seaweed, Ulva fasciata, to allow the sequential recovery of four economically important fractions: mineral rich liquid extract (MRLE), lipid, ulvan, and cellulose. The main benefits of our process are: a) its simplicity and b) the consistent yields obtained from the residual biomass after each successive extraction step. For example, dry Ulva biomass yields ~26% of its starting mass as MRLE, ~3% as lipid, ~25% as ulvan, and ~11% as cellulose, with the enzymatic hydrolysis and fermentation of the final cellulose fraction under optimized conditions producing ethanol at a competitive 0.45 g/g reducing sugar. These yields are comparable to those obtained by direct processing of the individual components from primary biomass. We propose that this integration of ethanol production and chemical feedstock recovery from macroalgal biomass could substantially enhance the sustainability of marine biomass use.
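
    A back-of-envelope mass balance using the yields quoted above; the hydrolysis efficiency and the glucose hydration factor below are illustrative assumptions, not values from the record.

    ```python
    dry_biomass_g = 1000.0  # 1 kg of dry Ulva biomass

    # Sequential fraction yields quoted in the record (w/w of dry biomass)
    mrle_g      = 0.26 * dry_biomass_g  # mineral rich liquid extract, ~26%
    lipid_g     = 0.03 * dry_biomass_g  # lipid, ~3%
    ulvan_g     = 0.25 * dry_biomass_g  # ulvan, ~25%
    cellulose_g = 0.11 * dry_biomass_g  # cellulose, ~11%

    # Assumed (not from the record): 80% hydrolysis efficiency and
    # 1.11 g glucose per g cellulose from water uptake during hydrolysis.
    reducing_sugar_g = cellulose_g * 1.11 * 0.80
    ethanol_g = 0.45 * reducing_sugar_g  # 0.45 g/g reducing sugar (record)

    print(f"MRLE {mrle_g:.0f} g, lipid {lipid_g:.0f} g, ulvan {ulvan_g:.0f} g")
    print(f"ethanol ~{ethanol_g:.0f} g per kg dry biomass")  # ~44 g
    ```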

  11. Distant supervision for neural relation extraction integrated with word attention and property features.

    Science.gov (United States)

    Qu, Jianfeng; Ouyang, Dantong; Hua, Wen; Ye, Yuxin; Li, Ximing

    2018-04-01

    Distant supervision for neural relation extraction is an efficient approach to extracting massive relations with reference to plain texts. However, the existing neural methods fail to capture the critical words in sentence encoding and meanwhile lack useful sentence information for some positive training instances. To address the above issues, we propose a novel neural relation extraction model. First, we develop a word-level attention mechanism to distinguish the importance of each individual word in a sentence, increasing the attention weights for those critical words. Second, we investigate the semantic information from word embeddings of target entities, which can be developed as a supplementary feature for the extractor. Experimental results show that our model outperforms previous state-of-the-art baselines. Copyright © 2018 Elsevier Ltd. All rights reserved.
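
    The record describes its word-level attention only verbally; a common formulation scores each word embedding against a relation query vector and softmax-normalizes the scores, as in the minimal numpy sketch below (dimensions and names are assumptions, not the authors' code).

    ```python
    import numpy as np

    def word_attention(word_embs, query):
        # word_embs: (seq_len, dim) embeddings of one sentence
        # query: (dim,) vector representing the candidate relation
        scores = word_embs @ query               # one relevance score per word
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                 # softmax attention weights
        return weights, weights @ word_embs      # weighted sentence encoding

    rng = np.random.default_rng(0)
    embs = rng.normal(size=(6, 50))   # 6 words, 50-dim embeddings
    rel_query = rng.normal(size=50)
    alpha, sent_vec = word_attention(embs, rel_query)
    print(alpha.round(3), sent_vec.shape)  # critical words get larger weights
    ```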

  12. A Statistical Texture Feature for Building Collapse Information Extraction of SAR Image

    Science.gov (United States)

    Li, L.; Yang, H.; Chen, Q.; Liu, X.

    2018-04-01

    Synthetic Aperture Radar (SAR) has become one of the most important ways to extract post-disaster collapsed building information, due to its extreme versatility and almost all-weather, day-and-night working capability. In view of the fact that the inherent statistical distribution of speckle in SAR images is not normally used to extract collapsed building information, this paper proposes a novel texture feature based on statistical models of SAR images to extract collapsed buildings. In the proposed feature, the texture parameter of the G0 distribution of SAR images is used to reflect the uniformity of the target and thereby extract the collapsed buildings. This feature not only considers the statistical distribution of SAR images, providing a more accurate description of the object texture, but is also applicable to extracting collapsed building information from single-, dual- or full-polarization SAR data. The RADARSAT-2 data of the Yushu earthquake, acquired on April 21, 2010, are used to present and analyze the performance of the proposed method. In addition, the applicability of this feature to SAR data with different polarizations is also analyzed, which provides decision support for data selection in collapsed building information extraction.
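
    The record does not state the G0 density itself; in the SAR statistics literature the intensity G0 law is commonly written as below (reconstructed from standard references, not from this record), where n is the number of looks, γ a scale parameter, and the roughness parameter α plays the role of the texture parameter: α near zero indicates heterogeneous targets such as built-up or collapsed areas, while strongly negative α indicates homogeneous regions.

    ```latex
    f_{G^0}(x) = \frac{n^{n}\,\Gamma(n-\alpha)}{\gamma^{\alpha}\,\Gamma(n)\,\Gamma(-\alpha)}
                 \cdot \frac{x^{\,n-1}}{(\gamma + n x)^{\,n-\alpha}},
    \qquad x > 0,\; \alpha < 0,\; \gamma > 0,\; n \ge 1.
    ```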

  13. A method for automating the extraction of specialized information from the web

    NARCIS (Netherlands)

    Lin, L.; Liotta, A.; Hippisley, A.; Hao, Y.; Liu, J.; Wang, Y.; Cheung, Y-M.; Yin, H.; Jiao, L.; Ma, j.; Jiao, Y-C.

    2005-01-01

    The World Wide Web can be viewed as a gigantic distributed database including millions of interconnected hosts some of which publish information via web servers or peer-to-peer systems. We present here a novel method for the extraction of semantically rich information from the web in a fully

  14. Information analysis of iris biometrics for the needs of cryptology key extraction

    Directory of Open Access Journals (Sweden)

    Adamović Saša

    2013-01-01

    Full Text Available The paper presents a rigorous analysis of iris biometric information for the synthesis of an optimized system for the extraction of a high quality cryptographic key. Estimates of local entropy and mutual information were used to identify the segments of the iris most suitable for this purpose. Parameters of the corresponding wavelet transforms were then optimized in order to obtain the highest possible entropy and the lowest possible mutual information in the transform domain, which sets the framework for the synthesis of systems for the extraction of truly random sequences from iris biometrics, without compromising authentication properties. [Project of the Ministry of Science of the Republic of Serbia, no. TR32054 and no. III44006]
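
    As a rough illustration of the local-entropy screening the record describes, the sketch below estimates the empirical Shannon entropy of each segment of a binarized iris code and keeps the most entropic segments as key-extraction candidates; the code size, segmentation and names are assumptions for demonstration only.

    ```python
    import numpy as np

    def shannon_entropy(bits):
        # Empirical Shannon entropy (bits per symbol) of a 0/1 sequence
        p1 = bits.mean()
        if p1 in (0.0, 1.0):
            return 0.0
        p0 = 1.0 - p1
        return -(p0 * np.log2(p0) + p1 * np.log2(p1))

    rng = np.random.default_rng(1)
    iris_code = rng.integers(0, 2, size=2048)  # toy binarized iris code
    segments = iris_code.reshape(16, 128)      # 16 local segments
    entropies = [shannon_entropy(s) for s in segments]
    best = np.argsort(entropies)[-4:]          # most entropic segments
    print(best, [round(entropies[i], 3) for i in best])
    ```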

  15. Linking attentional processes and conceptual problem solving: visual cues facilitate the automaticity of extracting relevant information from diagrams.

    Science.gov (United States)

    Rouinfar, Amy; Agra, Elise; Larson, Adam M; Rebello, N Sanjay; Loschky, Lester C

    2014-01-01

    This study investigated links between visual attention processes and conceptual problem solving. This was done by overlaying visual cues on conceptual physics problem diagrams to direct participants' attention to relevant areas to facilitate problem solving. Participants (N = 80) individually worked through four problem sets, each containing a diagram, while their eye movements were recorded. Each diagram contained regions that were relevant to solving the problem correctly and separate regions related to common incorrect responses. Problem sets contained an initial problem, six isomorphic training problems, and a transfer problem. The cued condition saw visual cues overlaid on the training problems. Participants' verbal responses were used to determine their accuracy. This study produced two major findings. First, short duration visual cues which draw attention to solution-relevant information and aid in organizing and integrating it facilitate both immediate problem solving and generalization of that ability to new problems. Thus, visual cues can facilitate re-representing a problem and overcoming impasse, enabling a correct solution. Importantly, these cueing effects on problem solving did not involve the solvers' attention necessarily embodying the solution to the problem, but were instead caused by solvers attending to and integrating relevant information in the problems into a solution path. Second, this study demonstrates that when such cues are used across multiple problems, solvers can automatize the extraction of problem-relevant information. These results suggest that low-level attentional selection processes provide a necessary gateway for relevant information to be used in problem solving, but are generally not sufficient for correct problem solving. Instead, factors that lead a solver to an impasse and to organize and integrate problem information also greatly facilitate arriving at correct solutions.

  16. Integration of Solid-phase Extraction with Electrothermal Atomic Absorption Spectrometry for Determination of Trace Elements

    OpenAIRE

    NUKATSUKA, Isoshi; OHZEKI, Kunio

    2006-01-01

    An enrichment step in sample treatment is essential for trace analysis, to improve the sensitivity and to eliminate the matrix of the sample. Solid-phase extraction (SPE) is one of the most widely used enrichment techniques. Electrothermal atomic absorption spectrometry (ETAAS) is a well-established determination technique for trace elements. The integration of SPE with ETAAS leads to further improvement of sensitivity, automation of the measurement and economy in the sample size, amounts o...

  17. Performance measurement integrated information framework in e-Manufacturing

    Science.gov (United States)

    Teran, Hilaida; Hernandez, Juan Carlos; Vizán, Antonio; Ríos, José

    2014-11-01

    The implementation of Internet technologies has led to e-Manufacturing technologies becoming more widely used and to the development of tools for compiling, transforming and synchronising manufacturing data through the Web. In this context, a potential area for development is the extension of virtual manufacturing to performance measurement (PM) processes, a critical area for decision making and implementing improvement actions in manufacturing. This paper proposes a PM information framework to integrate decision support systems in e-Manufacturing. Specifically, the proposed framework offers a homogeneous PM information exchange model that can be applied to decision support in an e-Manufacturing environment. Its application improves the interoperability needed in decision-making data processing tasks. It comprises three sub-systems: a data model, a PM information platform and a PM-Web services architecture. A practical example of data exchange for measurement processes in the area of equipment maintenance is shown to demonstrate the utility of the model.

  18. Three-dimensional multi-terminal superconductive integrated circuit inductance extraction

    International Nuclear Information System (INIS)

    Fourie, Coenrad J; Wetzstein, Olaf; Kunert, Jürgen; Ortlepp, Thomas

    2011-01-01

    Accurate inductance calculations are critical for the design of both digital and analogue superconductive integrated circuits, and three-dimensional calculations are gaining importance with the advent of inductive biasing, inductive coupling and sky plane shielding for RSFQ cells. InductEx, an extraction programme based on the three-dimensional calculation software FastHenry, was proposed earlier. InductEx uses segmentation techniques designed to accurately model the geometries of superconductive integrated circuit structures. Inductance extraction for complex multi-terminal three-dimensional structures from current distributions calculated by FastHenry is discussed. Results for both a reflection plane modelling an infinite ground plane and a finite segmented ground plane that allows inductive elements to extend over holes in the ground plane are shown. Several SQUIDs were designed for and fabricated with IPHT's 1 kA cm −2 RSFQ1D niobium process. These SQUIDs implement a number of loop structures that span different layers, include vias, inductively coupled control lines and ground plane holes. We measured the loop inductance of these SQUIDs and show how the results are used to calibrate the layer parameters in InductEx and verify the extraction accuracy. We also show that, with proper modelling, FastHenry can be fast enough to be used for the extraction of typical RSFQ cell inductances.

  19. Definition of information technology architectures for continuous data management and medical device integration in diabetes.

    Science.gov (United States)

    Hernando, M Elena; Pascual, Mario; Salvador, Carlos H; García-Sáez, Gema; Rodríguez-Herrero, Agustín; Martínez-Sarriegui, Iñaki; Gómez, Enrique J

    2008-09-01

    The growing availability of continuous data from medical devices in diabetes management makes it crucial to define novel information technology architectures for efficient data storage, data transmission, and data visualization. The new paradigm of care demands the sharing of information in interoperable systems as the only way to support patient care in a continuum of care scenario. The technological platforms should support all the services required by the actors involved in the care process, located in different scenarios and managing diverse information for different purposes. This article presents basic criteria for defining flexible and adaptive architectures that are capable of interoperating with external systems, and integrating medical devices and decision support tools to extract all the relevant knowledge to support diabetes care.

  20. Waste Information Management System with Integrated Transportation Forecast Data

    International Nuclear Information System (INIS)

    Upadhyay, H.; Quintero, W.; Shoffner, P.; Lagos, L.

    2009-01-01

    The Waste Information Management System with Integrated Transportation Forecast Data was developed to support the Department of Energy (DOE) mandated accelerated cleanup program. The schedule compression required close coordination and a comprehensive review and prioritization of the barriers that impeded treatment and disposition of the waste streams at each site. Many issues related to site waste treatment and disposal were potential critical path issues under the accelerated schedules. In order to facilitate accelerated cleanup initiatives, waste managers at DOE field sites and at DOE Headquarters in Washington, D.C., needed timely waste forecast and transportation information regarding the volumes and types of waste that would be generated by the DOE sites over the next 40 years. Each local DOE site has historically collected, organized, and displayed site waste forecast information in separate and unique systems. However, waste and shipment information from all sites needed a common application to allow interested parties to understand and view the complete complex-wide picture. The Waste Information Management System with Integrated Transportation Forecast Data allows identification of total forecasted waste volumes, material classes, disposition sites, choke points, technological or regulatory barriers to treatment and disposal, along with forecasted waste transportation information by rail, truck and inter-modal shipments. The Applied Research Center (ARC) at Florida International University (FIU) in Miami, Florida, has deployed the web-based forecast and transportation system and is responsible for updating the waste forecast and transportation data on a regular basis to ensure the long-term viability and value of this system. (authors)

  1. IHE, Solution for integration of information systems and PACS

    Directory of Open Access Journals (Sweden)

    Milad Janghorban Lariche

    2014-10-01

    Full Text Available PACS is used as a way to store images; it matches well with the workflow in the radiology department and can spread to other parts of the hospital. Integration with other PACS and other hospital systems, like the radiology information system (RIS), hospital information system (HIS), and electronic patient records, has largely been accomplished, but there are still problems. PACS also provides good conditions for setting up tele-radiology. The next step for PACS is where hospitals and health care organizations share images in an integrated electronic patient record. Among the different ways of sharing images between different hospitals, the IHE (Integrating the Healthcare Enterprise) standard includes the cross-enterprise document sharing profile (XDS), which allows sharing images from various hospitals even if their PACS have different brands and different vendors. Application of XDS is useful for sharing images between health care organizations without duplicating them in a central archive; images need only be indexed in a central registry. In the XDS profile, IHE defines an indexing mechanism for publishing and indexing images in the central document registry. IHE also defines mechanisms to be used by each hospital to retrieve images, regardless of which hospital PACS stores them.

  2. Impact of informal institutions on the development integration processes

    Directory of Open Access Journals (Sweden)

    Sidorova Alexandra, M.

    2015-06-01

    Full Text Available The paper deals with the impact of informal institutions on the definition of the vector of integration processes and on the development of integration processes in the countries of the Customs Union and Ukraine. The degree of scientific development of the phenomenon in different economic schools is determined in this article. Economic mentality is a basic informal institution which determines the degree of effectiveness of integration processes. This paper examines the nature, characteristics and effects of economic mentality on the economic activities of people. The ethnometrical method allows quantifying the economic mentality, which enables a deeper understanding and analysis of the formation and functioning of the political and economic system, especially business and management, and of establishing contacts with other cultures. The modern Belarusian economic mentality was measured based on Hofstede's international methodology and compared with the economic mentality of Russia, Ukraine and Kazakhstan. With the help of cluster analysis, the congruence of the economic mentality of the Customs Union countries and Ukraine was determined. The economic mentality of these countries was also compared with that of other countries in order to identify the main types of economic culture.

  3. TOMATOMICS: A Web Database for Integrated Omics Information in Tomato

    KAUST Repository

    Kudo, Toru; Kobayashi, Masaaki; Terashima, Shin; Katayama, Minami; Ozaki, Soichi; Kanno, Maasa; Saito, Misa; Yokoyama, Koji; Ohyanagi, Hajime; Aoki, Koh; Kubo, Yasutaka; Yano, Kentaro

    2016-01-01

    Solanum lycopersicum (tomato) is an important agronomic crop and a major model fruit-producing plant. To facilitate basic and applied research, comprehensive experimental resources and omics information on tomato are available following their development. Mutant lines and cDNA clones from a dwarf cultivar, Micro-Tom, are two of these genetic resources. Large-scale sequencing data for ESTs and full-length cDNAs from Micro-Tom continue to be gathered. In conjunction with information on the reference genome sequence of another cultivar, Heinz 1706, the Micro-Tom experimental resources have facilitated comprehensive functional analyses. To enhance the efficiency of acquiring omics information for tomato biology, we have integrated the information on the Micro-Tom experimental resources and the Heinz 1706 genome sequence. We have also inferred gene structure by comparison of sequences between the genome of Heinz 1706 and the transcriptome, which are comprised of Micro-Tom full-length cDNAs and Heinz 1706 RNA-seq data stored in the KaFTom and Sequence Read Archive databases. In order to provide large-scale omics information with streamlined connectivity we have developed and maintain a web database TOMATOMICS (http://bioinf.mind.meiji.ac.jp/tomatomics/). In TOMATOMICS, access to the information on the cDNA clone resources, full-length mRNA sequences, gene structures, expression profiles and functional annotations of genes is available through search functions and the genome browser, which has an intuitive graphical interface.

  4. TOMATOMICS: A Web Database for Integrated Omics Information in Tomato

    KAUST Repository

    Kudo, Toru

    2016-11-29

    Solanum lycopersicum (tomato) is an important agronomic crop and a major model fruit-producing plant. To facilitate basic and applied research, comprehensive experimental resources and omics information on tomato are available following their development. Mutant lines and cDNA clones from a dwarf cultivar, Micro-Tom, are two of these genetic resources. Large-scale sequencing data for ESTs and full-length cDNAs from Micro-Tom continue to be gathered. In conjunction with information on the reference genome sequence of another cultivar, Heinz 1706, the Micro-Tom experimental resources have facilitated comprehensive functional analyses. To enhance the efficiency of acquiring omics information for tomato biology, we have integrated the information on the Micro-Tom experimental resources and the Heinz 1706 genome sequence. We have also inferred gene structure by comparison of sequences between the genome of Heinz 1706 and the transcriptome, which are comprised of Micro-Tom full-length cDNAs and Heinz 1706 RNA-seq data stored in the KaFTom and Sequence Read Archive databases. In order to provide large-scale omics information with streamlined connectivity we have developed and maintain a web database TOMATOMICS (http://bioinf.mind.meiji.ac.jp/tomatomics/). In TOMATOMICS, access to the information on the cDNA clone resources, full-length mRNA sequences, gene structures, expression profiles and functional annotations of genes is available through search functions and the genome browser, which has an intuitive graphical interface.

  5. The impact of IAIMS on the work of information experts. Integrated Advanced Information Management Systems.

    Science.gov (United States)

    Ash, J

    1995-10-01

    Integrated Advanced Information Management Systems (IAIMS) programs differ but have certain characteristics in common. Technological and organizational integration are universal goals. As integration takes place, what happens to those implementing the vision? A survey of 125 staff members, or information experts, involved in information or informatics at an IAIMS-funded institution was conducted during the last year of the implementation phase. The purpose was to measure the impact of IAIMS on the jobs of those in the library and related service units, and the computing, telecommunications, and health informatics divisions. The researchers used newly developed scales measuring levels of integration (knowledge of and involvement with other departments), customer orientation (focus on the user), and informatedness (changes in the nature of work beyond automation of former routines). Ninety-four percent of respondents indicated that their jobs had changed a great deal; the changes were similar regardless of division. To further investigate the impact of IAIMS on librarians in particular, a separate skills survey was conducted. The IAIMS librarians indicated that technology and training skills are especially needed in the new, integrated environment.

  6. Utilization of Integrated Geophysical Techniques to Delineate the Extraction of Mining Bench of Ornamental Rocks (Marble

    Directory of Open Access Journals (Sweden)

    Julián Martínez

    2017-12-01

    Full Text Available Low yields in ornamental rock mining remain one of the most important problems in this industry. This fact is usually associated with the presence of anisotropies in the rock, which make it difficult to extract the blocks. Optimised planning of the exploitation, together with an improved geological understanding of the deposit, could increase these yields. In this work, marble mining in Macael (Spain) was studied to test the capacity of non-destructive geophysical prospecting methods (GPR and ERI) as tools to characterize the geology of the deposit. It is well known that the ERI method provides a greater penetration depth. By using this technique, it is possible to distinguish the boundaries between the marble and the underlying micaschists and the morphology of the unit to be exploited, and even to identify fracture zones. Therefore, this technique could be used in the early stages of research to estimate the reserves of the deposit. The GPR methodology, with a lower penetration depth, is able to offer more detailed information. Specifically, it detects lateral and vertical changes of the facies inside the marble unit, as well as anisotropies of the rock (fractures or holes). This technique would be suitable for use in a second stage of research. On the one hand, it is very useful for characterization of the texture and fabric of the rock, which allows us to determine its properties in advance, and therefore its quality for ornamental use. On the other hand, the localization of anisotropies using the GPR technique will make it possible to improve the planning of the rock exploitation in order to increase yields. Both integrated geophysical techniques are effective for assessing the quality of ornamental rock and thus can serve as useful tools in mine planning to improve yields and costs.

  7. Integrated radiation information system in the Czech Republic

    International Nuclear Information System (INIS)

    Drabova, D.; Prouza, Z.; Malatova, I.; Kuca, P.; Bucina, I.

    1998-01-01

    The outline and organizational structure of the radiation monitoring network (RMN) in the Czech Republic conform with similar networks abroad. This integrated system of a number of components serves for: continuous monitoring of the radiation situation on the territory of the Czech Republic; detecting an abnormal radiological situation due to a domestic source; detecting a non-notified accident abroad with consequences on the territory of the Czech Republic; monitoring its evolution; determining the components of any radioactivity discharge; first estimation of the accident extent; forecasting the accident development and the dispersion of radionuclides in the vicinity of the source; acquisition of the basis for decisions upon evacuation and other countermeasures and remedial actions; assessment and forecast of contamination for regulation of food and water consumption; and review of enforced countermeasures based on actual monitoring data and refined forecasts. For model calculations and decision making in case of a nuclear accident, an integrated comprehensive computer-based information system is now being set up in the Czech Republic. (R.P.)

  8. The informed application of building-integrated wind power

    Energy Technology Data Exchange (ETDEWEB)

    Breshears, J.; Briscoe, C. [Zimmer Gunsal Frasca Architects, Portland, OR (United States)

    2009-07-01

    This paper reported on an exercise that was undertaken to integrate small-scale wind turbines into the design of an urban high-rise in Portland, Oregon. Wind behaviour in the urban environment is very complex, as the flow of wind over and around buildings often triggers multiple transitions of the air from laminar flow to turbulent. The study documented the process of moving beyond a simplistic approach to a truly informed application of building-integrated wind generation. The 4 key issues addressed in the study process were quantifying the geographical wind regime; predicting wind flow over the building; turbine selection; and pragmatics regarding the design of roof mounting to accommodate structural loads and mitigate vibration. The results suggested that the turbine array should produce in the range of only 1 per cent of the electrical load of the building. 13 refs., 11 figs.

  9. Unifying Kohlberg with Information Integration: The Moral Algebra of Recompense and of Kohlbergian Moral Informers

    Science.gov (United States)

    Hommers, Wilfried; Lee, Wha-Yong

    2010-01-01

    In order to unify two major theories of moral judgment, a novel task is employed which combines elements of Kohlberg's stage theory and of the theory of information integration. In contrast to the format of Kohlberg's moral judgment interview, a nonverbal and quantitative response which makes low demands on verbal facility was used. Moral…

  10. Integrated information system for analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Galperin, A.

    1994-01-01

    Performing complicated engineering analyses of a nuclear power plant requires storage and manipulation of a large amount of information, both data and knowledge. This information is characterized by its multidisciplinary nature, complexity, and diversity. The problems caused by inefficient and lengthy manual operations involving the data flow management within the framework of the safety-related analysis of a power plant can be solved by applying computer-aided engineering principles. These principles are the basis of the design of an integrated information storage system (IRIS). The basic idea is to create a computerized environment which includes both database and functional capabilities. Consideration and analysis of the data types, required data manipulation capabilities, and operational requirements resulted in the choice of an object-oriented database management system (OODBMS) as a development platform for solving the software engineering problems. Several advantages of OODBMSs over conventional relational database systems were found to be of crucial importance, especially the flexibility they provide for different data types and their extensibility potential. A detailed design of a data model is produced for the plant technical data and for the storage of analysis results. The overall system architecture was designed to assure the feasibility of integrating database capabilities with procedures and functions written in conventional algorithmic programming languages

  11. Strengthening Rehabilitation in Health Systems Worldwide by Integrating Information on Functioning in National Health Information Systems.

    Science.gov (United States)

    Stucki, Gerold; Bickenbach, Jerome; Melvin, John

    2017-09-01

    A complete understanding of the experience of health requires information relevant not merely to the health indicators of mortality and morbidity but also to functioning, that is, information about what it means to live in a health state, "the lived experience of health." Not only is functioning information relevant to healthcare and the overall objectives of person-centered healthcare, but also to the successful operation of all components of health systems. In light of population aging and major epidemiological trends, rehabilitation, whose aim has always been to optimize functioning and minimize disability, will become a key health strategy. The increasing prominence of the rehabilitative strategy within the health system drives the argument for the integration of functioning information as an essential component in national health information systems. Rehabilitation professionals and researchers have long recognized in WHO's International Classification of Functioning, Disability and Health the best prospect for an internationally recognized, sufficiently complete and powerful information reference for the documentation of functioning information. This paper opens the discussion of the promise of integrating the ICF as an essential component in national health systems to secure access to functioning information for rehabilitation, across health systems and countries.

  12. The integration of Information and Communication Technology into medical practice.

    Science.gov (United States)

    Lupiáñez-Villanueva, Francisco; Hardey, Michael; Torrent, Joan; Ficapal, Pilar

    2010-07-01

    To identify doctors' utilization of ICT; to develop and characterise a typology of doctors' utilization of ICT and to identify factors that can enhance or inhibit the use of these technologies within medical practice. An online survey of the 16,531 members of the Physicians Association of Barcelona who had a registered email account in 2006 was carried out. Factor analysis, cluster analysis and binomial logit modelling were undertaken. Multivariate statistical analysis of the 2199 responses obtained revealed two profiles of adoption of ICT. The first profile (38.61% of respondents) represents those doctors who place high emphasis on ICT within their practice. This group is thus referred to as 'integrated doctors'. The second profile (61.39% of respondents) represents those doctors who make less use of ICT and are consequently labelled 'non-integrated doctors'. From the statistical modelling, it was observed that an emphasis on international information; an emphasis on ICT for research and medical practice; an emphasis on information systems to consult and prescribe; undertaking teaching/research activities; a belief that the use of the Internet improves communication with patients; and practice in both public and private health organizations play a positive and significant role in the probability of being an 'integrated doctor'. The integration of ICT within medical practice cannot be adequately understood and appreciated without examining how doctors are making use of ICT within their own practice and organizational contexts, and the opportunities and constraints afforded by institutional, professional and patient expectations and demands. 2010 Elsevier Ireland Ltd. All rights reserved.
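
    As an illustration of the binomial logit step reported above, the sketch below fits such a model on synthetic data; the predictors only mirror the study's factors, and the coefficients are invented.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(42)
    n = 500
    # Hypothetical binary predictors mirroring the study's factors
    X = np.column_stack([
        rng.integers(0, 2, n),  # emphasis on ICT for research and practice
        rng.integers(0, 2, n),  # undertakes teaching/research activities
        rng.integers(0, 2, n),  # believes the Internet improves patient contact
    ])
    logit_p = -1.0 + 1.2 * X[:, 0] + 0.8 * X[:, 1] + 0.6 * X[:, 2]
    y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))  # 1 = 'integrated doctor'

    model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
    print(model.params)  # positive coefficients raise the probability
    ```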

  13. An Integrative Behavioral Model of Information Security Policy Compliance

    Directory of Open Access Journals (Sweden)

    Sang Hoon Kim

    2014-01-01

    Full Text Available The authors found the behavioral factors that influence the organization members’ compliance with the information security policy in organizations on the basis of neutralization theory, Theory of planned behavior, and protection motivation theory. Depending on the theory of planned behavior, members’ attitudes towards compliance, as well as normative belief and self-efficacy, were believed to determine the intention to comply with the information security policy. Neutralization theory, a prominent theory in criminology, could be expected to provide the explanation for information system security policy violations. Based on the protection motivation theory, it was inferred that the expected efficacy could have an impact on intentions of compliance. By the above logical reasoning, the integrative behavioral model and eight hypotheses could be derived. Data were collected by conducting a survey; 194 out of 207 questionnaires were available. The test of the causal model was conducted by PLS. The reliability, validity, and model fit were found to be statistically significant. The results of the hypotheses tests showed that seven of the eight hypotheses were acceptable. The theoretical implications of this study are as follows: (1) the study is expected to play a role of the baseline for future research about organization members’ compliance with the information security policy, (2) the study attempted an interdisciplinary approach by combining psychology and information system security research, and (3) the study suggested concrete operational definitions of influencing factors for information security policy compliance through a comprehensive theoretical review. Also, the study has some practical implications. First, it can provide the guideline to support the successful execution of the strategic establishment for the implement of information system security policies in organizations. Second, it proves that the need of education and training

  14. An integrative behavioral model of information security policy compliance.

    Science.gov (United States)

    Kim, Sang Hoon; Yang, Kyung Hoon; Park, Sunyoung

    2014-01-01

    The authors found the behavioral factors that influence the organization members' compliance with the information security policy in organizations on the basis of neutralization theory, Theory of planned behavior, and protection motivation theory. Depending on the theory of planned behavior, members' attitudes towards compliance, as well as normative belief and self-efficacy, were believed to determine the intention to comply with the information security policy. Neutralization theory, a prominent theory in criminology, could be expected to provide the explanation for information system security policy violations. Based on the protection motivation theory, it was inferred that the expected efficacy could have an impact on intentions of compliance. By the above logical reasoning, the integrative behavioral model and eight hypotheses could be derived. Data were collected by conducting a survey; 194 out of 207 questionnaires were available. The test of the causal model was conducted by PLS. The reliability, validity, and model fit were found to be statistically significant. The results of the hypotheses tests showed that seven of the eight hypotheses were acceptable. The theoretical implications of this study are as follows: (1) the study is expected to play a role of the baseline for future research about organization members' compliance with the information security policy, (2) the study attempted an interdisciplinary approach by combining psychology and information system security research, and (3) the study suggested concrete operational definitions of influencing factors for information security policy compliance through a comprehensive theoretical review. Also, the study has some practical implications. First, it can provide the guideline to support the successful execution of the strategic establishment for the implement of information system security policies in organizations. Second, it proves that the need of education and training programs suppressing

  15. Domain XML semantic integration based on extraction rules and ontology mapping

    Directory of Open Access Journals (Sweden)

    Huayu LI

    2016-08-01

    Full Text Available Plenty of XML documents exist in the petroleum engineering field, but traditional XML integration solutions cannot provide semantic query, which leads to low data use efficiency. In light of the WeXML (oil & gas well XML) data semantic integration and query requirement, this paper proposes a semantic integration method based on extraction rules and ontology mapping. The method firstly defines a series of extraction rules with which elements and properties of the WeXML Schema are mapped to classes and properties in the WeOWL ontology, respectively; secondly, an algorithm is used to transform WeXML documents into WeOWL instances. Because WeOWL provides limited semantics, ontology mappings between the two ontologies are then built to explain the classes and properties of the global ontology with terms of WeOWL, and semantic query based on a global domain concept model is provided. By constructing a WeXML data semantic integration prototype system, the proposed extraction rules, the transformation algorithm and the mapping rules are tested.
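
    As a toy illustration of the element-to-class extraction rules the record describes, the sketch below maps a small well XML fragment to ontology instances with rdflib; the element names and the WeOWL namespace URI are placeholders, not the paper's actual vocabulary.

    ```python
    import xml.etree.ElementTree as ET
    from rdflib import Graph, Literal, Namespace, RDF, URIRef

    WE = Namespace("http://example.org/weowl#")  # hypothetical WeOWL namespace
    xml_doc = """<wells>
      <well id="W-101"><depth>2350</depth><status>producing</status></well>
    </wells>"""

    g = Graph()
    g.bind("we", WE)
    for well in ET.fromstring(xml_doc).findall("well"):
        # Extraction rule: <well> element -> we:Well instance;
        # child elements -> datatype properties of that instance
        subject = URIRef(WE + well.get("id"))
        g.add((subject, RDF.type, WE.Well))
        for child in well:
            g.add((subject, WE[child.tag], Literal(child.text)))

    print(g.serialize(format="turtle"))  # WeOWL instances ready for querying
    ```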

  16. Integrating Decentralized Indoor Evacuation with Information Depositories in the Field

    Directory of Open Access Journals (Sweden)

    Haifeng Zhao

    2017-07-01

    Full Text Available The lonelier evacuees find themselves, the riskier their wayfinding decisions become. This research supports single evacuees in a dynamically changing environment with risk-aware guidance. It deploys the concept of decentralized evacuation, where evacuees are guided by smartphones that acquire environmental knowledge and risk information via exploration and knowledge sharing by peer-to-peer communication. Peer-to-peer communication, however, relies on the chance that people come into communication range with each other, and this chance can be low. To bridge between people who are not at the same place at the same time, this paper suggests information depositories at strategic locations to improve information sharing. Information depositories collect the knowledge acquired by the smartphones of evacuees passing by, maintain this information, and convey it to other passing-by evacuees. Multi-agent simulation implementing these depositories in an indoor environment shows that integrating depositories improves evacuation performance: it enhances risk awareness and consequently increases the chance that people survive and reduces their evacuation time. For evacuating dynamic events, deploying depositories at staircases has been shown to be more effective than deploying them in corridors.
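
    The record leaves the depository update rule abstract; one plausible minimal rule is to keep, for each corridor edge, the freshest risk observation seen so far, as sketched below (the data model is entirely illustrative).

    ```python
    def merge_knowledge(depository, evacuee_map):
        # Both maps: edge -> (risk_level, timestamp); newer observations win
        for edge, (risk, t) in evacuee_map.items():
            if edge not in depository or depository[edge][1] < t:
                depository[edge] = (risk, t)
        return depository

    depository = {("A", "B"): (0.2, 10)}
    passerby   = {("A", "B"): (0.9, 42),  # corridor now blocked by smoke
                  ("B", "C"): (0.1, 42)}
    merge_knowledge(depository, passerby)
    print(depository)  # fresher risk is relayed to later evacuees
    ```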

  17. Project Integration Architecture: Inter-Application Propagation of Information

    Science.gov (United States)

    Jones, William Henry

    2005-01-01

    A principal goal of the Project Integration Architecture (PIA) is to facilitate the meaningful inter-application transfer of application-value-added information. Such exchanging applications may be largely unrelated to each other except through their applicability to an overall project; however, the PIA effort recognizes as fundamental the need to make such applications cooperate despite wide disparaties either in the fidelity of the analyses carried out, or even the disciplines of the analysis. This paper discusses the approach and techniques applied and anticipated by the PIA project in treating this need.

  18. SpecOp: Optimal Extraction Software for Integral Field Unit Spectrographs

    Science.gov (United States)

    McCarron, Adam; Ciardullo, Robin; Eracleous, Michael

    2018-01-01

    The Hobby-Eberly Telescope’s new low resolution integral field spectrographs, LRS2-B and LRS2-R, each cover a 12”x6” area on the sky with 280 fibers and generate spectra with resolutions between R=1100 and R=1900. To extract 1-D spectra from the instrument’s 3D data cubes, a program is needed that is flexible enough to work for a wide variety of targets, including continuum point sources, emission line sources, and compact sources embedded in complex backgrounds. We therefore introduce SpecOp, a user-friendly Python program for optimally extracting spectra from integral-field unit spectrographs. As input, SpecOp takes a sky-subtracted data cube consisting of images at each wavelength increment set by the instrument’s spectral resolution, and an error file for each count measurement. All of these files are generated by the current LRS2 reduction pipeline. The program then collapses the cube in the image plane using the optimal extraction algorithm detailed by Keith Horne (1986). The various user-selected options include the fraction of the total signal enclosed in a contour-defined region, the wavelength range to analyze, and the precision of the spatial profile calculation. SpecOp can output the weighted counts and errors at each wavelength in various table formats using Python’s astropy package. We outline the algorithm used for extraction and explain how the software can be used to easily obtain high-quality 1-D spectra. We demonstrate the utility of the program by applying it to spectra of a variety of quasars and AGNs. In some of these targets, we extract the spectrum of a nuclear point source that is superposed on a spatially extended galaxy.
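
    Since the record names Horne's (1986) optimal extraction, a compact numpy sketch of that weighting scheme may help: the cube is collapsed against a normalized spatial profile, as SpecOp is described to do. The synthetic data and the function signature are stand-ins, not SpecOp's actual interface.

        import numpy as np

        def optimal_extract(cube, var, profile):
            """Horne (1986)-style optimal extraction of a 1-D spectrum.

            cube    : (nwave, ny, nx) sky-subtracted data cube
            var     : (nwave, ny, nx) variance of each count measurement
            profile : (ny, nx) normalized spatial profile (sums to 1)
            """
            P = profile[None, :, :]                    # broadcast over wavelength
            num = np.sum(P * cube / var, axis=(1, 2))  # profile-weighted counts
            den = np.sum(P * P / var, axis=(1, 2))
            flux = num / den                           # optimally weighted counts
            flux_var = np.sum(P, axis=(1, 2)) / den    # variance of the estimate
            return flux, np.sqrt(flux_var)

        # Tiny synthetic demo: a Gaussian point source in a 3-D cube.
        ny = nx = 15
        y, x = np.mgrid[:ny, :nx]
        prof = np.exp(-((x - 7) ** 2 + (y - 7) ** 2) / 8.0)
        prof /= prof.sum()
        cube = 100.0 * prof[None] + np.random.normal(0, 0.5, (50, ny, nx))
        flux, err = optimal_extract(cube, np.full_like(cube, 0.25), prof)
        print(flux.mean(), err.mean())   # ~100 counts with small uncertainty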

  19. An integrated healthcare enterprise information portal and healthcare information system framework.

    Science.gov (United States)

    Hsieh, S L; Lai, Feipei; Cheng, P H; Chen, J L; Lee, H H; Tsai, W N; Weng, Y C; Hsieh, S H; Hsu, K P; Ko, L F; Yang, T H; Chen, C H

    2006-01-01

    The paper presents an integrated, distributed Healthcare Enterprise Information Portal (HEIP) and Hospital Information Systems (HIS) framework over wireless/wired infrastructure at National Taiwan University Hospital (NTUH). A single sign-on solution for the hospital customer relationship management (CRM) in HEIP has been established. The outcomes of the newly developed Outpatient Information Systems (OIS) in HIS are discussed. The future HEIP blueprints with CRM oriented features: e-Learning, Remote Consultation and Diagnosis (RCD), as well as on-Line Vaccination Services are addressed. Finally, the integrated HEIP and HIS architectures based on the middleware technologies are proposed along with the feasible approaches. The preliminary performance of multi-media, time-based data exchanges over the wireless HEIP side is collected to evaluate the efficiency of the architecture.

  20. Research of building information extraction and evaluation based on high-resolution remote-sensing imagery

    Science.gov (United States)

    Cao, Qiong; Gu, Lingjia; Ren, Ruizhi; Wang, Lang

    2016-09-01

    Building extraction is currently important in applications of high-resolution remote sensing imagery. At present, quite a few algorithms are available for detecting building information; however, most of them still have some obvious disadvantages, such as ignoring spectral information and trading off extraction rate against extraction accuracy. The purpose of this research is to develop an effective method to detect building information for Chinese GF-1 data. Firstly, image preprocessing is used to normalize the image and image enhancement is used to highlight the useful information in the image. Secondly, multi-spectral information is analyzed. Subsequently, an improved morphological building index (IMBI) based on remote sensing imagery is proposed to get the candidate building objects. Furthermore, in order to refine the building objects and further remove false objects, post-processing (e.g., shape features, the vegetation index and the water index) is employed. To validate the effectiveness of the proposed algorithm, the omission errors (OE), commission errors (CE), the overall accuracy (OA) and Kappa are used in the final evaluation. The proposed method can not only effectively use spectral information and other basic features, but also avoid extracting excessive interference details from high-resolution remote sensing images. Compared to the original MBI algorithm, the proposed method reduces the OE by 33.14%; at the same time, Kappa increases by 16.09%. In experiments, IMBI achieved satisfactory results and outperformed other algorithms in terms of both accuracy and visual inspection.
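
    As a rough illustration of the morphological-index idea (not the paper's calibrated IMBI), bright compact structures can be scored by averaging white top-hat responses over several structuring-element sizes; the radii and threshold below are invented, and the record's post-processing (shape, vegetation and water tests) would follow.

        import numpy as np
        from skimage.morphology import white_tophat, disk

        def building_index(gray, radii=(3, 5, 7, 9)):
            """Simplified morphological building index in the spirit of MBI/IMBI.

            Bright, compact structures respond strongly to white top-hat
            filtering across several radii; averaging the responses gives a
            per-pixel building likelihood.
            """
            responses = [white_tophat(gray, disk(r)) for r in radii]
            index = np.mean(responses, axis=0)
            return index / (index.max() + 1e-9)   # normalize to [0, 1]

        # Candidate building mask on a stand-in grayscale band.
        gray = np.random.rand(64, 64).astype(np.float32)
        candidates = building_index(gray) > 0.5
        print(candidates.sum(), "candidate pixels")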

  1. Mutual Information Based Dynamic Integration of Multiple Feature Streams for Robust Real-Time LVCSR

    Science.gov (United States)

    Sato, Shoei; Kobayashi, Akio; Onoe, Kazuo; Homma, Shinichi; Imai, Toru; Takagi, Tohru; Kobayashi, Tetsunori

    We present a novel method of integrating the likelihoods of multiple feature streams, representing different acoustic aspects, for robust speech recognition. The integration algorithm dynamically calculates a frame-wise stream weight so that a higher weight is given to a stream that is robust to a variety of noisy environments or speaking styles. Such a robust stream is expected to show discriminative ability. A conventional method proposed for the recognition of spoken digits calculates the weights from the entropy of the whole set of HMM states. This paper extends the dynamic weighting to a real-time large-vocabulary continuous speech recognition (LVCSR) system. The proposed weight is calculated in real time from the mutual information between an input stream and the active HMM states in a search space, without an additional likelihood calculation. Furthermore, the mutual information takes the width of the search space into account by calculating the marginal entropy from the number of active states. In this paper, we integrate three features that are extracted through auditory filters by taking into account the human auditory system's ability to extract amplitude and frequency modulations. Accordingly, features representing energy, amplitude drift, and resonant frequency drift are integrated. These features are expected to provide complementary clues for speech recognition. Speech recognition experiments on field reports and spontaneous commentary from Japanese broadcast news showed that the proposed method reduced word errors by 9.2% in field reports and 4.7% in spontaneous commentaries relative to the best result obtained from a single stream.
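
    A schematic reading of the frame-wise weighting: each stream's posterior over the currently active HMM states is scored by how far it is from uniform (its mutual information with the active-state set), and the weights are normalized across streams. The estimator below is a simplification of the paper's method, not its exact formulation.

        import numpy as np

        def stream_weights(likelihoods_per_stream):
            """Frame-wise stream weights from active-state posterior entropy.

            likelihoods_per_stream: one array per feature stream, holding the
            likelihoods of the currently active HMM states for this frame.
            A stream whose posterior is far from uniform (low entropy) is
            treated as more discriminative and gets a larger weight.
            """
            weights = []
            for lik in likelihoods_per_stream:
                post = lik / lik.sum()                 # posterior over states
                entropy = -np.sum(post * np.log(post + 1e-12))
                max_entropy = np.log(len(post))        # marginal entropy of a
                weights.append(max_entropy - entropy)  # uniform posterior
            w = np.asarray(weights)
            return w / w.sum()                         # normalize across streams

        frame = [np.array([0.70, 0.15, 0.15]),   # discriminative stream
                 np.array([0.34, 0.33, 0.33])]   # uninformative stream
        print(stream_weights(frame))             # first stream dominates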

  2. Design principles for achieving integrated healthcare information systems.

    Science.gov (United States)

    Jensen, Tina Blegind

    2013-03-01

    Achieving integrated healthcare information systems has become a common goal for many countries in their pursuit of obtaining coordinated and comprehensive healthcare services. This article focuses on how a small local project termed 'Standardized pull of patient data' expanded and is now used on a large scale, providing a majority of hospitals, general practitioners and citizens across Denmark with the possibility of accessing healthcare data from different electronic patient record systems and other systems. I build on design theory for information infrastructures, as presented by Hanseth and Lyytinen, to examine the design principles that facilitated this small-scale project to expand and become widespread. As a result of my findings, I outline three lessons learned that emphasize: (i) principles of flexibility, (ii) expansion from the installed base through modular strategies and (iii) identification of key healthcare actors to provide them with immediate benefits.

  3. Information Extraction of High-Resolution Remotely Sensed Image Based on Multiresolution Segmentation

    Directory of Open Access Journals (Sweden)

    Peng Shao

    2014-08-01

    Full Text Available The principle of multiresolution segmentation is presented in detail in this study, and the Canny algorithm is applied for edge detection of a remotely sensed image based on this principle. The target image was divided into regions based on object-oriented multiresolution segmentation and edge detection. Furthermore, an object hierarchy was created, and a series of features (water bodies, vegetation, roads, residential areas, bare land and other information) were extracted using spectral and geometrical features. The results indicate that edge detection has a positive effect on multiresolution segmentation, and the overall accuracy of information extraction reaches 94.6% according to the confusion matrix.
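
    For the edge-detection step, a minimal OpenCV sketch; the thresholds and the stand-in image are invented, as the record gives no parameter values. The edges would then act as barriers so that region growing does not merge across object borders.

        import cv2
        import numpy as np

        # Stand-in single band of a remotely sensed image.
        image = (np.random.rand(128, 128) * 255).astype(np.uint8)

        # Canny edge detection; hysteresis thresholds are illustrative.
        edges = cv2.Canny(image, threshold1=50, threshold2=150)

        # A segmentation step would then grow regions only where edges == 0,
        # before extracting water, vegetation, roads, etc. from the objects.
        print(edges.shape, int((edges > 0).sum()), "edge pixels")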

  4. End-to-end information extraction without token-level supervision

    DEFF Research Database (Denmark)

    Palm, Rasmus Berg; Hovy, Dirk; Laws, Florian

    2017-01-01

    Most state-of-the-art information extraction approaches rely on token-level labels to find the areas of interest in text. Unfortunately, these labels are time-consuming and costly to create, and consequently, not available for many real-life IE tasks. To make matters worse, token-level labels...... and output text. We evaluate our model on the ATIS data set, MIT restaurant corpus and the MIT movie corpus and compare to neural baselines that do use token-level labels. We achieve competitive results, within a few percentage points of the baselines, showing the feasibility of E2E information extraction...

  5. Integrating Records Management (RM) and Information Technology (IT)

    Energy Technology Data Exchange (ETDEWEB)

    NUSBAUM,ANNA W.; CUSIMANO,LINDA J.

    2000-03-02

    Records Managers are continually exploring ways to integrate their services with those offered by Information Technology-related professions to capitalize on the advantages of providing customers a total solution to managing their records and information. In this day and age, where technology abounds, there often exists a fear on the part of records management that this integration will result in a loss of identity and of focus on one's own mission - a fear that records management may become subordinated to the fast-paced technology fields. They need to remember there is strength in numbers, and it benefits RM, IT, and the customer when they can bring together the unique offerings each possesses to reach synergy for the benefit of all. Records Managers need to continually strive to move "outside the records management box", network, expand their knowledge, and influence the IT disciplines to incorporate the concept of "management" into their customer solutions.

  6. Unifying Kohlberg with Information Integration: The Moral Algebra of Recompense and of Kohlbergian Moral Informers

    Directory of Open Access Journals (Sweden)

    Wilfried Hommers

    2010-01-01

    Full Text Available In order to unify two major theories of moral judgment, a novel task is employed which combines elements of Kohlberg's stage theory and of the theory of information integration. In contrast to the format of Kohlberg's moral judgment interview, a nonverbal and quantitative response format which makes low demands on verbal facility was used. Moral informers differing in value, i.e. high and low, are presented. The differences in effect of those two pieces of information should be substantial for a person at that specific moral stage, but small for a person at a different stage. Hence, these differences may diagnose the person's moral stage in the simplest possible way, as the two levels of each of the thoughts were about typical content of the four Kohlbergian preconventional and conventional stages. The novel task additionally allowed measuring the influence of the non-Kohlbergian moral concept of recompense. After a training phase, pairs of those thoughts were presented to allow for the study of integration and individual differences. German and Korean children, aged 8, 10, and 12 years, judged deserved punishment. The patterns of means, correlations and factor loadings showed that elements of both theories can be unified, but also produced unexpected results. Additive integration of each of the two pairs of moral informers appeared, either with two Kohlbergian moral informers or with one Kohlbergian moral informer in combination with information about recompense. Also observed were cultural independence as well as dependence, developmental changes between 8 and 10 years, and an outstanding moral impact of recompense in size and distinctiveness.

  7. Towards a Unified Approach to Information Integration - A review paper on data/information fusion

    Energy Technology Data Exchange (ETDEWEB)

    Whitney, Paul D.; Posse, Christian; Lei, Xingye C.

    2005-10-14

    Information or data fusion from different sources is ubiquitous in many applications, from epidemiology, medicine, biology, politics, and intelligence to military applications. Data fusion involves integration of spectral, imaging, text, and many other sensor data. For example, in epidemiology, information is often obtained from many studies conducted by different researchers in different regions with different protocols. In the medical field, the diagnosis of a disease is often based on imaging (MRI, X-ray, CT), clinical examination, and lab results. In the biological field, information is obtained from studies conducted on many different species. In the military field, information is obtained from radar sensors, text messages, chemical-biological sensors, acoustic sensors, optical warning and many other sources. Many methodologies are used in the data integration process, from classical and Bayesian methods to evidence-based expert systems. The implementation of data integration ranges from pure software design to a mixture of software and hardware. In this review we summarize the methodologies and implementations of the data fusion process, and illustrate in more detail the methodologies involved in three examples. We propose a unified multi-stage and multi-path mapping approach to the data fusion process, and point out future prospects and challenges.

  8. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    International Nuclear Information System (INIS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-01-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise and efficient information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask, layer by layer, all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably rapid computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three variables: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problems of same object, different spectra and of same spectrum, different objects. With an overall accuracy of 90.38%, the method achieves more scientific and accurate results compared with the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing data.

  9. An Integrated Information System for Supporting Quality Management Tasks

    Science.gov (United States)

    Beyer, N.; Helmreich, W.

    2004-08-01

    In a competitive environment, well-defined processes become the strategic advantage of a company. Hence, targeted Quality Management ensures efficiency, transparency and, ultimately, customer satisfaction. In the particular context of a Space Test Centre, a number of specific Quality Management standards have to be applied. According to the revision of ISO 9001 during 2000, and due to the adaptation of ECSS-Q20-07, process orientation and data analysis are key tasks for ensuring and evaluating the efficiency of a company's processes. In line with these requirements, an integrated management system for accessing the necessary information to support Quality Management and other processes has been established. Some of its test-related features are presented here. Easy access to the integrated management system from any work place at IABG's Space Test Centre is ensured by means of an intranet portal. It comprises a full set of quality-related process descriptions, information on test facilities, emergency procedures, and other relevant information. The portal's web interface provides direct access to a couple of external applications. Moreover, easy updating of all information and low-cost maintenance are features of this integrated information system. The timely and transparent management of non-conformances is covered by a dedicated NCR database which incorporates full documentation capability, electronic signature and e-mail notification of concerned staff. A search interface allows for queries across all documented non-conformances. Furthermore, print versions can be generated at any stage in the process, e.g. for distribution to customers. Feedback on customer satisfaction is sought through a web-based questionnaire. The process is initiated by the responsible test manager through submission of an e-mail that contains a hyperlink to a secure website, asking the customer to complete the brief online form, which is directly fed to a database.

  10. The integral and extrinsic bioactive proteins in the aqueous extracted soybean oil bodies.

    Science.gov (United States)

    Zhao, Luping; Chen, Yeming; Cao, Yanyun; Kong, Xiangzhen; Hua, Yufei

    2013-10-09

    Soybean oil bodies (OBs), naturally pre-emulsified soybean oil, have been examined by many researchers owing to their great potential uses in food, cosmetics, pharmaceutical, and other applications requiring stable oil-in-water emulsions. This study is the first to confirm that lectin, Gly m Bd 28K (Bd 28K, one soybean allergenic protein), Kunitz trypsin inhibitor (KTI), and Bowman-Birk inhibitor (BBI) were not contained in the extracted soybean OBs, even with neutral-pH aqueous extraction. It was clarified that the well-known Gly m Bd 30K (Bd 30K), another soybean allergenic protein, was strongly bound to soybean OBs through a disulfide bond with 24 kDa oleosin. One steroleosin isoform (41 kDa) and two caleosin isoforms (27 kDa, 29 kDa), the integral bioactive proteins, were confirmed for the first time in soybean OBs, and a considerable amount of calcium, necessary for the biological activities of caleosin, was strongly bound to OBs. Unexpectedly, it was found that 24 kDa and 18 kDa oleosins could be hydrolyzed by an unknown soybean endoprotease in the extracted soybean OBs, which might give some hints for improving the enzyme-assisted aqueous extraction processing of soybean free oil.

  11. An integrated biohydrogen refinery: synergy of photofermentation, extractive fermentation and hydrothermal hydrolysis of food wastes.

    Science.gov (United States)

    Redwood, Mark D; Orozco, Rafael L; Majewski, Artur J; Macaskie, Lynne E

    2012-09-01

    An Integrated Biohydrogen Refinery (IBHR) and experimental net energy analysis are reported. The IBHR converts biomass to electricity using hydrothermal hydrolysis, extractive biohydrogen fermentation and photobiological hydrogen fermentation for electricity generation in a fuel cell. An extractive fermentation, developed previously, is applied to waste-derived substrates following hydrothermal pre-treatment, achieving 83-99% biowaste destruction. The selective separation of organic acids from waste-fed fermentations provided suitable substrate for photofermentative hydrogen production, which enhanced the gross energy generation up to 11-fold. Therefore, electrodialysis provides the key link in an IBHR for 'waste to energy'. The IBHR compares favourably to 'renewables' (photovoltaics, on-shore wind, crop-derived biofuels) and also emerging biotechnological options (microbial electrolysis) and anaerobic digestion. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Cancer survival classification using integrated data sets and intermediate information.

    Science.gov (United States)

    Kim, Shinuk; Park, Taesung; Kon, Mark

    2014-09-01

    Although numerous studies related to cancer survival have been published, increasing the prediction accuracy of survival classes still remains a challenge. Integration of different data sets, such as microRNA (miRNA) and mRNA, might increase the accuracy of survival class prediction. Therefore, we suggested a machine learning (ML) approach to integrate different data sets, and developed a novel method based on feature selection with the Cox proportional hazards regression model (FSCOX) to improve the prediction of cancer survival time. FSCOX provides us with intermediate survival information, which is usually discarded when separating survival into 2 groups (short- and long-term), and allows us to perform survival analysis. We used an ML-based protocol for feature selection, integrating information from miRNA and mRNA expression profiles at the feature level. To predict survival phenotypes, we used the following classifiers: first, the existing ML methods support vector machine (SVM) and random forest (RF); second, a new median-based classifier using FSCOX (FSCOX_median); and third, an SVM classifier using FSCOX (FSCOX_SVM). We compared these methods using 3 types of cancer tissue data sets: (i) miRNA expression, (ii) mRNA expression, and (iii) combined miRNA and mRNA expression. The latter data set included features selected either from the combined miRNA/mRNA profile or independently from the miRNA and mRNA profiles (IFS). In the ovarian data set, the accuracy of survival classification using the combined miRNA/mRNA profiles with IFS was 75% using RF, 86.36% using SVM, 84.09% using FSCOX_median, and 88.64% using FSCOX_SVM with a balanced set of 22 short-term and 22 long-term survivors. These accuracies are higher than those using miRNA alone (70.45%, RF; 75%, SVM; 75%, FSCOX_median; and 75%, FSCOX_SVM) or mRNA alone (65.91%, RF; 63.64%, SVM; 72.73%, FSCOX_median; and 70.45%, FSCOX_SVM). Similarly in the glioblastoma multiforme data, the accuracy of miRNA/mRNA using IFS
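
    A toy sketch of the FSCOX idea under stated assumptions: univariate Cox fits (via the lifelines package) rank features using the intermediate survival times rather than binary classes, and an SVM is then trained on the selected features, echoing the FSCOX_SVM variant. The data, the selection rule, and all parameters are placeholders; the published FSCOX procedure is more involved.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)

        # Synthetic stand-in for integrated miRNA+mRNA features with survival
        # times and event indicators; real data are not reproduced here.
        n, p = 60, 20
        X = pd.DataFrame(rng.normal(size=(n, p)),
                         columns=[f"f{i}" for i in range(p)])
        time_ = np.exp(1.0 + 0.8 * X["f0"] + rng.normal(scale=0.5, size=n))
        event = rng.integers(0, 2, size=n)

        def fscox_select(X, time_, event, k=5):
            """Rank each feature by a univariate Cox PH fit; keep the top k."""
            pvals = {}
            for col in X.columns:
                df = pd.DataFrame({col: X[col], "T": time_, "E": event})
                cph = CoxPHFitter()
                cph.fit(df, duration_col="T", event_col="E")
                pvals[col] = float(cph.summary.loc[col, "p"])
            return sorted(pvals, key=pvals.get)[:k]

        selected = fscox_select(X, time_, event)
        labels = (time_ > np.median(time_)).astype(int)  # short vs long term
        clf = SVC().fit(X[selected], labels)             # FSCOX_SVM-style step
        print(selected, clf.score(X[selected], labels))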

  13. Information retrieval and terminology extraction in online resources for patients with diabetes.

    Science.gov (United States)

    Seljan, Sanja; Baretić, Maja; Kucis, Vlasta

    2014-06-01

    Terminology use, as a means for information retrieval or document indexing, plays an important role in health literacy. Specific types of users, i.e. patients with diabetes, need access to various online resources (in a foreign and/or native language) when searching for information on self-education of basic diabetic knowledge, on self-care activities regarding the importance of dietetic food, medications and physical exercises, and on self-management of insulin pumps. Automatic extraction of corpus-based terminology from online texts, manuals or professional papers can help in building terminology lists or lists of "browsing phrases" useful in information retrieval or in document indexing. Specific terminology lists represent an intermediate step between free-text search and controlled vocabulary, between users' demands and existing online resources in native and foreign languages. The research, aiming to detect the role of terminology in online resources, is conducted on English and Croatian manuals and Croatian online texts, and divided into three interrelated parts: i) comparison of professional and popular terminology use; ii) evaluation of automatic statistically-based terminology extraction on English and Croatian texts; iii) comparison and evaluation of extracted terminology performed on an English manual using statistical and hybrid approaches. Extracted terminology candidates are evaluated by comparison with three types of reference lists: a list created by a professional medical person, a list of highly professional vocabulary contained in MeSH, and a list created by non-medical persons, made as the intersection of 15 lists. Results report on the use of popular and professional terminology in online diabetes resources, on the evaluation of automatically extracted terminology candidates in English and Croatian texts, and on the comparison of statistical and hybrid extraction methods in English text. Evaluation of automatic and semi-automatic terminology extraction methods is performed by recall
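
    A miniature version of statistically-based term extraction — frequency-ranked, stopword-filtered n-grams — conveys the flavor of the evaluated systems. Real pipelines add association scores (e.g. log-likelihood or C-value) and language-specific processing such as Croatian morphology; the stopword list and example text below are illustrative only.

        import re
        from collections import Counter

        STOP = {"the", "of", "and", "in", "with", "for", "to", "a", "is", "on"}

        def candidate_terms(text, max_len=3):
            """Frequency-ranked stopword-free n-grams as term candidates."""
            tokens = re.findall(r"[a-z]+", text.lower())
            counts = Counter()
            for n in range(1, max_len + 1):
                for i in range(len(tokens) - n + 1):
                    gram = tokens[i:i + n]
                    if gram[0] in STOP or gram[-1] in STOP:
                        continue   # terms rarely start or end with a stopword
                    counts[" ".join(gram)] += 1
            return counts

        text = ("Insulin pump therapy requires self-management; the insulin "
                "pump dose depends on blood glucose and carbohydrate intake.")
        for term, freq in candidate_terms(text).most_common(5):
            print(freq, term)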

  14. GeoDeepDive: Towards a Machine Reading-Ready Digital Library and Information Integration Resource

    Science.gov (United States)

    Husson, J. M.; Peters, S. E.; Livny, M.; Ross, I.

    2015-12-01

    Recent developments in machine reading and learning approaches to text and data mining hold considerable promise for accelerating the pace and quality of literature-based data synthesis, but these advances have outpaced even basic levels of access to the published literature. For many geoscience domains, particularly those based on physical samples and field-based descriptions, this limitation is significant. Here we describe a general infrastructure to support published literature-based machine reading and learning approaches to information integration and knowledge base creation. This infrastructure supports rate-controlled automated fetching of original documents, along with full bibliographic citation metadata, from remote servers, the secure storage of original documents, and the utilization of considerable high-throughput computing resources for the pre-processing of these documents by optical character recognition, natural language parsing, and other document annotation and parsing software tools. New tools and versions of existing tools can be automatically deployed against original documents when they are made available. The products of these tools (text/XML files) are managed by MongoDB and are available for use in data extraction applications. Basic search and discovery functionality is provided by ElasticSearch, which is used to identify documents of potential relevance to a given data extraction task. Relevant files derived from the original documents are then combined into basic starting points for application building; these starting points are kept up-to-date as new relevant documents are incorporated into the digital library. Currently, our digital library contains more than 360K documents supplied by Elsevier and the USGS, and we are actively seeking additional content providers. By focusing on building a dependable infrastructure to support the retrieval, storage, and pre-processing of published content, we are establishing a foundation for
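
    The search-and-discovery step can be sketched against the ElasticSearch layer the record mentions. The host, index, and field names below are placeholders (the actual GeoDeepDive deployment is not described in that detail here), and the 8.x-style keyword API of the official elasticsearch Python client is assumed.

        from elasticsearch import Elasticsearch

        # Hypothetical index layout: one document per processed article, with
        # NLP-derived plain text in a "contents" field.
        es = Elasticsearch("http://localhost:9200")

        resp = es.search(
            index="documents",
            query={"match_phrase": {"contents": "carbon isotope excursion"}},
            size=25,
        )

        # Each hit points back to stored text/XML products (OCR, NLP parses)
        # that a downstream extraction application would consume.
        for hit in resp["hits"]["hits"]:
            print(hit["_id"], hit["_source"].get("title", "<untitled>"))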

  15. Design of Integrated Database on Mobile Information System: A Study of Yogyakarta Smart City App

    Science.gov (United States)

    Nurnawati, E. K.; Ermawati, E.

    2018-02-01

    An integration database is a database which acts as the data store for multiple applications and thus integrates data across these applications (in contrast to an application database). An integration database needs a schema that takes all its client applications into account. The benefit of such a schema is that sharing data among applications does not require an extra layer of integration services on the applications. Any changes to data made in a single application are made available to all applications at the time of database commit, thus keeping the applications' data use better synchronized. This study aims to design and build an integrated database that can be used by various applications on a mobile-device-based platform for a smart city system. The resulting database can be used by various applications, whether together or separately. The design and development of the database emphasize flexibility, security, and completeness of the attributes that can be shared by the various applications to be built. The method used in this study is to choose an appropriate logical database structure (patterns of data) and to build the relational database models (design databases). The resulting design is tested with prototype apps, and system performance is analyzed with test data. The integrated database can be utilized by both the administrator and the user in an integral and comprehensive platform. This system can help admins, managers, and operators manage the application easily and efficiently. This Android-based app is built on a dynamic client-server model where data are extracted from an external MySQL database, so if data change in the database, the data in the Android applications also change. The Android app assists users in searching for Yogyakarta (smart city) related information, especially on culture, government, hotels, and transportation.

  16. Towards integrated biorefinery from dried distillers grains: Selective extraction of pentoses using dilute acid hydrolysis

    International Nuclear Information System (INIS)

    Fonseca, Dania A.; Lupitskyy, Robert; Timmons, David; Gupta, Mayank; Satyavolu, Jagannadh

    2014-01-01

    The abundant availability and high level of hemicellulose content make dried distillers grains (DDG) an attractive feedstock for production of pentoses (C5) and conversion of C5 to bioproducts. One target of this work was to produce a C5 extract (hydrolyzate) with high yield and purity with a low concentration of C5 degradation products. A high selectivity towards pentoses was achieved using dilute acid hydrolysis of DDG in a percolation reactor with liquid recirculation. Pretreatment of the starting material using screening and ultrasonication resulted in a fractional increase of the pentose yield by 42%. A 94% yield of pentoses on the DDG (280.9 g kg⁻¹) was obtained. Selective extraction of individual pentoses has been achieved by using a 2-stage hydrolysis process, resulting in arabinose-rich (arabinose 81.5%) and xylose-rich (xylose 85.2%) streams. A broader impact of this work is towards an Integrated Bio-Refinery based on DDG – for production of biofuels, biochemical intermediates, and other bioproducts. - Highlights: • A process for selective extraction of pentoses from DDG was presented as part of an integrated biorefinery approach. • The selectivity for pentoses was high using dilute acid hydrolysis in a percolation reactor with liquid recirculation. • Pretreatment of DDG using screening and ultrasonication resulted in a fractional increase of the pentose yield by 42%. • A 94% yield in pentoses (280.9 g kg⁻¹ of DDG) was obtained. • A 2-stage hydrolysis process, developed to extract individual pentoses, resulted in arabinose- and xylose-rich streams

  17. OpenCV-Based Nanomanipulation Information Extraction and the Probe Operation in SEM

    Directory of Open Access Journals (Sweden)

    Dongjie Li

    2015-02-01

    Full Text Available For an established telenanomanipulation system, this paper studies methods of extracting location information and strategies for probe operation. First, the machine learning algorithms of OpenCV were used to extract location information from SEM images, so that nanowires and the probe in SEM images can be automatically tracked and the region of interest (ROI) can be marked quickly. The locations of the nanowire and the probe can then be extracted from the ROI. To study the probe operation strategy, the Van der Waals force between the probe and a nanowire was computed, from which the relevant operating parameters can be obtained. With these operating parameters, the nanowire can be pre-operated in a 3D virtual environment and an optimal path of the probe can be obtained. The actual probe runs automatically under the telenanomanipulation system's control. Finally, experiments were carried out to verify the above methods, and the results show that the designed methods achieved the expected effect.
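
    The record does not spell out which OpenCV machinery is used, so as one plausible stand-in, normalized cross-correlation template matching can localize the probe in a frame and mark an ROI around it; the synthetic SEM-like image below is invented for the demo.

        import cv2
        import numpy as np

        def locate(template, frame):
            """Locate a known pattern (probe tip or nanowire) in an SEM frame
            by normalized cross-correlation and return its ROI and score."""
            result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
            _, score, _, top_left = cv2.minMaxLoc(result)
            h, w = template.shape
            roi = (top_left[0], top_left[1], w, h)  # x, y, width, height
            return roi, score

        # Synthetic SEM-like frame with a bright blob as the "probe".
        frame = np.zeros((200, 200), np.uint8)
        cv2.circle(frame, (120, 80), 6, 255, -1)
        template = frame[70:90, 110:130].copy()
        print(locate(template, frame))   # ROI near (110, 70), score ~1.0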

  18. Telematics and smart cards in integrated health information system.

    Science.gov (United States)

    Sicurello, F; Nicolosi, A

    1997-01-01

    Telematics and information technology are the foundation on which it will be possible to build an integrated health information system to support the population and improve their quality of life. This system should be based on record linkage of all data arising from the interactions of patients with health structures, such as general practitioners, specialists, health institutes and hospitals, pharmacies, etc. Record linkage can provide the connection and integration of various records, thanks to the use of telematic technology (either urban or geographical local networks, such as the Internet) and electronic data cards. Particular emphasis should be placed on the introduction of smart cards, such as portable health cards, which will contain a standardized data set and will be sufficient to access different databases found in various health services. The interoperability of the social-health records (including multimedia types) and the smart cards (one of the most important prerequisites for the homogenization and wide diffusion of these cards at a European level) should be strongly taken into consideration. In this framework a project is being developed aiming at the integration of various territorially distributed databases, from the reading of the software and the updating of the smart cards to the complete management of patients' evaluation records, the quality of the services offered and health planning. The applications developed will support epidemiological investigation software and data analysis. The interconnection of all the databases of the various structures involved will take place through a coordination center, the most important system of which we will call "record linkage" or "integrated database". Smart cards will be distributed to a sample group of possible users and the necessary smart card management tools will be installed in all the structures involved. All the final users (the patients) in the whole

  19. Towards integrating control and information theories from information-theoretic measures to control performance limitations

    CERN Document Server

    Fang, Song; Ishii, Hideaki

    2017-01-01

    This book investigates the performance limitation issues in networked feedback systems. The fact that networked feedback systems consist of control and communication devices and systems calls for the integration of control theory and information theory. The primary contributions of this book lie in two aspects: the newly-proposed information-theoretic measures and the newly-discovered control performance limitations. We first propose a number of information notions to facilitate the analysis. Using those notions, classes of performance limitations of networked feedback systems, as well as state estimation systems, are then investigated. In general, the book presents a unique, cohesive treatment of performance limitation issues of networked feedback systems via an information-theoretic approach. This book is believed to be the first to treat the aforementioned subjects systematically and in a unified manner, offering a unique perspective differing from existing books.

  20. Integrating multimodal information for intraoperative assistance in neurosurgery

    Directory of Open Access Journals (Sweden)

    Eisenmann U.

    2015-09-01

    Full Text Available Computer-assisted planning of complex neurosurgical interventions benefits from a variety of specific functions and tools. However, commercial planning and neuronavigation systems are rather restrictive concerning the availability of innovative methods such as novel imaging modalities, fiber tracking algorithms or electrical dipole mapping. In this respect there is a demand for modular neurosurgical planning systems offering flexible interfaces for easy enhancement. Furthermore, all relevant planning information should be available within neuronavigation. In this work we present a planning system providing these capabilities and its suitability and application in a clinical setting. Our Multimodal Planning System (MOPS 3D) offers a variety of tools such as definition of trajectories for minimally invasive surgery, segmentation of ROIs, and integration of functional information from atlas maps or magnetoencephalography. It also supplies plugin interfaces for future extensions. For intraoperative application MOPS is coupled with the neuronavigation system Brainlab Vector Vision Cranial/ENT (VVC). We evaluated MOPS in the Department of Neurosurgery at the University Hospital Heidelberg. Surgical planning and navigation were performed in 5 frequently occurring clinical cases. The time necessary for planning was between 5 and 15 minutes, including data import, segmentation and planning tasks. The additional information intraoperatively provided by MOPS 3D was highly appreciated by the neurosurgeons and the performance was satisfactory.

  1. Defense Nuclear Material Stewardship Integrated Inventory Information Management System (IIIMS).

    Energy Technology Data Exchange (ETDEWEB)

    Aas, Christopher A.; Lenhart, James E.; Bray, Olin H.; Witcher, Christina Jenkin

    2004-11-01

    Sandia National Laboratories was tasked with developing the Defense Nuclear Material Stewardship Integrated Inventory Information Management System (IIIMS) with the sponsorship of NA-125.3 and the concurrence of DOE/NNSA field and area offices. The purpose of IIIMS was to modernize nuclear materials management information systems at the enterprise level. Projects over the course of several years attempted to spearhead this modernization. The scope of IIIMS was broken into broad enterprise-oriented materials management and materials forecasting. The IIIMS prototype was developed to allow multiple participating user groups to explore nuclear material requirements and needs in detail. The purpose of material forecasting was to determine nuclear material availability over a 10 to 15 year period in light of the dynamic nature of nuclear materials management. Formal DOE Directives (requirements) were needed to direct IIIMS efforts but were never issued and the project has been halted. When restarted, duplicating or re-engineering the activities from 1999 to 2003 is unnecessary, and in fact future initiatives can build on previous work. IIIMS requirements should be structured to provide high confidence that discrepancies are detected, and classified information is not divulged. Enterprise-wide materials management systems maintained by the military can be used as overall models to base IIIMS implementation concepts upon.

  2. Methods to extract information on the atomic and molecular states from scientific abstracts

    International Nuclear Information System (INIS)

    Sasaki, Akira; Ueshima, Yutaka; Yamagiwa, Mitsuru; Murata, Masaki; Kanamaru, Toshiyuki; Shirado, Tamotsu; Isahara, Hitoshi

    2005-01-01

    We propose a new application of information technology to recognize and extract expressions of atomic and molecular states from electronic forms of scientific abstracts. The present results will help scientists to understand atomic states as well as the physics discussed in the articles. Combined with internet search engines, this will make it possible to collect not only atomic and molecular data but also broader scientific information over a wide range of research fields. (author)

  3. System and method for extracting physiological information from remotely detected electromagnetic radiation

    NARCIS (Netherlands)

    2016-01-01

    The present invention relates to a device and a method for extracting physiological information indicative of at least one health symptom from remotely detected electromagnetic radiation. The device comprises an interface (20) for receiving a data stream comprising remotely detected image data

  4. System and method for extracting physiological information from remotely detected electromagnetic radiation

    NARCIS (Netherlands)

    2015-01-01

    The present invention relates to a device and a method for extracting physiological information indicative of at least one health symptom from remotely detected electromagnetic radiation. The device comprises an interface (20) for receiving a data stream comprising remotely detected image data

  5. Network and Ensemble Enabled Entity Extraction in Informal Text (NEEEEIT) final report

    Energy Technology Data Exchange (ETDEWEB)

    Kegelmeyer, Philip W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shead, Timothy M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dunlavy, Daniel M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-09-01

    This SAND report summarizes the activities and outcomes of the Network and Ensemble Enabled Entity Extraction in Informal Text (NEEEEIT) LDRD project, which addressed improving the accuracy of conditional random fields for named entity recognition through the use of ensemble methods.
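
    As a self-contained illustration of the ensemble idea (the report's actual combination scheme may differ), per-token majority voting over several CRF taggers' label sequences looks like this; the three example outputs are invented.

        from collections import Counter

        def ensemble_vote(predictions):
            """Majority vote over per-token NER labels from several taggers.

            predictions: list of label sequences, one per ensemble member,
            all aligned to the same tokens.
            """
            return [Counter(labels).most_common(1)[0][0]
                    for labels in zip(*predictions)]

        # Three hypothetical CRF outputs for "Alice flew to Paris".
        crf_a = ["B-PER", "O", "O", "B-LOC"]
        crf_b = ["B-PER", "O", "O", "O"]
        crf_c = ["O",     "O", "O", "B-LOC"]
        print(ensemble_vote([crf_a, crf_b, crf_c]))  # ['B-PER','O','O','B-LOC']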

  6. A construction scheme of web page comment information extraction system based on frequent subtree mining

    Science.gov (United States)

    Zhang, Xiaowen; Chen, Bingfeng

    2017-08-01

    Based on a frequent-subtree mining algorithm, this paper proposes a construction scheme for a web page comment information extraction system, referred to as the FSM system. The overall system architecture and its modules are briefly introduced, then the core of the system is described in detail, and finally a system prototype is given.
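
    A cheap proxy for the frequent-subtree intuition — repeated comment blocks share a subtree shape, so counting tag-path signatures surfaces them — can be sketched with the standard-library HTML parser. Real frequent-subtree mining operates on whole subtrees, not just root-to-node paths, and the HTML below is invented.

        from collections import Counter
        from html.parser import HTMLParser

        class PathCounter(HTMLParser):
            """Count tag paths: comment blocks repeat the same subtree shape,
            so their paths dominate the counts."""
            def __init__(self):
                super().__init__()
                self.stack, self.paths = [], Counter()
            def handle_starttag(self, tag, attrs):
                self.stack.append(tag)
                self.paths["/".join(self.stack)] += 1
            def handle_endtag(self, tag):
                if self.stack and self.stack[-1] == tag:
                    self.stack.pop()

        html = ("<div>"
                + "<div class='c'><span>user</span><p>text</p></div>" * 3
                + "</div>")
        pc = PathCounter()
        pc.feed(html)
        for path, n in pc.paths.most_common(3):
            print(n, path)   # repeated comment-like subtrees surface on top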

  7. EXTRACT

    DEFF Research Database (Denmark)

    Pafilis, Evangelos; Buttigieg, Pier Luigi; Ferrell, Barbra

    2016-01-01

    The microbial and molecular ecology research communities have made substantial progress on developing standards for annotating samples with environment metadata. However, sample manual annotation is a highly labor intensive process and requires familiarity with the terminologies used. We have the... ..., organism, tissue and disease terms. The evaluators in the BioCreative V Interactive Annotation Task found the system to be intuitive, useful, well documented and sufficiently accurate to be helpful in spotting relevant text passages and extracting organism and environment terms. Comparison of fully manual and text-mining-assisted curation revealed that EXTRACT speeds up annotation by 15-25% and helps curators to detect terms that would otherwise have been missed. Database URL: https://extract.hcmr.gr/

  8. Two-dimensional parasitic capacitance extraction for integrated circuit with dual discrete geometric methods

    International Nuclear Information System (INIS)

    Ren Dan; Ren Zhuoxiang; Qu Hui; Xu Xiaoyu

    2015-01-01

    Capacitance extraction is one of the key issues in integrated circuits and also a typical electrostatic problem. The dual discrete geometric method (DGM) is investigated to provide the corresponding solutions in two-dimensional unstructured mesh space. The energy-complementary characteristic and the fast field-energy computation based on it are emphasized. Contrastive analysis between the dual finite element methods and the dual DGMs is presented both through theoretical derivation and through case studies. The DGM, taking the scalar potential as the unknown on dual interlocked meshes, with its simple form and good accuracy, is expected to become one of the mainstream methods in the associated areas. (paper)
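
    The energy-based route from field solution to capacitance, and the complementary bounds the record alludes to, can be stated compactly; these are standard results for dual formulations, given here as background rather than the paper's specific derivation.

        % With the conductors held at potential difference U, the stored field
        % energy W fixes the capacitance:
        \[ C = \frac{2W}{U^{2}} \]
        % Energy complementarity: the two dual discrete solutions bound the
        % true energy from opposite sides, so the pair of DGM capacitances
        % brackets the exact value and their gap gauges the discretization
        % error:
        \[ \min(C_{1}, C_{2}) \;\le\; C_{\mathrm{exact}} \;\le\; \max(C_{1}, C_{2}) \]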

  9. Semi-automatic building extraction in informal settlements from high-resolution satellite imagery

    Science.gov (United States)

    Mayunga, Selassie David

    The extraction of man-made features from digital remotely sensed images is considered an important step underpinning management of human settlements in any country. Man-made features, and buildings in particular, are required for a variety of applications such as urban planning, creation of geographical information system (GIS) databases and urban city models. Traditional man-made feature extraction methods are expensive in terms of equipment, labour-intensive, need well-trained personnel and cannot cope with changing environments, particularly in dense urban settlement areas. This research presents an approach for extracting buildings in dense informal settlement areas using high-resolution satellite imagery. The proposed system uses a novel strategy of extracting a building by measuring a single point at the approximate centre of the building. The fine measurement of the building outline is then effected using a modified snake model. The original snake model on which this framework is based incorporates an external constraint energy term which is tailored to preserving the convergence properties of the snake model; its use on unstructured objects would negatively affect their actual shapes. The external constraint energy term was removed from the original snake model formulation, thereby giving the model the ability to cope with the high variability of building shapes in informal settlement areas. The proposed building extraction system was tested on two areas with different situations. The first area was Tungi in Dar es Salaam, Tanzania, where three sites were tested. This area is characterized by informal settlements, which are illegally established within the city boundaries. The second area was Oromocto in New Brunswick, Canada, where two sites were tested. The Oromocto area is mostly flat and the buildings are constructed using similar materials. Qualitative and quantitative measures were employed to evaluate the accuracy of the results as well as the performance
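
    A minimal scikit-image sketch of the seed-plus-snake workflow: one centre point initializes a circular contour that an active-contour model then fits to the building outline. The image, seed, and parameters are synthetic placeholders, and scikit-image's snake keeps the standard energy terms rather than the thesis's modified formulation.

        import numpy as np
        from skimage.filters import gaussian
        from skimage.segmentation import active_contour

        # Bright rectangular "roof" on a dark background, slightly blurred.
        img = np.zeros((100, 100))
        img[30:70, 25:75] = 1.0
        img = gaussian(img, sigma=2, preserve_range=True)

        # Single measured seed point plus a generous radius initialize the snake.
        cx, cy, r = 50, 50, 35
        theta = np.linspace(0, 2 * np.pi, 200)
        init = np.column_stack([cy + r * np.sin(theta),
                                cx + r * np.cos(theta)])  # (row, col) pairs

        snake = active_contour(img, init, alpha=0.015, beta=10, gamma=0.001)
        print(snake.shape)   # (200, 2) refined outline points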

  10. [Development method of healthcare information system integration based on business collaboration model].

    Science.gov (United States)

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is the key to hospital information construction due to the complexity of the healthcare environment. Currently, during the process of healthcare information system integration, people participating in an integration project usually communicate through free-format documents, which impairs the efficiency and adaptability of integration. A method utilizing business process model and notation (BPMN) to model integration requirements and automatically transform them into executable integration configurations was proposed in this paper. Based on the method, a tool was developed to model integration requirements and transform them into integration configurations. In addition, an integration case in a radiology scenario was used to verify the method.

  11. Dietary integration with natural extract in rabbit: effects on growth performances and meat quality.

    Directory of Open Access Journals (Sweden)

    Sara Chiapparini

    2018-06-01

    Full Text Available In many countries of Europe rabbit meat is consumed for its nutritional characteristics (Dalle Zotte, 2014; Hernández and Gondret, 2006). Since the ban on the use of antibiotics as growth promoters, natural substances with antioxidant, anti-inflammatory, antimicrobial and antiviral properties have been studied as alternatives. The aim was to evaluate the effect of dietary supplementation with a natural extract mixture in growing rabbits on growth performances, carcass characteristics and Longissimus lumborum (LL) muscle parameters. The trial was performed at the Research Institute for Animal Production (Nitra, Slovak Republic) and lasted 42 days. At 35 days of age, 144 New Zealand White rabbits were randomly selected and divided into 3 experimental groups (4 rabbits/cage). The first group was fed a basal diet; the second (T1) and the third (T2) received 0.3% and 0.6% of a natural extract mixture containing polyphenols from plants and seaweeds. Dietary integration with the natural extract improved (P<0.05) growth performances (ADG, FI and FC) in the T1 group. The fatty acid composition of LL muscle was positively affected (P=0.037) by natural extract supplementation, with an increase of n-3 FA in the T2 group compared with the other treatments. Cholesterol content tended to be lower in the T2 group (P=0.082) than in the T1 and C groups (24.8 mg/100g T2 vs 34.6 mg/100g T1 vs 33.2 mg/100g C). Sensory analysis revealed that only the aroma was affected (P<0.05) by dietary treatments. Overall these results highlight that dietary supplementation with a natural extract mixture containing polyphenols from plants and seaweeds enhances growth performances and carcass weight, improving LL muscle nutritional parameters.

  12. AN INTERACTIVE LOGISTICS CENTRE INFORMATION INTEGRATION SYSTEM USING VIRTUAL REALITY

    Directory of Open Access Journals (Sweden)

    S. Hong

    2018-04-01

    Full Text Available The logistics industry plays a very important role in the operation of modern cities. Meanwhile, the development of the logistics industry has given rise to various urgent problems, such as the safety of logistics products. This paper combines the study of logistics industry traceability and logistics centre environment safety supervision with virtual reality technology, and creates an interactive logistics centre information integration system. The proposed system utilizes the immersive character of virtual reality to simulate the real logistics centre scene distinctly, so that operating staff can conduct safety supervision training at any time without regional restrictions. On the one hand, a large amount of sensor data can be used to simulate a variety of disaster emergency situations. On the other hand, personnel operation data can be collected to analyse improper operation, which can greatly improve training efficiency.

  13. An Interactive Logistics Centre Information Integration System Using Virtual Reality

    Science.gov (United States)

    Hong, S.; Mao, B.

    2018-04-01

    The logistics industry plays a very important role in the operation of modern cities. Meanwhile, the development of the logistics industry has given rise to various urgent problems, such as the safety of logistics products. This paper combines the study of logistics industry traceability and logistics centre environment safety supervision with virtual reality technology, and creates an interactive logistics centre information integration system. The proposed system utilizes the immersive character of virtual reality to simulate the real logistics centre scene distinctly, so that operating staff can conduct safety supervision training at any time without regional restrictions. On the one hand, a large amount of sensor data can be used to simulate a variety of disaster emergency situations. On the other hand, personnel operation data can be collected to analyse improper operation, which can greatly improve training efficiency.

  14. Integrated System Technologies for Modular Trapped Ion Quantum Information Processing

    Science.gov (United States)

    Crain, Stephen G.

    Although trapped ion technology is well-suited for quantum information science, scalability of the system remains one of the main challenges. One of the challenges associated with scaling the ion trap quantum computer is the ability to individually manipulate the increasing number of qubits. Using micro-mirrors fabricated with micro-electromechanical systems (MEMS) technology, laser beams are focused on individual ions in a linear chain, and the focal point is steered in two dimensions. Multiple single-qubit gates are demonstrated on trapped 171Yb+ qubits and the gate performance is characterized using quantum state tomography. The system features negligible crosstalk to neighboring ions (...). The technologies demonstrated in this thesis can be integrated to form a single quantum register with all of the necessary resources to perform local gates as well as high fidelity readout and provide a photon link to other systems.

  15. Integrated multimedia information system on interactive CATV network

    Science.gov (United States)

    Lee, Meng-Huang; Chang, Shin-Hung

    1998-10-01

    In current CATV system architectures, operators provide one-way delivery of a common menu of entertainment to all homes through the cable network. As technology has evolved, interactive (two-way) services can now be provided in cable TV systems. They can supply customers with individualized programming and support real-time two-way communications. With the service type changing from one-way delivery systems to two-way interactive systems, 'on-demand services' are a distinct feature of multimedia systems. In this paper, we present our work of building up an integrated multimedia system on an interactive CATV network at Shih Chien University. Besides providing the traditional analog TV programming from the cable operator, we filter some channels to reserve them as our campus information channels. In addition to the analog broadcasting channel, the system also provides interactive digital multimedia services, e.g. Video-On-Demand (VOD), Virtual Reality, BBS, World-Wide-Web, and an Internet Radio Station. These two kinds of services are integrated in a CATV network by separating the frequency allocation for the analog broadcasting service and the digital interactive services. Our ongoing work is to port our previous work of building a VOD system conforming to the DAVIC standard (for interoperability concerns) on an Ethernet network into the current system.

  16. MIMI: multimodality, multiresource, information integration environment for biomedical core facilities.

    Science.gov (United States)

    Szymanski, Jacek; Wilson, David L; Zhang, Guo-Qiang

    2009-10-01

    The rapid expansion of biomedical research has brought substantial scientific and administrative data management challenges to modern core facilities. Scientifically, a core facility must be able to manage experimental workflow and the corresponding set of large and complex scientific data. It must also disseminate experimental data to relevant researchers in a secure and expedient manner that facilitates collaboration and provides support for data interpretation and analysis. Administratively, a core facility must be able to manage the scheduling of its equipment and to maintain a flexible and effective billing system to track material, resource, and personnel costs and charge for services to sustain its operation. It must also have the ability to regularly monitor the usage and performance of its equipment and to provide summary statistics on resources spent on different categories of research. To address these informatics challenges, we introduce a comprehensive system called MIMI (multimodality, multiresource, information integration environment) that integrates the administrative and scientific support of a core facility into a single web-based environment. We report the design, development, and deployment experience of a baseline MIMI system at an imaging core facility and discuss the general applicability of such a system in other types of core facilities. These initial results suggest that MIMI will be a unique, cost-effective approach to addressing the informatics infrastructure needs of core facilities and similar research laboratories.

  17. Modular multiple sensors information management for computer-integrated surgery.

    Science.gov (United States)

    Vaccarella, Alberto; Enquobahrie, Andinet; Ferrigno, Giancarlo; Momi, Elena De

    2012-09-01

    In the past 20 years, technological advancements have modified the concept of modern operating rooms (ORs) with the introduction of computer-integrated surgery (CIS) systems, which promise to enhance the outcomes, safety and standardization of surgical procedures. With CIS, different types of sensor (mainly position-sensing devices, force sensors and intra-operative imaging devices) are widely used. Recently, the need for combined use of different sensors has raised issues related to synchronization and spatial consistency of data from different sources of information. In this study, we propose a centralized, multi-sensor management software architecture for a distributed CIS system, which addresses sensor information consistency in both space and time. The software was developed as a data server module in a client-server architecture, using two open-source software libraries: the Image-Guided Surgery Toolkit (IGSTK) and OpenCV. The ROBOCAST project (FP7 ICT 215190), which aims at integrating robotic and navigation devices and technologies in order to improve the outcome of surgical intervention, was used as the benchmark. An experimental protocol was designed to prove the feasibility of a centralized module for data acquisition and to test the application latency when dealing with optical and electromagnetic tracking systems and ultrasound (US) imaging devices. Our results show that a centralized approach is suitable for minimizing synchronization errors; latency in the client-server communication was estimated to be 2 ms (median value) for tracking systems and 40 ms (median value) for US images. The proposed centralized approach proved to be adequate for neurosurgery requirements. Latency introduced by the proposed architecture does not affect tracking system performance in terms of frame rate, and limits the US image frame rate to 25 fps, which is acceptable for providing visual feedback to the surgeon in the OR. Copyright © 2012 John Wiley & Sons, Ltd.

  18. Annotating novel genes by integrating synthetic lethals and genomic information

    Directory of Open Access Journals (Sweden)

    Faty Mahamadou

    2008-01-01

    Full Text Available Abstract Background Large scale screening for synthetic lethality serves as a common tool in yeast genetics to systematically search for genes that play a role in specific biological processes. Often the amounts of data resulting from a single large scale screen far exceed the capacities of experimental characterization of every identified target. Thus, there is a need for computational tools that select promising candidate genes in order to reduce the number of follow-up experiments to a manageable size. Results We analyze synthetic lethality data for arp1 and jnm1, two spindle migration genes, in order to identify novel members of this process. To this end, we use an unsupervised statistical method that integrates additional information from biological data sources, such as gene expression, phenotypic profiling, RNA degradation and sequence similarity. Different from existing methods that require large amounts of synthetic lethal data, our method relies merely on synthetic lethality information from two single screens. Using a Multivariate Gaussian Mixture Model, we determine the best subset of features that assign the target genes to two groups. The approach identifies a small group of genes as candidates involved in spindle migration. Experimental testing confirms the majority of our candidates, and we present she1 (YBL031W) as a novel gene involved in spindle migration. We also applied the statistical methodology to TOR2 signaling as a second example. Conclusion We demonstrate the general use of Multivariate Gaussian Mixture Modeling for selecting candidate genes for experimental characterization from synthetic lethality data sets. For the given example, integration of different data sources contributes to the identification of genetic interaction partners of arp1 and jnm1 that play a role in the same biological process.
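
    As a rough illustration of the candidate-selection step described above (not the authors' code; the feature matrix, component choice and cutoff are hypothetical stand-ins), a two-component Gaussian mixture can be fitted to integrated per-gene features and genes ranked by their posterior membership in the smaller component:

        import numpy as np
        from sklearn.mixture import GaussianMixture

        # Rows: target genes from the two screens; columns: integrated features
        # (e.g. expression correlation, phenotype similarity, sequence similarity).
        # Values here are random stand-ins for the real measurements.
        rng = np.random.default_rng(0)
        features = rng.normal(size=(120, 4))

        # Fit a two-component multivariate Gaussian mixture, as in the record above.
        gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
        gmm.fit(features)

        # Posterior probability of each gene belonging to each component.
        posteriors = gmm.predict_proba(features)

        # Treat the smaller component as the putative "spindle migration" group
        # (an assumption for this sketch) and rank genes by membership probability.
        candidate_component = int(np.argmin(gmm.weights_))
        candidate_scores = posteriors[:, candidate_component]
        candidates = np.argsort(candidate_scores)[::-1][:10]
        print("Top candidate gene indices:", candidates)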

  19. Contextual Sensing: Integrating Contextual Information with Human and Technical Geo-Sensor Information for Smart Cities

    Science.gov (United States)

    Sagl, Günther; Resch, Bernd; Blaschke, Thomas

    2015-01-01

    In this article we critically discuss the challenge of integrating contextual information, in particular spatiotemporal contextual information, with human and technical sensor information, which we approach from a geospatial perspective. We start by highlighting the significance of context in general and spatiotemporal context in particular and introduce a smart city model of interactions between humans, the environment, and technology, with context at the common interface. We then focus on both the intentional and the unintentional sensing capabilities of today’s technologies and discuss current technological trends that we consider have the ability to enrich human and technical geo-sensor information with contextual detail. The different types of sensors used to collect contextual information are analyzed and sorted into three groups on the basis of names considering frequently used related terms, and characteristic contextual parameters. These three groups, namely technical in situ sensors, technical remote sensors, and human sensors are analyzed and linked to three dimensions involved in sensing (data generation, geographic phenomena, and type of sensing). In contrast to other scientific publications, we found a large number of technologies and applications using in situ and mobile technical sensors within the context of smart cities, and surprisingly limited use of remote sensing approaches. In this article we further provide a critical discussion of possible impacts and influences of both technical and human sensing approaches on society, pointing out that a larger number of sensors, increased fusion of information, and the use of standardized data formats and interfaces will not necessarily result in any improvement in the quality of life of the citizens of a smart city. This article seeks to improve our understanding of technical and human geo-sensing capabilities, and to demonstrate that the use of such sensors can facilitate the integration of different

  1. 48 CFR 9.104-6 - Federal Awardee Performance and Integrity Information System.

    Science.gov (United States)

    2010-10-01

    ... Performance and Integrity Information System. 9.104-6 Section 9.104-6 Federal Acquisition Regulations System... Contractors 9.104-6 Federal Awardee Performance and Integrity Information System. (a) Before awarding a... Federal Awardee Performance and Integrity Information System (FAPIIS), (available at www.ppirs.gov, then...

  2. The fruit extract of Berberis crataegina DC: exerts potent antioxidant activity and protects DNA integrity.

    Science.gov (United States)

    Charehsaz, Mohammad; Sipahi, Hande; Celep, Engin; Üstündağ, Aylin; Cemiloğlu Ülker, Özge; Duydu, Yalçın; Aydın, Ahmet; Yesilada, Erdem

    2015-04-17

    Dried fruits of Berberis crataegina (Berberidaceae) are frequently consumed as a food garniture in Turkish cuisine, while the fruit paste has been used to increase stamina and, in particular, to prevent cardiovascular dysfunction in the Northeastern Black Sea region of Turkey. This study investigated this folkloric information in order to explain the claimed healing effects as well as to evaluate possible risks. Total phenolic, flavonoid and proanthocyanidin contents and the antioxidant capacity of the methanolic fruit extract were evaluated through several in vitro assays. The cytotoxic and genotoxic effects of B. crataegina fruit extract were also assessed in both a cervical cancer cell line (HeLa) and human peripheral blood lymphocytes. The extract showed protective effects against ferric-induced oxidative stress and relatively good antioxidant activity. It also ameliorated H2O2-mediated DNA damage in lymphocytes, suggesting a protective effect against oxidative DNA damage. The methanolic extract of B. crataegina fruits may be a potential antioxidant nutrient and may also exert a protective role against lipid peroxidation as well as oxidative DNA damage.

  3. Mechanisms for integration of information models across related domains

    Science.gov (United States)

    Atkinson, Rob

    2010-05-01

    It is well recognised that there are opportunities and challenges in cross-disciplinary data integration. A significant barrier, however, is creating a conceptual model of the combined domains and the area of integration. For example, a groundwater domain application may require information from several related domains: geology, hydrology, water policy, etc. Each domain may have its own data holdings and conceptual models, but these will share various common concepts (e.g. the concept of an aquifer). These areas of semantic overlap present significant challenges: first to choose a single representation (model) of a concept that appears in multiple disparate models, then to harmonise the other models with that single representation. In addition, models may exist at different levels of abstraction depending on how closely aligned they are with a particular implementation. This makes it hard for modellers in one domain to introduce elements from another domain without either introducing a specific style of implementation or, conversely, dealing with a set of abstract patterns that are hard to integrate with existing implementations. Models are easier to integrate if they are broken down into small units, with common concepts implemented using common models from well-known, and predictably managed, shared libraries. This vision, however, requires development of a set of mechanisms (tools and procedures) for implementing and exploiting libraries of model components. These mechanisms need to handle publication, discovery, subscription, versioning and implementation of models in different forms. In this presentation a coherent suite of such mechanisms is proposed, using a scenario based on re-use of geosciences models. This approach forms the basis of a comprehensive strategy to empower domain modellers to create more interoperable systems. The strategy addresses a range of concerns and practice, and includes methodologies, an accessible toolkit, improvements to available

  4. RESEARCH ON REMOTE SENSING GEOLOGICAL INFORMATION EXTRACTION BASED ON OBJECT ORIENTED CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    H. Gao

    2018-04-01

    Full Text Available Northern Tibet belongs to the sub-cold arid climate zone of the plateau. It is rarely visited and the geological working conditions are very poor; however, the stratum exposures are good and human interference is minimal. Research on the automatic classification and extraction of remote sensing geological information therefore has typical significance and good application prospects. Based on object-oriented classification in northern Tibet, using Worldview-2 high-resolution remote sensing data combined with tectonic information and image enhancement, the lithological spectral features, shape features, spatial locations and topological relations of various geological information were extracted. By setting thresholds within a hierarchical classification, eight kinds of geological information were classified and extracted. Compared with existing geological maps, the accuracy analysis shows an overall accuracy of 87.8561 %, indicating that the object-oriented classification method is effective and feasible for this study area and provides a new idea for the automatic extraction of remote sensing geological information.
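
    A minimal sketch of threshold-based hierarchical classification of segmented image objects, in the spirit of the workflow above; every feature name and threshold value here is a hypothetical placeholder, not the study's actual rule set:

        # Minimal sketch: each segmented object carries features computed from
        # the imagery (spectral ratios, brightness, terrain). Rules are applied
        # hierarchically; the first matching rule assigns the class.
        def classify_object(obj):
            if obj["ndvi"] > 0.3:                 # vegetated cover first
                return "vegetation"
            if obj["brightness"] > 180:           # bright, flat objects
                return "salt_crust" if obj["slope"] < 2 else "scree"
            if obj["swir_ratio"] > 1.4:           # spectral rule for carbonates
                return "carbonate_stratum"
            return "clastic_stratum"              # fallback class

        objects = [
            {"ndvi": 0.05, "brightness": 200, "slope": 1.0, "swir_ratio": 1.1},
            {"ndvi": 0.45, "brightness": 90,  "slope": 5.0, "swir_ratio": 0.9},
        ]
        print([classify_object(o) for o in objects])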

  5. From false integration of viewers on informative TV

    Directory of Open Access Journals (Sweden)

    Felisbela Lopes

    2011-12-01

    Full Text Available New media's fast-paced technologies are constantly feeding the contemporary (tele)spectator with the promise of "empowerment". In the last decades, a notion of the "empowered user" has been built around the mythical narrative of 'omnipotence', which becomes the verb and the active complement to the modern promise of 'omnividence': the one who sees everything can do everything. However, when we scan the news and information broadcasts of Portuguese TV, we find few traces of this supposedly "empowered" spectator. Broadcasts scarcely allow for structural participation by their public, or they include spectators only in euphemistic terms, letting them in just for the sake of having them in, thus treating the public not as citizens but as audiences and revealing a false "empowerment". In this article we analyze the integration of TV spectators in a total of 1673 news and information broadcasts on 6 Portuguese channels (RTP1, SIC, TVI, SICN, RTPN, TVI 24). This work is part of a research project called "Television journalism and citizenship".

  6. Clinical Information Systems Integration in New York City's First Mobile Stroke Unit.

    Science.gov (United States)

    Kummer, Benjamin R; Lerario, Michael P; Navi, Babak B; Ganzman, Adam C; Ribaudo, Daniel; Mir, Saad A; Pishanidar, Sammy; Lekic, Tim; Williams, Olajide; Kamel, Hooman; Marshall, Randolph S; Hripcsak, George; Elkind, Mitchell S V; Fink, Matthew E

    2018-01-01

    Mobile stroke units (MSUs) reduce time to thrombolytic therapy in acute ischemic stroke. These units are widely used, but the clinical information systems underlying MSU operations are understudied. The first MSU on the East Coast of the United States was established at New York Presbyterian Hospital (NYP) in October 2016. We describe our program's 7-month pilot, focusing on the integration of our hospital's clinical information systems into our MSU to support patient care and research efforts. NYP's MSU was staffed by two paramedics, one radiology technologist, and a vascular neurologist. The unit was equipped with four laptop computers and networking infrastructure enabling all staff to access the hospital intranet and clinical applications during operating hours. A telephone-based registration procedure registered patients from the field into our admit/discharge/transfer system, which interfaced with the institutional electronic health record (EHR). We developed and implemented a computerized physician order entry set in our EHR with prefilled values to permit quick ordering of medications, imaging, and laboratory testing. We also developed and implemented a structured clinician note to facilitate care documentation and clinical data extraction. Our MSU began operating on October 3, 2016. As of April 27, 2017, the MSU transported 49 patients, of whom 16 received tissue plasminogen activator (t-PA). Zero technical problems impacting patient care were reported around registration, order entry, or intranet access. Two onboard network failures occurred, resulting in computed tomography scanner malfunctions, although no patients became ineligible for time-sensitive treatment as a result. Thirteen (26.5%) clinical notes contained at least one incomplete time field. The main technical challenges encountered during the integration of our hospital's clinical information systems into our MSU were onboard network failures and incomplete clinical documentation. Future

  7. Weather information integration in transportation management center (TMC) operations.

    Science.gov (United States)

    2011-01-02

    This report presents the results of the third phase of an on-going FHWA study on weather integration in Transportation Management Center (TMC) operations. The report briefly describes the earlier phases of the integration study, summarizes the findin...

  8. Extract transformation loading from OLTP to OLAP data using pentaho data integration

    Science.gov (United States)

    Salaki, R. J.; Waworuntu, J.; Tangkawarow, I. R. H. T.

    2016-04-01

    The design of the data warehouse in this case is expected to solve the problem of evaluating learning results, as well as the relevance of the information received, to support decision-making by the leadership. Data warehouse design is very important and is intended to utilize existing information resources. The GPA (Grade Point Average) data warehouse can be used for evaluation, decision making, and further planning of the PTIK study program. The diversity of data sources in the PTIK course makes decision-making and evaluation less straightforward. Pentaho Data Integration is used to integrate the PTIK data easily. The GPA data warehouse is designed with a multidimensional database modeling approach using dimension tables and fact tables.
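
    As a rough sketch of the extract-transform-load step the record describes (table and column names are hypothetical; a real deployment would express this as a Pentaho Data Integration transformation rather than hand-written code), the snippet below aggregates normalized OLTP grade records into a star schema with a student dimension and a GPA fact table:

        import sqlite3

        # In-memory stand-ins for the OLTP source and OLAP target databases.
        oltp = sqlite3.connect(":memory:")
        olap = sqlite3.connect(":memory:")

        oltp.executescript("""
            CREATE TABLE grades (student_id, course_id, semester, grade_point, credits);
            INSERT INTO grades VALUES (1, 'CS101', '2015-1', 4.0, 3),
                                      (1, 'CS102', '2015-1', 3.0, 2),
                                      (2, 'CS101', '2015-1', 2.0, 3);
        """)

        olap.executescript("""
            CREATE TABLE dim_student (student_key INTEGER PRIMARY KEY, student_id);
            CREATE TABLE fact_gpa (student_key, semester, gpa, total_credits);
        """)

        # Extract normalized grade rows, transform them into per-student,
        # per-semester GPA aggregates, and load dimension and fact tables.
        rows = oltp.execute("""
            SELECT student_id, semester,
                   SUM(grade_point * credits) / SUM(credits) AS gpa,
                   SUM(credits) AS total_credits
            FROM grades GROUP BY student_id, semester
        """).fetchall()

        for student_id, semester, gpa, credits in rows:
            cur = olap.execute("INSERT INTO dim_student (student_id) VALUES (?)",
                               (student_id,))
            olap.execute("INSERT INTO fact_gpa VALUES (?, ?, ?, ?)",
                         (cur.lastrowid, semester, gpa, credits))

        print(olap.execute("SELECT * FROM fact_gpa").fetchall())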

  9. Mass extraction container closure integrity physical testing method development for parenteral container closure systems.

    Science.gov (United States)

    Yoon, Seung-Yil; Sagi, Hemi; Goldhammer, Craig; Li, Lei

    2012-01-01

    Container closure integrity (CCI) is a critical factor to ensure that product sterility is maintained over its entire shelf life. Assuring the CCI during container closure (C/C) system qualification, routine manufacturing and stability is important. FDA guidance also encourages industry to develop a CCI physical testing method in lieu of sterility testing in a stability program. A mass extraction system has been developed to check CCI for a variety of container closure systems such as vials, syringes, and cartridges. Various types of defects (e.g., glass micropipette, laser drill, wire) were created and used to demonstrate a detection limit. Leakage, detected as mass flow in this study, changes as a function of defect length and diameter. Therefore, the morphology of defects has been examined in detail with fluid theories. This study demonstrated that a mass extraction system was able to distinguish between intact samples and samples with 2 μm defects reliably when the defect was exposed to air, water, placebo, or drug product (3 mg/mL concentration) solution. Also, it has been verified that the method was robust, and capable of determining the acceptance limit using 3σ for syringes and 6σ for vials. Sterile products must maintain their sterility over their entire shelf life. Container closure systems such as those found in syringes and vials provide a seal between rubber and glass containers. This seal must be ensured to maintain product sterility. A mass extraction system has been developed to check container closure integrity for a variety of container closure systems such as vials, syringes, and cartridges. In order to demonstrate the method's capability, various types of defects (e.g., glass micropipette, laser drill, wire) were created in syringes and vials and were tested. This study demonstrated that a mass extraction system was able to distinguish between intact samples and samples with 2 μm defects reliably when the defect was exposed to air, water
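
    The record above relates leak rate to defect geometry through fluid theory. For laminar flow through an idealized cylindrical defect, the classical Hagen-Poiseuille relation (a textbook result, supplied here for context, not a formula quoted from the study) gives the volumetric leak rate as

        Q = \frac{\pi r^{4} \, \Delta p}{8 \mu L}

    where r is the defect radius, L the defect channel length, \mu the fluid viscosity, and \Delta p the pressure differential across the closure. The strong r^{4} dependence is why leakage varies so sharply with defect diameter, and why a mass-extraction measurement can resolve micrometre-scale defects.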

  10. Research on monitoring and management information integration technique in waste treatment and management

    International Nuclear Information System (INIS)

    Kong Jinsong; Yu Ren; Mao Wei

    2013-01-01

    The integration of waste treatment process and device status monitoring information with management information is a key problem that must be solved in the information integration of waste treatment and management. The main content of monitoring and management information integration is discussed in the paper. Data exchange techniques based on OPC, FTP and data push technology are applied to the different monitoring systems, according to their respective development platforms, to realize the integration of waste treatment process and device status monitoring information with management information in a waste treatment center. (authors)

  11. Integrated System Validation Usability Questionnaire: Information Display Element

    International Nuclear Information System (INIS)

    Garcés, Ma. I.; Torralba, B.

    2015-01-01

    The Research and Development (R&D) project on “Theoretical and Methodological Approaches to Integrated System Validation of Control Rooms, 2014-2015”, in which the research activities described in this report are framed, has two main objectives: to develop the items for a usability methodology, conceived as part of the measurement framework for performance-based control room evaluation that the OECD Halden Reactor Project will test in the experiments planned for 2015; and to statistically analyse the data generated in the experimental activities of the Halden Man-Machine Laboratory (HAMMLAB) facility with previous usability questionnaires in 2010 and 2011. This report describes the procedure designed to meet the first goal of the project, in particular the process followed to identify the items related to information displays, one of the elements to be included in the usability questionnaire. Three phases were performed. In the first, the approaches developed by the United States Nuclear Regulatory Commission (NRC) are reviewed, and the models proposed by the nuclear energy industry and its technical support organizations, mainly the United States Electric Power Research Institute (EPRI), are analyzed. In the remaining stages, general and specific guidelines for information displays (in particular, display pages, formats, elements, and data quality and update rate recommendations) are compared, and criteria are defined for the preliminary selection of the items that should be incorporated into the usability questionnaire. This proposal will be reviewed and adapted by the Halden Reactor Project to the design of the specific experiments performed in HAMMLAB.

  12. Extracting information from two-dimensional electrophoresis gels by partial least squares regression

    DEFF Research Database (Denmark)

    Jessen, Flemming; Lametsch, R.; Bendixen, E.

    2002-01-01

    Two-dimensional gel electrophoresis (2-DE) produces large amounts of data, and extraction of relevant information from these data demands a cautious and time consuming process of spot pattern matching between gels. The classical approach of data analysis is to detect protein markers that appear or disappear depending on the experimental conditions. Such biomarkers are found by comparing the relative volumes of individual spots in the individual gels. Multivariate statistical analysis and modelling of 2-DE data for comparison and classification is an alternative approach utilising the combination of all proteins/spots in the gels. In the present study it is demonstrated how information can be extracted by multivariate data analysis. The strategy is based on partial least squares regression followed by variable selection to find proteins that individually or in combination with other proteins vary ...
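
    A minimal sketch of the stated strategy, partial least squares regression on spot volumes followed by a simple variable selection, using scikit-learn on synthetic stand-ins for real 2-DE spot volumes:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(1)
        n_gels, n_spots = 30, 500
        X = rng.normal(size=(n_gels, n_spots))   # spot volumes per gel
        # Synthetic response driven by two spots plus noise.
        y = X[:, 10] - 0.5 * X[:, 42] + rng.normal(scale=0.1, size=n_gels)

        # Fit a PLS model relating spot volumes to the experimental response.
        pls = PLSRegression(n_components=2)
        pls.fit(X, y)

        # Crude variable selection: keep the spots with the largest absolute
        # regression coefficients, i.e. those that (alone or jointly) track y.
        coefs = np.abs(pls.coef_).ravel()
        selected = np.argsort(coefs)[::-1][:10]
        print("Candidate protein spots:", selected)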

  13. ROLE OF INTEGRATIVE ENTERPRENEURIAL CULTURE IN THE INFORMATION SOCIETY

    OpenAIRE

    Malysheva, E.V.

    2014-01-01

    The article deals with the concepts of universal entrepreneurial culture and integrative entrepreneurial culture. It studies the main characteristics of both, explores the concepts of «knowledge management» and «diversity management», and presents real examples of integrative entrepreneurial culture in companies.

  14. Addressing Risk Assessment for Patient Safety in Hospitals through Information Extraction in Medical Reports

    Science.gov (United States)

    Proux, Denys; Segond, Frédérique; Gerbier, Solweig; Metzger, Marie Hélène

    Hospital Acquired Infections (HAI) are a real burden for doctors and risk surveillance experts. The impact on patients' health and the related healthcare cost is very significant and a major concern even for rich countries. Furthermore, the data required to evaluate the threat are generally not available to experts, which prevents fast reaction. However, recent advances in computational intelligence techniques such as information extraction, risk pattern detection in documents, and decision support systems now make it possible to address this problem.

  15. From Specific Information Extraction to Inferences: A Hierarchical Framework of Graph Comprehension

    Science.gov (United States)

    2004-09-01

    The skill to interpret the information displayed in graphs is so important that the National Council of Teachers of Mathematics has created guidelines to ensure that students learn these skills (NCTM: Standards for Mathematics, 2003). These guidelines are based primarily on the extraction of ...

  16. The Proteins API: accessing key integrated protein and genome information.

    Science.gov (United States)

    Nightingale, Andrew; Antunes, Ricardo; Alpi, Emanuele; Bursteinas, Borisas; Gonzales, Leonardo; Liu, Wudong; Luo, Jie; Qi, Guoying; Turner, Edd; Martin, Maria

    2017-07-03

    The Proteins API provides searching and programmatic access to protein and associated genomics data, such as curated protein sequence positional annotations from UniProtKB, as well as mapped variation and proteomics data from large scale data sources (LSS). Using the coordinates service, researchers can retrieve the genomic sequence coordinates for proteins in UniProtKB. The LSS genomics and proteomics data for UniProt proteins are available programmatically only through this service. A Swagger UI has been implemented to provide documentation and an interface that lets users with little or no programming experience 'talk' to the services, quickly and easily formulate queries, and obtain dynamically generated source code for popular programming languages, such as Java, Perl, Python and Ruby. Search results are returned as standard JSON, XML or GFF data objects. The Proteins API is a scalable, reliable, fast, easy-to-use RESTful service that provides a broad protein information resource, allowing users to ask questions based upon their field of expertise and to gain an integrated overview of the protein annotations available to aid their understanding of proteins in biological processes. The Proteins API is available at (http://www.ebi.ac.uk/proteins/api/doc). © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
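
    As a small usage sketch (the /proteins/{accession} endpoint path and the JSON field names are assumed from the service documentation linked above, and P05067 is just an arbitrary example accession), a REST client can fetch a single UniProtKB entry as JSON:

        import requests

        # Fetch curated annotations for one UniProtKB entry as JSON.
        # Endpoint path and field names assumed from the Proteins API docs;
        # P05067 (human amyloid precursor protein) is an example accession.
        url = "https://www.ebi.ac.uk/proteins/api/proteins/P05067"
        response = requests.get(url, headers={"Accept": "application/json"})
        response.raise_for_status()

        entry = response.json()
        print("Accession:", entry.get("accession"))
        print("Sequence length:", entry.get("sequence", {}).get("length"))
        print("Number of annotations:", len(entry.get("features", [])))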

  17. Integrated Shoreline Extraction Approach with Use of Rasat MS and SENTINEL-1A SAR Images

    Science.gov (United States)

    Demir, N.; Oy, S.; Erdem, F.; Şeker, D. Z.; Bayram, B.

    2017-09-01

    Shorelines are complex ecosystems and highly important socio-economic environments. They may change rapidly due to both natural and human-induced effects. Determination of movements along the shoreline and monitoring of the changes are essential for coastline management, modeling of sediment transportation and decision support systems. Remote sensing provides an opportunity to obtain rapid, up-to-date and reliable information for monitoring of shorelines. In this study, approximately 120 km of the Antalya-Kemer shoreline, which is under threat from erosion, deposition, a growing population, urbanization and tourist hotels, was selected as the study area. RASAT pansharpened and SENTINEL-1A SAR images were used to implement the proposed shoreline extraction methods. The main motivation of this study is to combine the land/water body segmentation results of both RASAT MS and SENTINEL-1A SAR images to improve the quality of the results. The initial land/water body segmentation was obtained from the RASAT image by means of the Random Forest classification method. This result was used as a training data set to define fuzzy parameters for shoreline extraction from the SENTINEL-1A SAR image. The obtained results were compared with a manually digitized shoreline. The accuracy assessment was performed by calculating perpendicular distances between the reference data and the shoreline extracted by the proposed method. As a result, the mean difference was calculated to be around 1 pixel.
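
    A minimal sketch of the first step, Random Forest land/water classification of multispectral pixels; the band values and labels below are synthetic stand-ins for the RASAT data:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(2)

        # Synthetic stand-ins for pansharpened multispectral pixels:
        # columns are band reflectances; labels are 0 = water, 1 = land.
        water = rng.normal([0.05, 0.08, 0.10, 0.02], 0.02, size=(500, 4))
        land = rng.normal([0.20, 0.25, 0.30, 0.40], 0.05, size=(500, 4))
        X = np.vstack([water, land])
        y = np.array([0] * 500 + [1] * 500)

        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X, y)

        # Classify a new pixel; the land/water boundary of the resulting mask
        # would then seed the fuzzy shoreline extraction on the SAR image.
        print(clf.predict([[0.06, 0.09, 0.11, 0.03]]))  # expected: water (0)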

  18. Extraction of land cover change information from ENVISAT-ASAR data in Chengdu Plain

    Science.gov (United States)

    Xu, Wenbo; Fan, Jinlong; Huang, Jianxi; Tian, Yichen; Zhang, Yong

    2006-10-01

    Land cover data are essential to most global change research objectives, including the assessment of current environmental conditions and the simulation of future environmental scenarios that ultimately lead to public policy development. The Chinese Academy of Sciences generated a nationwide land cover database in order to quantify and spatially characterize land use/cover changes (LUCC) in the 1990s. To maintain the reliability of the database, it must be updated regularly. However, it is difficult to obtain remote sensing data for extracting land cover change information at large scale, and optical remote sensing data are hard to acquire over the Chengdu plain, so the objective of this research was to evaluate multitemporal ENVISAT advanced synthetic aperture radar (ASAR) data for extracting land cover change information. Based on fieldwork and the nationwide 1:100000 land cover database, the paper assesses several land cover changes in the Chengdu plain, for example: crop to buildings, forest to buildings, and forest to bare land. The results show that ENVISAT ASAR data have great potential for applications extracting land cover change information.

  19. KneeTex: an ontology-driven system for information extraction from MRI reports.

    Science.gov (United States)

    Spasić, Irena; Zhao, Bo; Jones, Christopher B; Button, Kate

    2015-01-01

    In the realm of knee pathology, magnetic resonance imaging (MRI) has the advantage of visualising all structures within the knee joint, which makes it a valuable tool for increasing diagnostic accuracy and planning surgical treatments. Therefore, clinical narratives found in MRI reports convey valuable diagnostic information. A range of studies have proven the feasibility of natural language processing for information extraction from clinical narratives. However, no study focused specifically on MRI reports in relation to knee pathology, possibly due to the complexity of knee anatomy and a wide range of conditions that may be associated with different anatomical entities. In this paper we describe KneeTex, an information extraction system that operates in this domain. As an ontology-driven information extraction system, KneeTex makes active use of an ontology to strongly guide and constrain text analysis. We used automatic term recognition to facilitate the development of a domain-specific ontology with sufficient detail and coverage for text mining applications. In combination with the ontology, high regularity of the sublanguage used in knee MRI reports allowed us to model its processing by a set of sophisticated lexico-semantic rules with minimal syntactic analysis. The main processing steps involve named entity recognition combined with coordination, enumeration, ambiguity and co-reference resolution, followed by text segmentation. Ontology-based semantic typing is then used to drive the template filling process. We adopted an existing ontology, TRAK (Taxonomy for RehAbilitation of Knee conditions), for use within KneeTex. The original TRAK ontology expanded from 1,292 concepts, 1,720 synonyms and 518 relationship instances to 1,621 concepts, 2,550 synonyms and 560 relationship instances. This provided KneeTex with a very fine-grained lexico-semantic knowledge base, which is highly attuned to the given sublanguage. Information extraction results were evaluated

  20. Integrated Modeling Approach for the Development of Climate-Informed, Actionable Information

    Directory of Open Access Journals (Sweden)

    David R. Judi

    2018-06-01

    Full Text Available Flooding is a prevalent natural disaster with both short- and long-term social, economic, and infrastructure impacts. Changes in the intensity and frequency of precipitation (including rain, snow, and rain-on-snow events) create challenges for the planning and management of resilient infrastructure and communities. While there is general acknowledgment that new infrastructure design should account for future climate change, no clear methods or actionable information are available to community planners and designers to ensure resilient designs considering an uncertain climate future. This research demonstrates an approach for an integrated, multi-model, and multi-scale simulation to evaluate future flood impacts. This research used regional climate projections to drive high-resolution hydrology and flood models to evaluate social, economic, and infrastructure resilience for the Snohomish Watershed, WA, USA. Using the proposed integrated modeling approach, the peaks of precipitation and streamflows were found to shift from spring and summer to the earlier winter season. Moreover, clear non-stationarities in future flood risk were discovered under various climate scenarios. This research provides a clear approach for the incorporation of climate science in flood resilience analysis and also provides actionable information relative to the frequency and intensity of future precipitation events.

  1. SAR matrices: automated extraction of information-rich SAR tables from large compound data sets.

    Science.gov (United States)

    Wassermann, Anne Mai; Haebel, Peter; Weskamp, Nils; Bajorath, Jürgen

    2012-07-23

    We introduce the SAR matrix data structure that is designed to elucidate SAR patterns produced by groups of structurally related active compounds, which are extracted from large data sets. SAR matrices are systematically generated and sorted on the basis of SAR information content. Matrix generation is computationally efficient and enables processing of large compound sets. The matrix format is reminiscent of SAR tables, and SAR patterns revealed by different categories of matrices are easily interpretable. The structural organization underlying matrix formation is more flexible than standard R-group decomposition schemes. Hence, the resulting matrices capture SAR information in a comprehensive manner.

  2. Analysis of Factors Affect to Organizational Performance In Using Accounting Information Systems Through Users Satisfaction and Integration Information Systems

    Directory of Open Access Journals (Sweden)

    Anton Arisman

    2017-09-01

    Full Text Available The aim of this research is to investigate the factors affecting organizational performance in using accounting information systems, through user satisfaction and information systems integration. The research respondents were 447 companies listed on the Indonesian Stock Exchange. The data were gathered through a consensus method, and in total there were 176 responses with complete data. Structural Equation Modeling (SEM) was used to analyze the data, and systems theory was utilized in this research. The results show that knowledge management systems and management control systems have a significant influence on user satisfaction and information systems integration. Information systems integration and user satisfaction have a positive, significant effect on organizational performance.

  3. Advanced Recovery and Integrated Extraction System (ARIES) program plan. Rev. 1

    International Nuclear Information System (INIS)

    Nelson, T.O.; Massey, P.W.; Cremers, T.L.

    1996-01-01

    The Advanced Recovery and Integrated Extraction System (ARIES) demonstration combines various technologies, some of which were or are being developed under previous or other Department of Energy (DOE) funded programs. ARIES is an overall processing system for the dismantlement of nuclear weapon primaries. The program will demonstrate dismantlement of nuclear weapons and retrieval of the plutonium in a form that is compatible with long-term storage and that is inspectable in an unclassified form appropriate for the application of traditional international safeguards. The success of the ARIES demonstration would lead to the development of transportable modular or other facility-type systems for weapons dismantlement to be used at other DOE sites as well as in other countries.

  4. Comparison of methods of extracting information for meta-analysis of observational studies in nutritional epidemiology

    Directory of Open Access Journals (Sweden)

    Jong-Myon Bae

    2016-01-01

    Full Text Available OBJECTIVES: A common method for conducting a quantitative systematic review (QSR) of observational studies in nutritional epidemiology is the “highest versus lowest intake” method (HLM), in which only the information concerning the effect size (ES) of the highest intake category of a food item, relative to its lowest category, is collected. In the interval collapsing method (ICM), a method suggested to enable maximum utilization of all available information, the ES information is collected by collapsing all categories into a single category. This study aimed to compare the ES and summary effect size (SES) between the HLM and the ICM. METHODS: A QSR evaluating citrus fruit intake and the risk of pancreatic cancer and calculating the SES by using the HLM was selected. The ES and SES were estimated by performing a meta-analysis using the fixed-effect model. The directionality and statistical significance of the ES and SES were used as criteria for determining the concordance between the HLM and ICM outcomes. RESULTS: No significant differences were observed in the directionality of the SES extracted by using the HLM or ICM. The application of the ICM, which uses a broader information base, yielded more consistent ES and SES, and narrower confidence intervals, than the HLM. CONCLUSIONS: The ICM is advantageous over the HLM owing to its higher statistical accuracy in extracting information for QSR in nutritional epidemiology. The application of the ICM should hence be recommended for future studies.
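
    The fixed-effect SES referred to above is conventionally computed by inverse-variance weighting of the per-study log effect sizes; a minimal sketch, with made-up relative risks and confidence intervals standing in for ES values extracted by either the HLM or the ICM:

        import numpy as np

        # Hypothetical per-study relative risks with 95% confidence intervals.
        rr = np.array([0.82, 0.90, 0.75])
        ci_low = np.array([0.65, 0.70, 0.60])
        ci_high = np.array([1.03, 1.16, 0.94])

        # Work on the log scale; recover each study's standard error from its CI.
        log_rr = np.log(rr)
        se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)

        # Fixed-effect model: inverse-variance weighted mean of log effect sizes.
        w = 1.0 / se**2
        log_ses = np.sum(w * log_rr) / np.sum(w)
        se_ses = np.sqrt(1.0 / np.sum(w))

        print("SES (RR): %.3f" % np.exp(log_ses))
        print("95%% CI: %.3f-%.3f" % (np.exp(log_ses - 1.96 * se_ses),
                                      np.exp(log_ses + 1.96 * se_ses)))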

  5. An integrated biomedical knowledge extraction and analysis platform: using federated search and document clustering technology.

    Science.gov (United States)

    Taylor, Donald P

    2007-01-01

    High content screening (HCS) requires time-consuming and often complex iterative information retrieval and assessment approaches to optimally conduct drug discovery programs and biomedical research. Pre- and post-HCS experimentation both require the retrieval of information from public as well as proprietary literature in addition to structured information assets such as compound libraries and projects databases. Unfortunately, this information is typically scattered across a plethora of proprietary bioinformatics tools and databases and public domain sources. Consequently, single search requests must be presented to each information repository, forcing the results to be manually integrated for a meaningful result set. Furthermore, these bioinformatics tools and data repositories are becoming increasingly complex to use; typically they fail to allow for more natural query interfaces. Vivisimo has developed an enterprise software platform to bridge disparate silos of information. The platform automatically categorizes search results into descriptive folders without the use of taxonomies to drive the categorization. A new approach to information retrieval for HCS experimentation is proposed.

  6. Integrating Multiple On-line Knowledge Bases for Disease-Lab Test Relation Extraction.

    Science.gov (United States)

    Zhang, Yaoyun; Soysal, Ergin; Moon, Sungrim; Wang, Jingqi; Tao, Cui; Xu, Hua

    2015-01-01

    A computable knowledge base containing relations between diseases and lab tests would be a great resource for many biomedical informatics applications. This paper describes our initial step towards establishing a comprehensive knowledge base of disease-lab test relations utilizing three public on-line resources. LabTestsOnline, MedlinePlus and Wikipedia are integrated to create a freely available, computable disease-lab test knowledge base. Disease and lab test concepts are identified using MetaMap, and relations between diseases and lab tests are determined based on source-specific rules. Experimental results demonstrate a high precision for relation extraction, with Wikipedia achieving the highest precision of 87%. Combining the three sources reached a recall of 51.40% when compared with a subset of disease-lab test relations extracted from a reference book. Moreover, we found additional disease-lab test relations in the on-line resources, indicating that they are complementary to existing reference books for building a comprehensive disease and lab test relation knowledge base.

  7. Feature extraction and learning using context cue and Rényi entropy based mutual information

    DEFF Research Database (Denmark)

    Pan, Hong; Olsen, Søren Ingvor; Zhu, Yaping

    2015-01-01

    In particular, for feature extraction, we develop a new set of kernel descriptors, Context Kernel Descriptors (CKD), which enhance the original KDES by embedding the spatial context into the descriptors. Context cues contained in the context kernel enforce some degree of spatial consistency, thus improving the robustness of the CKD. For feature learning and reduction, we propose a novel codebook learning method, based on a Rényi quadratic entropy based mutual information measure called Cauchy-Schwarz Quadratic Mutual Information (CSQMI), to learn a compact and discriminative CKD codebook. Projecting ... as the information about the underlying labels of the CKD using CSQMI. Thus the resulting codebook and reduced CKD are discriminative. We verify the effectiveness of our method on several public image benchmark datasets such as YaleB, Caltech-101 and CIFAR-10, as well as a challenging chicken feet dataset of our own.
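
    For reference, the Cauchy-Schwarz Quadratic Mutual Information named above is conventionally defined in the information-theoretic-learning literature (this definition is supplied here for context, not quoted from the record) as the Cauchy-Schwarz divergence between the joint density and the product of its marginals:

        I_{CS}(X;Y) = -\log \frac{\left( \iint p(x,y)\, p(x)\, p(y)\, dx\, dy \right)^{2}}{\iint p(x,y)^{2}\, dx\, dy \;\iint p(x)^{2}\, p(y)^{2}\, dx\, dy}

    It is non-negative, vanishes exactly when X and Y are independent, and its sample estimators build on the Rényi quadratic entropy H_{2}(X) = -\log \int p(x)^{2}\, dx, which admits a closed-form Parzen-window estimator.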

  8. Method of extracting significant trouble information of nuclear power plants using probabilistic analysis technique

    International Nuclear Information System (INIS)

    Shimada, Yoshio; Miyazaki, Takamasa

    2005-01-01

    In order to analyze and evaluate large amounts of trouble information from overseas nuclear power plants, it is necessary to select information that is significant in terms of both safety and reliability. In this research, a method was developed for efficiently and simply classifying the degrees of importance of components in terms of safety and reliability, paying attention to the root-cause components appearing in the information. For safety, the reactor core damage frequency (CDF), which is used in the probabilistic analysis of a reactor, was used. For reliability, the automatic plant trip probability (APTP), which is used in the probabilistic analysis of automatic reactor trips, was used. These two aspects were reflected in the development of criteria for classifying the degrees of importance of components. By applying these criteria, a simple method of extracting significant trouble information from overseas nuclear power plants was developed. (author)

  9. An integrated organisation-wide data quality management and information governance framework: theoretical underpinnings.

    Science.gov (United States)

    Liaw, Siaw-Teng; Pearce, Christopher; Liyanage, Harshana; Liaw, Gladys S S; de Lusignan, Simon

    2014-01-01

    Increasing investment in eHealth aims to improve cost effectiveness and safety of care. Data extraction and aggregation can create new data products to improve professional practice and provide feedback to improve the quality of source data. A previous systematic review concluded that locally relevant clinical indicators and use of clinical record systems could support clinical governance. We aimed to extend and update the review with a theoretical framework. We searched PubMed, Medline, Web of Science, ABI Inform (Proquest) and Business Source Premier (EBSCO) using the terms curation, information ecosystem, data quality management (DQM), data governance, information governance (IG) and data stewardship. We focused on and analysed the scope of DQM and IG processes, theoretical frameworks, and determinants of the processing, quality assurance, presentation and sharing of data across the enterprise. There are good theoretical reasons for integrated governance, but there is variable alignment of DQM, IG and health system objectives across the health enterprise. Ethical constraints exist that require health information ecosystems to process data in ways that are aligned with improving health and system efficiency and ensuring patient safety. Despite an increasingly 'big-data' environment, DQM and IG in health services are still fragmented across the data production cycle. We extend current work on DQM and IG with a theoretical framework for integrated IG across the data cycle. The dimensions of this theory-based framework would require testing with qualitative and quantitative studies to examine the applicability and utility, along with an evaluation of its impact on data quality across the health enterprise.

  11. Instantaneous Shoreline Extraction Utilizing Integrated Spectrum and Shadow Analysis From LiDAR Data and High-resolution Satellite Imagery

    Science.gov (United States)

    Lee, I.-Chieh

    Shoreline delineation and shoreline change detection are expensive processes in terms of data source acquisition and manual shoreline delineation. These costs confine the frequency and interval of shoreline mapping periods. In this dissertation, a new shoreline delineation approach was developed, targeting lower data source cost and reduced human labor. To lower the cost of data sources, we used public domain LiDAR data sets and satellite images to delineate shorelines without the requirement that the data sets be acquired simultaneously, which is a new concept in this field. To reduce the labor cost, we made improvements in classifying LiDAR points and satellite images. Analyzing shadow relations with topography to improve satellite image classification performance is also a brand-new concept. The extracted shoreline of the proposed approach could achieve an accuracy of 1.495 m RMSE, or 4.452 m at the 95% confidence level. Consequently, the proposed approach could successfully lower the cost and shorten the processing time, in other words, increase the shoreline mapping frequency with a reasonable accuracy. However, the extracted shoreline may not compete in accuracy with a shoreline extracted by aerial photogrammetric procedures; hence, this is a trade-off between cost and accuracy. The approach consists of three phases: first, a shoreline extraction procedure based mainly on LiDAR point cloud data with multispectral information from satellite images; second, an object-oriented shoreline extraction procedure to delineate the shoreline solely from satellite images, in this case WorldView-2 images; third, a shoreline integration procedure combining these two shorelines based on actual shoreline changes and physical terrain properties. The actual data source cost would come only from the acquisition of satellite images. On the other hand, only two processes needed human attention. First, the shoreline within harbor areas needed to be

  12. Automated concept-level information extraction to reduce the need for custom software and rules development.

    Science.gov (United States)

    D'Avolio, Leonard W; Nguyen, Thien M; Goryachev, Sergey; Fiore, Louis D

    2011-01-01

    Despite at least 40 years of promising empirical performance, very few clinical natural language processing (NLP) or information extraction systems currently contribute to medical science or care. The authors address this gap by reducing the need for custom software and rules development with a graphical user interface-driven, highly generalizable approach to concept-level retrieval. A 'learn by example' approach combines features derived from open-source NLP pipelines with open-source machine learning classifiers to automatically and iteratively evaluate top-performing configurations. The Fourth i2b2/VA Shared Task Challenge's concept extraction task provided the data sets and metrics used to evaluate performance. Top F-measure scores for each of the tasks were medical problems (0.83), treatments (0.82), and tests (0.83). Recall lagged precision in all experiments; precision was near or above 0.90 in all tasks. With no customization for the tasks and less than 5 min of end-user time to configure and launch each experiment, the average F-measure was 0.83, one point behind the mean F-measure of the 22 entrants in the competition. Strong precision scores indicate the potential of applying the approach to more specific clinical information extraction tasks. There was not one best configuration, supporting an iterative approach to model creation. Acceptable levels of performance can be achieved using fully automated and generalizable approaches to concept-level information extraction. The described implementation and related documentation is available for download.
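
    The 'learn by example' approach described above, pairing features from open-source NLP pipelines with open-source classifiers and automatically comparing configurations, can be approximated with a scikit-learn grid search; this is an illustrative analogue under those assumptions, not the authors' implementation:

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import GridSearchCV
        from sklearn.pipeline import Pipeline

        # Toy annotated snippets: does the text mention a medical problem?
        texts = ["patient reports chest pain", "given aspirin 81 mg",
                 "history of diabetes mellitus", "ordered chest x-ray"]
        labels = [1, 0, 1, 0]  # 1 = problem concept present

        # Candidate configurations are evaluated automatically, mirroring the
        # iterative search over top-performing setups described in the record.
        pipeline = Pipeline([("features", CountVectorizer()),
                             ("clf", LogisticRegression(max_iter=1000))])
        grid = {"features__ngram_range": [(1, 1), (1, 2)],
                "clf__C": [0.1, 1.0, 10.0]}

        search = GridSearchCV(pipeline, grid, cv=2, scoring="f1")
        search.fit(texts, labels)
        print("Best configuration:", search.best_params_)
        print("Best F1:", search.best_score_)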

  13. Extent of the integration of information communication and ...

    African Journals Online (AJOL)

    Editor

    … manage electronic records and information. Key words: Botswana; ICT/information management in labour organisations.

  14. Approaches to cancer assessment in EPA's Integrated Risk Information System.

    Science.gov (United States)

    Gehlhaus, Martin W; Gift, Jeffrey S; Hogan, Karen A; Kopylev, Leonid; Schlosser, Paul M; Kadry, Abdel-Razak

    2011-07-15

    The U.S. Environmental Protection Agency's (EPA) Integrated Risk Information System (IRIS) Program develops assessments of health effects that may result from chronic exposure to chemicals in the environment. The IRIS database contains more than 540 assessments. When supported by available data, IRIS assessments provide quantitative analyses of carcinogenic effects. Since publication of EPA's 2005 Guidelines for Carcinogen Risk Assessment, IRIS cancer assessments have implemented new approaches recommended in these guidelines and expanded the use of complex scientific methods to perform quantitative dose-response assessments. Two case studies of the application of the mode of action framework from the 2005 Cancer Guidelines are presented in this paper. The first is a case study of 1,2,3-trichloropropane, as an example of a chemical with a mutagenic mode of carcinogenic action thus warranting the application of age-dependent adjustment factors for early-life exposure; the second is a case study of ethylene glycol monobutyl ether, as an example of a chemical with a carcinogenic action consistent with a nonlinear extrapolation approach. The use of physiologically based pharmacokinetic (PBPK) modeling to quantify interindividual variability and account for human parameter uncertainty as part of a quantitative cancer assessment is illustrated using a case study involving probabilistic PBPK modeling for dichloromethane. We also discuss statistical issues in assessing trends and model fit for tumor dose-response data, analysis of the combined risk from multiple types of tumors, and application of life-table methods for using human data to derive cancer risk estimates. These issues reflect the complexity and challenges faced in assessing the carcinogenic risks from exposure to environmental chemicals, and provide a view of the current trends in IRIS carcinogenicity risk assessment. Copyright © 2011. Published by Elsevier Inc.

  16. An open, component-based information infrastructure for integrated health information networks.

    Science.gov (United States)

    Tsiknakis, Manolis; Katehakis, Dimitrios G; Orphanoudakis, Stelios C

    2002-12-18

    A fundamental requirement for achieving continuity of care is the seamless sharing of multimedia clinical information. Different technological approaches can be adopted for enabling the communication and sharing of health record segments. In the context of the emerging global information society, the creation of and access to the integrated electronic health record (I-EHR) of a citizen has been assigned high priority in many countries. This requirement is complementary to an overall requirement for the creation of a health information infrastructure (HII) to support the provision of a variety of health telematics and e-health services. In developing a regional or national HII, the components or building blocks that make up the overall information system ought to be defined and an appropriate component architecture specified. This paper discusses current international priorities and trends in developing the HII. It presents technological challenges and alternative approaches towards the creation of an I-EHR, being the aggregation of health data created during all interactions of an individual with the healthcare system. It also presents results from an ongoing Research and Development (R&D) effort towards the implementation of the HII in HYGEIAnet, the regional health information network of Crete, Greece, using a component-based software engineering approach. Critical design decisions and related trade-offs, involved in the process of component specification and development, are also discussed and the current state of development of an I-EHR service is presented. Finally, Human Computer Interaction (HCI) and security issues, which are important for the deployment and use of any I-EHR service, are considered.

  17. Knowledge-Intensive Gathering and Integration of Statistical Information on European Fisheries

    NARCIS (Netherlands)

    Klinkert, M.; Treur, J.; Verwaart, T.; Loganantharaj, R.; Palm, G.; Ali, M.

    2000-01-01

    Gathering, maintenance, integration and presentation of statistics are major activities of the Dutch Agricultural Economics Research Institute LEI. In this paper we explore how knowledge and agent technology can be exploited to support the information gathering and integration process. In

  18. 76 FR 17145 - Agency Information Collection Activities: Business Transformation-Automated Integrated Operating...

    Science.gov (United States)

    2011-03-28

    ... Collection Activities: Business Transformation--Automated Integrated Operating Environment (IOE), New... through efforts like USCIS' Business Transformation initiative. The IOE will be implemented by USCIS and... information collection. (2) Title of the Form/Collection: Business Transformation-- Automated Integrated...

  19. The Feature Extraction Based on Texture Image Information for Emotion Sensing in Speech

    Directory of Open Access Journals (Sweden)

    Kun-Ching Wang

    2014-09-01

    Full Text Available In this paper, we present a novel texture image feature for Emotion Sensing in Speech (ESS). This idea is based on the fact that texture images carry emotion-related information. The feature extraction is derived from the time-frequency representation of spectrogram images. First, we transform the spectrogram into a recognizable image. Next, we use a cubic curve to enhance the image contrast. Then, the texture image information (TII) derived from the spectrogram image is extracted using Laws’ masks to characterize the emotional state. To evaluate the effectiveness of the proposed emotion recognition in different languages, we use two open emotional databases, the Berlin Emotional Speech Database (EMO-DB) and the eNTERFACE corpus, and one self-recorded database (KHUSC-EmoDB), to evaluate cross-corpus performance. The results of the proposed ESS system are presented using a support vector machine (SVM) as the classifier. Experimental results show that the proposed TII-based feature extraction, inspired by visual perception, provides significant classification performance for ESS systems. The two-dimensional (2-D) TII feature can discriminate between different emotions beyond what pitch and formant tracks convey. In addition, de-noising in 2-D images is more easily accomplished than de-noising in 1-D speech.
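
    A minimal sketch of the Laws'-masks step described above, assuming the spectrogram has already been rendered as a 2-D grayscale array; the mask set and the mean-absolute-energy statistic are illustrative choices, not the authors' exact configuration:

        # Texture-energy features from a spectrogram image using Laws' masks.
        import numpy as np
        from scipy.signal import convolve2d

        # 1-D Laws' vectors: Level, Edge, Spot, Ripple
        L5 = np.array([ 1,  4, 6,  4,  1], dtype=float)
        E5 = np.array([-1, -2, 0,  2,  1], dtype=float)
        S5 = np.array([-1,  0, 2,  0, -1], dtype=float)
        R5 = np.array([ 1, -4, 6, -4,  1], dtype=float)

        def laws_features(image, vectors=(L5, E5, S5, R5)):
            """Mean texture energy for every 2-D Laws' mask (outer-product pair)."""
            feats = []
            for u in vectors:
                for v in vectors:
                    mask = np.outer(u, v)  # 5x5 2-D mask
                    response = convolve2d(image, mask, mode='same', boundary='symm')
                    feats.append(np.abs(response).mean())  # texture-energy statistic
            return np.array(feats)  # 16-dimensional feature vector

        spectrogram = np.random.rand(128, 256)   # stand-in for a real spectrogram image
        print(laws_features(spectrogram).shape)  # (16,)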

  20. A cascade of classifiers for extracting medication information from discharge summaries

    Directory of Open Access Journals (Sweden)

    Halgrim Scott

    2011-07-01

    Full Text Available Abstract Background Extracting medication information from clinical records has many potential applications, and recently published research, systems, and competitions reflect an interest therein. Much of the early extraction work involved rules and lexicons, but more recently machine learning has been applied to the task. Methods We present a hybrid system consisting of two parts. The first part, field detection, uses a cascade of statistical classifiers to identify medication-related named entities. The second part uses simple heuristics to link those entities into medication events. Results The system achieved performance that is comparable to other approaches to the same task. This performance is further improved by adding features that reference external medication name lists. Conclusions This study demonstrates that our hybrid approach outperforms purely statistical or rule-based systems. The study also shows that a cascade of classifiers works better than a single classifier in extracting medication information. The system is available as is upon request from the first author.
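
    A minimal sketch of the cascade idea with placeholder features, labels, and a generic classifier (the paper's actual features, models, and label set are not reproduced here): stage one screens tokens for medication relevance, and stage two assigns a fine-grained field label only to tokens that pass.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 10))    # token feature vectors (placeholder)
        is_med = rng.integers(0, 2, 500)  # stage-1 labels: medication-related?
        field = rng.integers(0, 3, 500)   # stage-2 labels: name/dose/frequency

        stage1 = LogisticRegression().fit(X, is_med)                       # coarse filter
        stage2 = LogisticRegression().fit(X[is_med == 1], field[is_med == 1])

        def cascade_predict(x):
            """Stage 1 screens a token; stage 2 labels it only if it passes."""
            if stage1.predict(x.reshape(1, -1))[0] == 0:
                return "other"
            return ["name", "dose", "frequency"][stage2.predict(x.reshape(1, -1))[0]]

        print(cascade_predict(X[0]))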

  1. Three-dimensional information extraction from GaoFen-1 satellite images for landslide monitoring

    Science.gov (United States)

    Wang, Shixin; Yang, Baolin; Zhou, Yi; Wang, Futao; Zhang, Rui; Zhao, Qing

    2018-05-01

    To more efficiently use GaoFen-1 (GF-1) satellite images for landslide emergency monitoring, a Digital Surface Model (DSM) can be generated from GF-1 across-track stereo image pairs to build a terrain dataset. This study proposes a landslide 3D information extraction method based on the terrain changes of slope objects. The slope objects are merged from segmented image objects with similar aspects, and the terrain changes are calculated from the post-disaster Digital Elevation Model (DEM) derived from GF-1 and the pre-disaster DEM from GDEM V2. A high mountain landslide that occurred in Wenchuan County, Sichuan Province, is used to conduct a 3D information extraction test. The extracted total area of the landslide is 22.58 ha; the displaced earth volume is 652,100 m^3; and the average sliding direction is 263.83°. Their accuracies are 0.89, 0.87, and 0.95, respectively. Thus, the proposed method expands the application of GF-1 satellite images to the field of landslide emergency monitoring.
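
    A toy sketch of the terrain-change computation behind these figures, using synthetic pre- and post-event DEM arrays; the cell size, change threshold, aspect convention, and the circular mean for the sliding direction are assumptions for illustration:

        import numpy as np

        rng = np.random.default_rng(1)
        dem_pre = rng.normal(1000.0, 5.0, size=(200, 200))  # pre-disaster DEM (GDEM V2 role)
        dem_post = dem_pre.copy()                           # post-disaster DEM (GF-1 role)
        dem_post[80:120, 90:140] -= 15.0                    # synthetic slide scar

        cell_area = 10.0 * 10.0                 # m^2 per cell (assumed resolution)
        dz = dem_post - dem_pre                 # terrain change per cell (m)
        slide = np.abs(dz) > 2.0                # crude change mask (threshold assumed)

        volume_m3 = np.abs(dz[slide]).sum() * cell_area
        area_ha = slide.sum() * cell_area / 10_000.0

        # Sliding direction: circular mean of downslope aspect over the mask
        gy, gx = np.gradient(dem_post)
        aspect = np.arctan2(-gx, gy)            # aspect convention assumed
        mean_dir = np.degrees(np.arctan2(np.sin(aspect[slide]).mean(),
                                         np.cos(aspect[slide]).mean())) % 360.0
        print(f"area {area_ha:.2f} ha, volume {volume_m3:.0f} m^3, direction {mean_dir:.1f} deg")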

  2. How Does Alkali Aid Protein Extraction in Green Tea Leaf Residue: A Basis for Integrated Biorefinery of Leaves

    Science.gov (United States)

    Zhang, Chen; Sanders, Johan P. M.; Xiao, Ting T.; Bruins, Marieke E.

    2015-01-01

    Leaf protein can be obtained cost-efficiently by alkaline extraction, but overuse of chemicals and the low quality of the (denatured) protein limit its application. The research objective was to investigate how alkali aids protein extraction of green tea leaf residue, and to use these results for further improvements in alkaline protein biorefinery. Protein extraction yield was studied for its correlation to the morphology of the leaf tissue structure, protein solubility and degree of hydrolysis, and the yields of non-protein components obtained at various conditions. Alkaline protein extraction was not facilitated by increased solubility or hydrolysis of protein, but was positively correlated to leaf tissue disruption. HG pectin, RGII pectin, and organic acids were extracted before the protein, followed by the extraction of cellulose and hemicellulose. The yields of RGI pectin and lignin were both linearly related to protein yield, reaching 80% and 25%, respectively, when 95% of the protein was extracted, which indicates that RGI pectin is more likely to be the key limitation to leaf protein extraction. An integrated biorefinery was designed based on these results. PMID:26200774

  3. How Does Alkali Aid Protein Extraction in Green Tea Leaf Residue: A Basis for Integrated Biorefinery of Leaves.

    Directory of Open Access Journals (Sweden)

    Chen Zhang

    Full Text Available Leaf protein can be obtained cost-efficiently by alkaline extraction, but overuse of chemicals and the low quality of the (denatured) protein limit its application. The research objective was to investigate how alkali aids protein extraction of green tea leaf residue, and to use these results for further improvements in alkaline protein biorefinery. Protein extraction yield was studied for its correlation to the morphology of the leaf tissue structure, protein solubility and degree of hydrolysis, and the yields of non-protein components obtained at various conditions. Alkaline protein extraction was not facilitated by increased solubility or hydrolysis of protein, but was positively correlated to leaf tissue disruption. HG pectin, RGII pectin, and organic acids were extracted before the protein, followed by the extraction of cellulose and hemicellulose. The yields of RGI pectin and lignin were both linearly related to protein yield, reaching 80% and 25%, respectively, when 95% of the protein was extracted, which indicates that RGI pectin is more likely to be the key limitation to leaf protein extraction. An integrated biorefinery was designed based on these results.

  4. The Effect of Information Security Management on Organizational Processes Integration in Supply Chain

    OpenAIRE

    Mohsen Shafiei Nikabadi; Ahmad Jafarian; Azam Jalili Bolhasani

    2012-01-01

    The major purpose of this article was to examine how information security management affects supply chain integration, and the effect of implementing an "information security management system" on enhancing supply chain integration. In this respect, the current research sought a combined overview of these two approaches (Information Security Management and Organizational Processes Integration by Enterprise Resources Planning System) and after that determined the factors of these two import...

  5. Integrating Semantic Information in Metadata Descriptions for a Geoscience-wide Resource Inventory.

    Science.gov (United States)

    Zaslavsky, I.; Richard, S. M.; Gupta, A.; Valentine, D.; Whitenack, T.; Ozyurt, I. B.; Grethe, J. S.; Schachne, A.

    2016-12-01

    Integrating semantic information into legacy metadata catalogs is a challenging issue and so far has been mostly done on a limited scale. We present experience of CINERGI (Community Inventory of Earthcube Resources for Geoscience Interoperability), an NSF Earthcube Building Block project, in creating a large cross-disciplinary catalog of geoscience information resources to enable cross-domain discovery. The project developed a pipeline for automatically augmenting resource metadata, in particular generating keywords that describe metadata documents harvested from multiple geoscience information repositories or contributed by geoscientists through various channels including surveys and domain resource inventories. The pipeline examines available metadata descriptions using text parsing, vocabulary management and semantic annotation and graph navigation services of GeoSciGraph. GeoSciGraph, in turn, relies on a large cross-domain ontology of geoscience terms, which bridges several independently developed ontologies or taxonomies including SWEET, ENVO, YAGO, GeoSciML, GCMD, SWO, and CHEBI. The ontology content enables automatic extraction of keywords reflecting science domains, equipment used, geospatial features, measured properties, methods, processes, etc. We specifically focus on issues of cross-domain geoscience ontology creation, resolving several types of semantic conflicts among component ontologies or vocabularies, and constructing and managing facets for improved data discovery and navigation. The ontology and keyword generation rules are iteratively improved as pipeline results are presented to data managers for selective manual curation via a CINERGI Annotator user interface. We present lessons learned from applying CINERGI metadata augmentation pipeline to a number of federal agency and academic data registries, in the context of several use cases that require data discovery and integration across multiple earth science data catalogs of varying quality

  6. Integrated use of spatial and semantic relationships for extracting road networks from floating car data

    Science.gov (United States)

    Li, Jun; Qin, Qiming; Xie, Chao; Zhao, Yue

    2012-10-01

    The update frequency of digital road maps influences the quality of road-dependent services. However, digital road maps surveyed by probe vehicles or extracted from remotely sensed images still have a long updating cycle, and their cost remains high. With GPS and wireless communication technology maturing and their costs decreasing, floating car technology has been used in traffic monitoring and management, and the dynamic positioning data from floating cars have become a new data source for updating road maps. In this paper, we aim to update digital road maps using the floating car data from China's National Commercial Vehicle Monitoring Platform, and present an incremental road network extraction method suited to the platform's GPS data, which have a low sampling frequency and cover a large area. Based on both the spatial and semantic relationships between a trajectory point and its associated road segment, the method classifies each trajectory point and then merges it into the candidate road network through an adding or modifying process according to its type. The road network is gradually updated until all trajectories have been processed. Finally, the method is applied to the updating of major roads in North China, and the experimental results reveal that it can accurately derive the geometric information of roads in various scenes. This paper provides a highly efficient, low-cost approach to updating digital road maps.
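
    A highly simplified sketch of the point-classification step, using shapely and a pure distance rule; the real method also weighs semantic relations (e.g., heading agreement with the associated segment), and the geometries and threshold below are illustrative:

        from shapely.geometry import LineString, Point

        road = LineString([(0, 0), (100, 0)])   # one candidate road segment
        threshold = 15.0                        # max matching distance (assumed units)

        for p in [Point(5, 3), Point(40, -2), Point(60, 40)]:
            d = road.distance(p)
            if d <= threshold:
                # Point supports an existing segment -> modifying process
                print(f"{p.wkt}: matched (d={d:.1f}), refine segment geometry")
            else:
                # Point lies off the known network -> adding process
                print(f"{p.wkt}: unmatched (d={d:.1f}), buffer as candidate new road")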

  7. TechIP: A Methodology for Emerging Information Technology Insertion & Integration

    National Research Council Canada - National Science Library

    Patel, Has

    2004-01-01

    ...) processing and software agents. To implement these requirements, the system designers are required to insert, integrate and manage proven advances in Emerging Information Technology (EIT) into the...

  8. THE EXTRACTION OF INDOOR BUILDING INFORMATION FROM BIM TO OGC INDOORGML

    Directory of Open Access Journals (Sweden)

    T.-A. Teo

    2017-07-01

    Full Text Available Indoor Spatial Data Infrastructure (indoor-SDI) is an important SDI for geospatial analysis and location-based services. Building Information Model (BIM) has a high degree of detail in geometric and semantic information for buildings. This study proposes direct conversion schemes to extract indoor building information from BIM to OGC IndoorGML. The major steps of the research include (1) topological conversion from the building model into an indoor network model; and (2) generation of IndoorGML. The topological conversion is the major process of generating and mapping nodes and edges from IFC to IndoorGML. A node represents every space (e.g., IfcSpace) and object (e.g., IfcDoor) in the building, while an edge shows the relationships between nodes. According to the definition of IndoorGML, the topological model in the dual space is also represented as a set of nodes and edges. These definitions of IndoorGML are the same as in the indoor network. Therefore, we can extract the necessary data from the indoor network and easily convert them into IndoorGML based on the IndoorGML schema. The experiment utilized a real BIM model to examine the proposed method. The experimental results indicated that the 3D indoor model (i.e., the IndoorGML model) can be automatically derived from the IFC model by the proposed procedure. In addition, the geometry and attributes of building elements are completely and correctly converted from BIM to indoor-SDI.
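
    A minimal sketch of the node/edge construction in the dual space, with hard-coded stand-ins for entities that would be parsed from IFC (e.g., with ifcopenshell); serialization of the resulting graph to IndoorGML is left out:

        import networkx as nx

        G = nx.Graph()
        for space in ["Room101", "Room102", "Corridor1"]:   # IfcSpace stand-ins
            G.add_node(space, kind="IfcSpace")
        for door in ["DoorA", "DoorB"]:                     # IfcDoor stand-ins
            G.add_node(door, kind="IfcDoor")

        # Edges encode connectivity between spaces through doors
        G.add_edge("Room101", "DoorA"); G.add_edge("DoorA", "Corridor1")
        G.add_edge("Room102", "DoorB"); G.add_edge("DoorB", "Corridor1")

        # The node/edge sets map onto IndoorGML's dual-space topology model
        print(nx.shortest_path(G, "Room101", "Room102"))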

  9. Methods from Information Extraction from LIDAR Intensity Data and Multispectral LIDAR Technology

    Science.gov (United States)

    Scaioni, M.; Höfle, B.; Baungarten Kersting, A. P.; Barazzetti, L.; Previtali, M.; Wujanz, D.

    2018-04-01

    LiDAR is a consolidated technology for topographic mapping and 3D reconstruction, which is implemented in several platforms. On the other hand, the exploitation of the geometric information has been coupled with the use of laser intensity, which may provide additional data for multiple purposes. This option has been emphasized by the availability of sensors working at different wavelengths, thus able to provide additional information for the classification of surfaces and objects. Several applications of monochromatic and multi-spectral LiDAR data have already been developed in different fields: geosciences, agriculture, forestry, building and cultural heritage. The use of intensity data to extract measures of point cloud quality has also been developed. The paper gives an overview of the state of the art of these techniques and presents the modern technologies for the acquisition of multispectral LiDAR data. In addition, the ISPRS WG III/5 on 'Information Extraction from LiDAR Intensity Data' has collected and made available a few open data sets to support scholars doing research in this field. This service is presented and the data sets delivered so far are described.

  10. The effect of informed consent on stress levels associated with extraction of impacted mandibular third molars.

    Science.gov (United States)

    Casap, Nardy; Alterman, Michael; Sharon, Guy; Samuni, Yuval

    2008-05-01

    To evaluate the effect of informed consent on stress levels associated with removal of impacted mandibular third molars. A total of 60 patients scheduled for extraction of impacted mandibular third molars participated in this study. The patients were unaware of the study's objectives. Data from 20 patients established the baseline levels of electrodermal activity (EDA). The remaining 40 patients were randomly assigned to 2 equal groups receiving either a detailed informed consent document, disclosing the possible risks involved with the surgery, or a simplified version. Pulse, blood pressure, and EDA were monitored before, during, and after completion of the consent document. Changes in EDA, but not in blood pressure, were measured on completion of either version of the consent document. A greater increase in EDA was associated with the detailed version of the consent document (P = .004). A similar concomitant (although nonsignificant) increase in pulse values was observed on completion of both versions. Completion of an overdisclosed informed consent document is thus associated with changes in physiological parameters. The results suggest that overdetailed listing and disclosure of risks before extraction of impacted mandibular third molars can increase patient stress.

  11. METHODS FROM INFORMATION EXTRACTION FROM LIDAR INTENSITY DATA AND MULTISPECTRAL LIDAR TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    M. Scaioni

    2018-04-01

    Full Text Available LiDAR is a consolidated technology for topographic mapping and 3D reconstruction, which is implemented in several platforms. On the other hand, the exploitation of the geometric information has been coupled with the use of laser intensity, which may provide additional data for multiple purposes. This option has been emphasized by the availability of sensors working at different wavelengths, thus able to provide additional information for the classification of surfaces and objects. Several applications of monochromatic and multi-spectral LiDAR data have already been developed in different fields: geosciences, agriculture, forestry, building and cultural heritage. The use of intensity data to extract measures of point cloud quality has also been developed. The paper gives an overview of the state of the art of these techniques and presents the modern technologies for the acquisition of multispectral LiDAR data. In addition, the ISPRS WG III/5 on ‘Information Extraction from LiDAR Intensity Data’ has collected and made available a few open data sets to support scholars doing research in this field. This service is presented and the data sets delivered so far are described.

  12. Risk Informed Design Using Integrated Vehicle Rapid Assessment Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — A successful proof of concept was performed in FY 2012 integrating the Envision tool for parametric estimates of vehicle mass and the Rapid Response Risk Assessment...

  13. Multi-Sensor Information Integration and Automatic Understanding

    National Research Council Canada - National Science Library

    Welborn, Matthew

    2008-01-01

    ... anomalous behavior characteristics. The SIG team attended the PI Gathering at ONR in May 2008 and presented our current results as well as providing a demonstration of the integrated software behavior detection application...

  14. Information Exchange Architecture for Integrating Unmanned Vehicles into Maritime Missions

    National Research Council Canada - National Science Library

    Woolsey, Aaron

    2004-01-01

    .... The focus of this study is to analyze the structure of information flow for unmanned systems and suggest an exchange architecture to successfully inform and build decision maker understanding based...

  15. [Research on medical instrument information integration technology based on IHE PCD].

    Science.gov (United States)

    Zheng, Jianli; Liao, Yun; Yang, Yongyong

    2014-06-01

    Integrating medical instruments with medical information systems is becoming more and more important in the healthcare industry. To give medical instruments without a standard communication interface the capability to interoperate and share information with medical information systems, we developed a medical instrument integration gateway based on the Integrating the Healthcare Enterprise Patient Care Device (IHE PCD) integration profiles. The core component is an integration engine implemented according to the integration profiles and Health Level Seven (HL7) messages defined in IHE PCD. Working with instrument-specific JavaScript scripts, the engine transforms medical instrument data into HL7 ORU messages. This research enables medical instruments to interoperate and exchange medical data with information systems in a standardized way, and is valuable for medical instrument integration, especially for traditional instruments.
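
    For illustration, a pared-down sketch of wrapping one device reading in a pipe-delimited HL7 v2 ORU^R01 message, the message family IHE PCD builds on; the segment fields are simplified stand-ins and do not cover the full PCD-01 profile:

        def device_reading_to_oru(patient_id, obs_id, value, unit):
            """Build a minimal ORU^R01 message for one numeric observation."""
            segments = [
                "MSH|^~\\&|GATEWAY|WARD1|HIS|HOSP|20140601120000||ORU^R01|MSG0001|P|2.6",
                f"PID|||{patient_id}",
                "OBR|1|||MONITORING",
                f"OBX|1|NM|{obs_id}||{value}|{unit}|||||F",
            ]
            return "\r".join(segments)  # HL7 v2 separates segments with carriage returns

        print(device_reading_to_oru("12345", "HR^HeartRate", 72, "bpm"))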

  16. Integration of ceramic membrane and compressed air-assisted solvent extraction (CASX) for metal recovery.

    Science.gov (United States)

    Li, Chi-Wang; Chiu, Chun-Hao; Lee, Yu-Cheng; Chang, Chia-Hao; Lee, Yu-Hsun; Chen, Yi-Ming

    2010-01-01

    In our previous publications, the compressed air-assisted solvent extraction process (CASX) was developed and proven to be a kinetically efficient process for metal removal. In the current study, CASX, with a ceramic MF membrane integrated for separation of the spent solvent, was employed to remove and recover metal from wastewater. The MF was operated either in crossflow mode or in dead-end mode with intermittent flushing. Under crossflow mode, three distinct stages in the flux vs. TMP (trans-membrane pressure) relationship were observed. In the first stage, flux increases with increasing TMP, followed by a stage of stable flux with increasing TMP. After reaching a threshold TMP, which depends on the crossflow velocity, flux increases again with increasing TMP; at this last stage, solvent is pushed through the membrane pores, as indicated by increasing permeate COD. In dead-end mode, an intermittent flushing flow (2 min after a 10-min or a 30-min dead-end filtration) was incorporated to reduce membrane fouling by flushing out MSAB accumulated on the membrane surface. The effects of solvent concentration and composition were also investigated. Solvent concentrations ranging from 0.1 to 1% (w/w) had no adverse effect in terms of membrane fouling. However, the solvent composition, i.e., the D(2)EHPA/kerosene ratio, had an impact on membrane fouling. The type of metal extractant employed in CASX had a significant impact on both membrane fouling and the quality of the filtrate, due to differences in viscosity and water solubility. Separation of MSAB was the limiting process controlling metal removal efficiency, and the removal efficiencies of Cd(II) and Cr(VI) followed the same trend as that for COD.

  17. About increasing informativity of diagnostic system of asynchronous electric motor by extracting additional information from values of consumed current parameter

    Science.gov (United States)

    Zhukovskiy, Y.; Korolev, N.; Koteleva, N.

    2018-05-01

    This article is devoted to expanding the possibilities for assessing the technical state of asynchronous electric drives from their current consumption, and to increasing the information capacity of diagnostic methods under conditions of limited access to equipment and incomplete information. The method of spectral analysis of the electric drive current can be supplemented by an analysis of the components of the current Park's vector. The evolution of the hodograph at the moment of appearance and during the development of defects was studied using the example of current asymmetry in the phases of an induction motor. The result of the study is a set of new diagnostic parameters for the asynchronous electric drive. The research proved that the proposed diagnostic parameters allow the type and level of a defect to be determined, without the need to stop the equipment and take it out of service for repair. Modern digital control and monitoring systems can use the proposed parameters, based on the stator current of the electrical machine, to improve the accuracy and reliability of obtaining diagnostic patterns and predicting their changes, in order to improve equipment maintenance systems. This approach can also be used in systems and objects subject to significant parasitic vibrations and unsteady loads. The extraction of useful information can be carried out in electric drive systems whose structure includes a power electric converter.
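
    A sketch of the Park's vector construction on synthetic three-phase stator currents; the transformation itself is the standard one used in current-signature diagnostics, while the simulated asymmetry fault and the simple "distortion" statistic are illustrative:

        import numpy as np

        t = np.linspace(0.0, 0.1, 5000)
        w = 2 * np.pi * 50                        # 50 Hz supply
        ia = 0.8 * np.cos(w * t)                  # reduced amplitude -> phase asymmetry
        ib = 1.0 * np.cos(w * t - 2 * np.pi / 3)
        ic = 1.0 * np.cos(w * t + 2 * np.pi / 3)

        # Park's vector components from the stator currents
        i_d = np.sqrt(2.0 / 3.0) * ia - ib / np.sqrt(6.0) - ic / np.sqrt(6.0)
        i_q = (ib - ic) / np.sqrt(2.0)

        # A healthy machine traces a circular hodograph (constant radius);
        # asymmetry deforms it into an ellipse, which this ratio detects.
        r = np.hypot(i_d, i_q)
        print(f"hodograph distortion: {(r.max() - r.min()) / r.max():.2f}")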

  18. The Effect of Information Security Management on Organizational Processes Integration in Supply Chain

    Directory of Open Access Journals (Sweden)

    Mohsen Shafiei Nikabadi

    2012-03-01

    Full Text Available The major purpose of this article was to examine how information security management affects supply chain integration, and the effect of implementing an "information security management system" on enhancing supply chain integration. In this respect, the current research sought a combined overview of these two approaches (Information Security Management and Organizational Processes Integration by Enterprise Resources Planning System) and then determined the factors of these two important issues by factor analysis. The researchers drew on a series of opinions from automotive experts (production planning and management experts, and supply chain experts from carmakers and first- and second-tier suppliers in the supply chain industry). On this basis, the impact of information security management on the integration of enterprise supply chain processes was examined with the help of statistical correlation analysis. The results indicated that the various dimensions of the information security management system (coordination of information, prevention of human and hardware errors, accuracy of information, and user education) affect both the internal and external integration of business processes in the supply chain, and can ultimately increase that integration. In conclusion, these results show that deployment of an information security management system increases the integration of organizational processes in the supply chain. This can be demonstrated by considering the relation of organizational process integration with the level of information coordination, error prevention, and information accuracy throughout the supply chain.

  19. The Need for Integration of Information and Communication ...

    African Journals Online (AJOL)

    Information and Communication Technology (ICT) is a major factor in shaping a new global economy and producing rapid changes in society. In order to function in this new world economy, students and their teachers have to learn to deal with large amount of information. This entails the analysis of such information and ...

  20. Integrity and dissemination control in administrative applications through information designators

    NARCIS (Netherlands)

    Teepe, W.G.

    As more and more information sources are linked, it seems to become ever easier to track individuals in ways that are not deemed appropriate. However, increased linking of information need not imply increased dissemination of privacy-sensitive information. We present a new

  1. Towards a digitized and integrated health information system in ...

    African Journals Online (AJOL)

    Background: A strong health information system able to generate timely and accurate information is essential to ensure effective and efficient performance. Sudan's health information system is still paper-based and characterized by fragmentation and verticality. Efforts to overcome this have led to development of an ...

  2. Multi-Paradigm and Multi-Lingual Information Extraction as Support for Medical Web Labelling Authorities

    Directory of Open Access Journals (Sweden)

    Martin Labsky

    2010-10-01

    Full Text Available Until recently, quality labelling of medical web content has been a predominantly manual activity. However, advances in automated text processing have opened the way to computerised support of this activity. The core enabling technology is information extraction (IE). However, the heterogeneity of websites offering medical content imposes particular requirements on the IE techniques to be applied. In the paper we discuss these requirements and describe a multi-paradigm approach to IE that addresses them. Experiments on multi-lingual data are reported. The research has been carried out within the EU MedIEQ project.

  3. Scholarly Information Extraction Is Going to Make a Quantum Leap with PubMed Central (PMC).

    Science.gov (United States)

    Matthies, Franz; Hahn, Udo

    2017-01-01

    With the increasing availability of complete full texts (journal articles), rather than their surrogates (titles, abstracts), as resources for text analytics, entirely new opportunities arise for information extraction and text mining from scholarly publications. Yet, we gathered evidence that a range of problems are encountered for full-text processing when biomedical text analytics simply reuse existing NLP pipelines which were developed on the basis of abstracts (rather than full texts). We conducted experiments with four different relation extraction engines, all of which were top performers in previous BioNLP Event Extraction Challenges. We found that abstract-trained engines lose up to 6.6% in F-score when run on full-text data. Hence, the reuse of existing abstract-based NLP software in a full-text scenario is considered harmful because of heavy performance losses. Given the current lack of annotated full-text resources to train on, our study quantifies the price paid for this short cut.

  4. Ensuring the integrity of information resources based on methods of two-character structural data encoding

    Directory of Open Access Journals (Sweden)

    О.К. Юдін

    2009-01-01

    Full Text Available Methods are developed for estimating the noise stability and for correcting structural code constructions against distortion during data communication in information and communication systems and networks, taking into account the need to preserve the integrity of the information resource.

  5. Tuberculosis Biomarker Extraction and Isothermal Amplification in an Integrated Diagnostic Device.

    Directory of Open Access Journals (Sweden)

    Amy Creecy

    Full Text Available In this study, we integrated magnetic bead-based sample preparation and isothermal loop-mediated amplification (LAMP) of TB in a single tube. Surrogate sputum samples produced by the Program for Appropriate Technology in Health containing inactivated TB bacteria were used to test the diagnostic. In order to test the sample preparation method, samples were lysed, and DNA was manually extracted and eluted into water in the tube. In a thermal cycler, LAMP amplified TB DNA from 10^3 TB cells/mL of sputum at 53.5 ± 3.3 minutes, 10^4 cells/mL at 46.3 ± 2.2 minutes, and 10^5 cells/mL at 41.6 ± 1.9 minutes. Negative control samples did not amplify. Next, sample preparation was combined with in-tubing isothermal LAMP amplification by replacing the water elution chamber with a LAMP reaction chamber. In this intermediate configuration, LAMP amplified 10^3 cells/mL at 74 ± 10 minutes, 10^4 cells/mL at 60 ± 9 minutes, and 10^5 TB cells/mL of sputum at 54 ± 9 minutes. Two of three negative controls did not amplify; one amplified at 100 minutes. In the semi-automated system, DNA was eluted directly into an isothermal reaction solution containing the faster OptiGene DNA polymerase. The lowest surrogate sputum concentration, 10^3 TB cells/mL, amplified at 52.8 ± 3.3 minutes, 10^4 cells/mL at 45.4 ± 11.3 minutes, and 10^5 cells/mL at 31.8 ± 2.9 minutes. TB-negative samples amplified at 66.4 ± 7.4 minutes. This study demonstrated the feasibility of a single-tube design integrating sample preparation and isothermal amplification, which with further development could be useful for point-of-care applications, particularly in low-resource settings.

  6. Obtaining bixin from semi-defatted annatto seeds by a mechanical method and solvent extraction: Process integration and economic evaluation.

    Science.gov (United States)

    Alcázar-Alay, Sylvia C; Osorio-Tobón, J Felipe; Forster-Carneiro, Tânia; Meireles, M Angela A

    2017-09-01

    This work involves the application of physical separation methods to concentrate the pigment of semi-defatted annatto seeds, a noble vegetal biomass rich in bixin pigments. Semi-defatted annatto seeds are the residue produced after extraction of the lipid fraction from annatto seeds using supercritical fluid extraction (SFE). Semi-defatted annatto seeds are used in this work for three important reasons: i) prior lipid extraction is necessary to recover the tocotrienol-rich oil present in the annatto seeds, ii) initial removal of the oil via the SFE process favors bixin separation, and iii) the cost of the raw material is null. Physical methods including i) the mechanical fractionation method and ii) an integrated process of mechanical fractionation and low-pressure solvent extraction (LPSE) were studied. The integrated process was proposed for processing two different semi-defatted annatto materials, denoted Batches 1 and 2. The cost of manufacture (COM) was calculated for two different production scales (5 and 50 L), comparing the integrated process with the mechanical fractionation method alone. The integrated process showed a significantly higher COM than the mechanical fractionation method. This work suggests that the mechanical fractionation method is an adequate, low-cost process for obtaining a pigment-rich product from semi-defatted annatto seeds. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Accurate facade feature extraction method for buildings from three-dimensional point cloud data considering structural information

    Science.gov (United States)

    Wang, Yongzhi; Ma, Yuqing; Zhu, A.-xing; Zhao, Hui; Liao, Lixia

    2018-05-01

    Facade features represent segmentations of building surfaces and can serve as a building framework. Extracting facade features from three-dimensional (3D) point cloud data (3D PCD) is an efficient approach to 3D building modeling. By combining the advantages of 3D PCD and two-dimensional optical images, this study describes a highly accurate method for extracting building facade features from 3D PCD with a focus on structural information. The new extraction method involves three major steps: image feature extraction, exploration of the mapping between the image features and the 3D PCD, and optimization of the initial 3D PCD facade features using structural information. Results show that the new method extracts the 3D PCD facade features of buildings more accurately and continuously, and the method is validated in a case study. In addition, its effectiveness is demonstrated by comparison with the range-image and optical-image extraction methods, which do not use structural information. The 3D PCD facade features extracted by the new method can be applied in many fields, such as 3D building modeling and building information modeling.

  8. Extracting Low-Frequency Information from Time Attenuation in Elastic Waveform Inversion

    Science.gov (United States)

    Guo, Xuebao; Liu, Hong; Shi, Ying; Wang, Weihong

    2017-03-01

    Low-frequency information is crucial for recovering the background velocity, but the lack of low-frequency information in field data makes inversion impractical without accurate initial models. Laplace-Fourier domain waveform inversion can recover a smooth model from real data without low-frequency information, which can then be used as an ideal starting model for subsequent inversion. In general, it also starts with low frequencies and includes higher frequencies at later inversion stages; the difference is that its ultralow-frequency information comes from the Laplace-Fourier domain. Meanwhile, a direct implementation of the Laplace-transformed wavefield using frequency-domain inversion is also very convenient. However, because broad frequency bands are often used in pure time-domain waveform inversion, it is difficult to extract the wavefields dominated by low frequencies in this case. In this paper, low-frequency components are constructed by introducing time attenuation into the recorded residuals, and the rest of the method is identical to traditional time-domain inversion. Time windowing and frequency filtering are also applied to mitigate the ambiguity of the inverse problem. Therefore, we can start at low frequencies and then move to higher frequencies. The experiment shows that the proposed method can achieve a good inversion result with a linear initial model and records without low-frequency information.
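
    A toy numerical sketch of the time-attenuation idea: multiplying a residual trace by exp(-alpha*t) shifts usable energy toward the low (Laplace-Fourier-like) frequencies. The trace, sampling, and damping constant below are illustrative, not values from the paper:

        import numpy as np

        dt = 0.002                                   # sample interval (s), assumed
        t = np.arange(2000) * dt
        trace = np.sin(2 * np.pi * 12 * t) * np.exp(-((t - 1.5) ** 2) / 0.01)  # toy residual

        alpha = 4.0                                  # damping constant (1/s), assumed
        damped = trace * np.exp(-alpha * t)          # time attenuation of the residual

        freqs = np.fft.rfftfreq(t.size, dt)
        low = freqs < 3.0                            # "low-frequency" band, assumed
        for name, x in [("raw", trace), ("damped", damped)]:
            spec = np.abs(np.fft.rfft(x))
            print(name, spec[low].sum() / spec.sum())  # low-frequency energy fraction rises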

  9. Audio-Visual Speech Recognition Using Lip Information Extracted from Side-Face Images

    Directory of Open Access Journals (Sweden)

    Koji Iwano

    2007-03-01

    Full Text Available This paper proposes an audio-visual speech recognition method using lip information extracted from side-face images as an attempt to increase noise robustness in mobile environments. Our proposed method assumes that lip images can be captured using a small camera installed in a handset. Two different kinds of lip features, lip-contour geometric features and lip-motion velocity features, are used individually or jointly, in combination with audio features. Phoneme HMMs modeling the audio and visual features are built based on the multistream HMM technique. Experiments conducted using Japanese connected digit speech contaminated with white noise in various SNR conditions show effectiveness of the proposed method. Recognition accuracy is improved by using the visual information in all SNR conditions. These visual features were confirmed to be effective even when the audio HMM was adapted to noise by the MLLR method.

  10. Approaching the largest ‘API’: extracting information from the Internet with Python

    Directory of Open Access Journals (Sweden)

    Jonathan E. Germann

    2018-02-01

    Full Text Available This article explores the need for libraries to algorithmically access and manipulate the world’s largest API: the Internet. The billions of pages of the ‘Internet API’ (HTTP, HTML, CSS, XPath, DOM, etc.) are easily accessible and manipulable. Libraries can assist in creating meaning through the datafication of information on the world wide web. Because most information is created for human consumption, some programming is required for automated extraction. Python is an easy-to-learn programming language with extensive packages and community support for web page automation. Four Python packages (Urllib, Selenium, BeautifulSoup, Scrapy) can automate almost any web page for projects of all sizes. An example warrant data project is explained to illustrate how well Python packages can manipulate web pages to create meaning through assembling custom datasets.
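
    As a flavor of the approach, a minimal Urllib + BeautifulSoup sketch that "datafies" a table on a page into rows of a custom dataset; the URL and selector are placeholders, not the article's warrant-data project:

        from urllib.request import urlopen
        from bs4 import BeautifulSoup

        html = urlopen("https://example.com/records").read()   # placeholder URL
        soup = BeautifulSoup(html, "html.parser")

        rows = []
        for tr in soup.select("table tr"):                     # placeholder selector
            cells = [td.get_text(strip=True) for td in tr.find_all("td")]
            if cells:
                rows.append(cells)                             # one record per table row
        print(rows[:5])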

  11. DEVELOPMENT OF AUTOMATIC EXTRACTION METHOD FOR ROAD UPDATE INFORMATION BASED ON PUBLIC WORK ORDER OUTLOOK

    Science.gov (United States)

    Sekimoto, Yoshihide; Nakajo, Satoru; Minami, Yoshitaka; Yamaguchi, Syohei; Yamada, Harutoshi; Fuse, Takashi

    Recently, the disclosure of statistical data representing the financial effects or burden of public works, through the websites of national and local governments, has enabled discussion of macroscopic financial trends. However, it is still difficult to grasp on a nationwide basis how each spot was changed by public works. In this research, our purpose is to efficiently collect the road update information provided by various road managers, in order to realize efficient updating of various maps such as car navigation maps. In particular, we develop a system that automatically extracts the public works concerned and registers a summary, including position information, to a database, working from the public work order outlooks released by each local government and combining several web mining technologies. Finally, we collect and register several tens of thousands of records from websites all over Japan, and confirm the feasibility of our method.

  12. Information Integration and Communication in Plant Growth Regulation.

    Science.gov (United States)

    Chaiwanon, Juthamas; Wang, Wenfei; Zhu, Jia-Ying; Oh, Eunkyoo; Wang, Zhi-Yong

    2016-03-10

    Plants are equipped with the capacity to respond to a large number of diverse signals, both internal ones and those emanating from the environment, that are critical to their survival and adaption as sessile organisms. These signals need to be integrated through highly structured intracellular networks to ensure coherent cellular responses, and in addition, spatiotemporal actions of hormones and peptides both orchestrate local cell differentiation and coordinate growth and physiology over long distances. Further, signal interactions and signaling outputs vary significantly with developmental context. This review discusses our current understanding of the integrated intracellular and intercellular signaling networks that control plant growth. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Integrating public information activities on a technical project

    International Nuclear Information System (INIS)

    Little, Sh. K.; Vecchiola, S.F.

    1984-01-01

    Through gradual evolution and successful performance, the WIPP Communications group has gained respect and recognition as a dual service organization that offers numerous benefits to a technical project. Westinghouse assembled a team that has successfully coordinated and encouraged an exchange of information not only with the public information realm but also as a project service and function. WIPP has combined educational services, external and employee communication and public information into one unit called ''Communications''

  14. Experimentally verified inductance extraction and parameter study for superconductive integrated circuit wires crossing ground plane holes

    International Nuclear Information System (INIS)

    Fourie, Coenrad J; Wetzstein, Olaf; Kunert, Juergen; Meyer, Hans-Georg; Toepfer, Hannes

    2013-01-01

    As the complexity of rapid single flux quantum (RSFQ) circuits increases, both current and power consumption of the circuits become important design criteria. Various new concepts such as inductive biasing for energy efficient RSFQ circuits and inductively coupled RSFQ cells for current recycling have been proposed to overcome increasingly severe design problems. Both of these techniques use ground plane holes to increase the inductance or coupling factor of superconducting integrated circuit wires. New design tools are consequently required to handle the new topographies. One important issue in such circuit design is the accurate calculation of networks of inductances even in the presence of finite holes in the ground plane. We show how a fast network extraction method using InductEx, which is a pre- and post-processor for the magnetoquasistatic field solver FastHenry, is used to calculate the inductances of a set of SQUIDs (superconducting quantum interference devices) with ground plane holes of different sizes. The results are compared to measurements of physical structures fabricated with the IPHT Jena 1 kA cm⁻² RSFQ niobium process to verify accuracy. We then do a parameter study and derive empirical equations for fast and useful estimation of the inductance of wires surrounded by ground plane holes. We also investigate practical circuits and show excellent accuracy. (paper)

  15. THE IMPORTANCE OF THE IMPLEMENTATION OF INTEGRATED INFORMATION SYSTEMS IN THE RESTRUCTURING AND EUROPEAN INTEGRATION PROCESS OF ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Steliac Nela

    2010-12-01

    Full Text Available Many organizations in the public and private sectors in Romania have reached the stage where their existing information systems can no longer meet users' demands. They are therefore compelled to use integrated information systems that can manage all kinds of data, allow access to them, and ensure the coherence and consistency of the stored information. Managers must be aware of the importance of implementing integrated information systems as part of the underlying restructuring of the organization, which can thus become compatible and competitive with its European Union counterparts, making the integration process real and achievable.

  16. Geopositioning with a quadcopter: Extracted feature locations and predicted accuracy without a priori sensor attitude information

    Science.gov (United States)

    Dolloff, John; Hottel, Bryant; Edwards, David; Theiss, Henry; Braun, Aaron

    2017-05-01

    This paper presents an overview of the Full Motion Video-Geopositioning Test Bed (FMV-GTB) developed to investigate algorithm performance and issues related to the registration of motion imagery and subsequent extraction of feature locations along with predicted accuracy. A case study is included corresponding to a video taken from a quadcopter. Registration of the corresponding video frames is performed without the benefit of a priori sensor attitude (pointing) information. In particular, tie points are automatically measured between adjacent frames using standard optical flow matching techniques from computer vision, an a priori estimate of sensor attitude is then computed based on supplied GPS sensor positions contained in the video metadata and a photogrammetric/search-based structure from motion algorithm, and then a Weighted Least Squares adjustment of all a priori metadata across the frames is performed. Extraction of absolute 3D feature locations, including their predicted accuracy based on the principles of rigorous error propagation, is then performed using a subset of the registered frames. Results are compared to known locations (check points) over a test site. Throughout this entire process, no external control information (e.g. surveyed points) is used other than for evaluation of solution errors and corresponding accuracy.
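
    A compact sketch of the tie-point measurement step using pyramidal Lucas-Kanade optical flow in OpenCV; the synthetic frames stand in for decoded video frames, and the downstream a priori attitude estimation and Weighted Least Squares adjustment are not shown:

        import cv2
        import numpy as np

        frame0 = np.random.randint(0, 255, (480, 640), dtype=np.uint8)
        frame1 = np.roll(frame0, 3, axis=1)   # simulate a small camera motion

        # Detect corners in the first frame, then track them into the next frame
        pts0 = cv2.goodFeaturesToTrack(frame0, maxCorners=200,
                                       qualityLevel=0.01, minDistance=8)
        pts1, status, _err = cv2.calcOpticalFlowPyrLK(frame0, frame1, pts0, None)

        # Keep only successfully tracked pairs as tie points between adjacent frames
        tie_points = [(p0.ravel(), p1.ravel())
                      for p0, p1, ok in zip(pts0, pts1, status.ravel()) if ok]
        print(f"{len(tie_points)} tie points available for the adjustment")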

  17. Inexperienced clinicians can extract pathoanatomic information from MRI narrative reports with high reproducibility for use in research/quality assurance

    DEFF Research Database (Denmark)

    Kent, Peter; Briggs, Andrew M; Albert, Hanne Birgit

    2011-01-01

    Background Although reproducibility in reading MRI images amongst radiologists and clinicians has been studied previously, no studies have examined the reproducibility of inexperienced clinicians in extracting pathoanatomic information from magnetic resonance imaging (MRI) narrative reports and t...

  18. [Extraction of buildings three-dimensional information from high-resolution satellite imagery based on Barista software].

    Science.gov (United States)

    Zhang, Pei-feng; Hu, Yuan-man; He, Hong-shi

    2010-05-01

    The demand for accurate and up-to-date spatial information on urban buildings is becoming more and more important for urban planning, environmental protection, and other vocations. Today's commercial high-resolution satellite imagery offers the potential to extract the three-dimensional information of urban buildings. This paper extracted the three-dimensional information of urban buildings from QuickBird imagery, and validated the precision of the extraction based on Barista software. It was shown that the extraction of three-dimensional building information from high-resolution satellite imagery based on Barista software has the advantages of a low demand on professional expertise, broad applicability, simple operation, and high precision. Point positioning and height determination accuracy at the one-pixel level could be achieved if the digital elevation model (DEM) and sensor orientation model had sufficiently high precision and the off-nadir view angle was favorable.

  19. Stock market integration and the speed of information transmission

    Czech Academy of Sciences Publication Activity Database

    Černý, Alexandr

    -, č. 242 (2004), s. 1-25 ISSN 1211-3298 R&D Projects: GA AV ČR KSK8002119; GA ČR GA402/04/0270 Institutional research plan: CEZ:AV0Z7085904 Keywords : stock market integration * market comovement * high-frequency data Subject RIV: AH - Economics

  20. Integration of information on climate, soil and cultivar to increase ...

    African Journals Online (AJOL)

    BH660 shows higher water productivity (9.46 kg mm⁻¹ of rainfall) under 2*MMP tillage than under late plantings in the experimental years. About 84% of the variability in grain yield for BH660, 88% for Bolondie, 76% for A-511 and 70% for Limat can be explained by the available soil water in the crop root zone at planting. Hence, integration of ...

  1. Fast mapping rapidly integrates information into existing memory networks.

    Science.gov (United States)

    Coutanche, Marc N; Thompson-Schill, Sharon L

    2014-12-01

    Successful learning involves integrating new material into existing memory networks. A learning procedure known as fast mapping (FM), thought to simulate the word-learning environment of children, has recently been linked to distinct neuroanatomical substrates in adults. This idea has suggested the (never-before tested) hypothesis that FM may promote rapid incorporation into cortical memory networks. We test this hypothesis here in 2 experiments. In our 1st experiment, we introduced 50 participants to 16 unfamiliar animals and names through FM or explicit encoding (EE) and tested participants on the training day, and again after sleep. Learning through EE produced strong declarative memories, without immediate lexical competition, as expected from slow-consolidation models. Learning through FM, however, led to almost immediate lexical competition, which continued to the next day. Additionally, the learned words began to prime related concepts on the day following FM (but not EE) training. In a 2nd experiment, we replicated the lexical integration results and determined that presenting an already-known item during learning was crucial for rapid integration through FM. The findings presented here indicate that learned items can be integrated into cortical memory networks at an accelerated rate through fast mapping. The retrieval of a related known concept, in order to infer the target of the FM question, is critical for this effect. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  2. developing a one stop shop model for integrated land information

    African Journals Online (AJOL)

    DEPT OF AGRICULTURAL ENGINEERING

    which will integrate the data on land ownership, land use and land value for all the land agen- ... services to the investor and other potential clients of land sector agencies involved in the land ..... account types such as a general user, re-.

  3. Assuring Integrity of Information Utility in Cyber-Learning Formats.

    Science.gov (United States)

    Morrison, James L.; Stein, Linda L.

    1999-01-01

    Describes a cyber-learning project for the World Wide Web developed by faculty and librarians at the University of Delaware that combined discovery learning with problem-based learning to develop critical thinking and quality management for information. Undergraduates were to find, evaluate, and use information to generate an Internet marketing…

  4. Vertical Integration: Corporate Strategy in the Information Industry.

    Science.gov (United States)

    Davenport, Lizzie; Cronin, Blaise

    1986-01-01

    Profiles the corporate strategies of three sectors of the information industry and the trend toward consolidation in electronic publishing. Three companies' acquisitions are examined in detail using qualitative data from information industry columns and interpreting it on the basis of game theory. (EM)

  5. Design of the Hospital Integrated Information Management System Based on Cloud Platform.

    Science.gov (United States)

    Aijing, L; Jin, Y

    2015-12-01

    At present, the outdated information management style cannot meet the needs of hospital management and has become a bottleneck for hospitals' management and development. In order to improve integrated information management, hospitals have increased their investment in integrated information management systems. On account of a lack of reasonable and scientific design, some hospital integrated information management systems suffer from common problems, such as unfriendly interfaces, poor portability and maintainability, low security and efficiency, and a lack of interactivity and information sharing. To solve these problems, this paper carries out the research and design of a hospital information management system based on a cloud platform, which can realize the optimized integration of hospital information resources and reduce costs.

  6. 45 CFR 61.14 - Confidentiality of Healthcare Integrity and Protection Data Bank information.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Confidentiality of Healthcare Integrity and Protection Data Bank information. 61.14 Section 61.14 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION HEALTHCARE INTEGRITY AND PROTECTION DATA BANK FOR FINAL ADVERSE INFORMATION ON...

  7. 45 CFR 61.12 - Requesting information from the Healthcare Integrity and Protection Data Bank.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Requesting information from the Healthcare Integrity and Protection Data Bank. 61.12 Section 61.12 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION HEALTHCARE INTEGRITY AND PROTECTION DATA BANK FOR FINAL ADVERSE INFORMATION...

  8. Integrating information systems : linking global business goals to local database applications

    NARCIS (Netherlands)

    Dignum, F.P.M.; Houben, G.J.P.M.

    1999-01-01

    This paper describes a new approach to designing modern information systems that offer integrated access to the data and knowledge available in local applications. By integrating the local data management activities into one transparent information distribution process, modern organizations

  9. The Integration of the Information and Communication Functions, and the Marketing of the Resulting Products.

    Science.gov (United States)

    Harris, Susan C.

    1985-01-01

    Discusses the theoretical basis for integration of information functions and communication functions, the relevance of this integration in the scientific information cycle, and its positive effect on commodity research networks. The application of this theory is described using three commodity programs of the Centro Internacional de Agricultura…

  10. Overview of image processing tools to extract physical information from JET videos

    Science.gov (United States)

    Craciunescu, T.; Murari, A.; Gelfusa, M.; Tiseanu, I.; Zoita, V.; EFDA Contributors, JET

    2014-11-01

    In magnetic confinement nuclear fusion devices such as JET, the last few years have witnessed a significant increase in the use of digital imagery, not only for the surveying and control of experiments, but also for the physical interpretation of results. More than 25 cameras are routinely used for imaging on JET in the infrared (IR) and visible spectral regions. These cameras can produce up to tens of Gbytes per shot and their information content can be very different, depending on the experimental conditions. However, the relevant information about the underlying physical processes is generally of much reduced dimensionality compared to the recorded data. The extraction of this information, which allows full exploitation of these diagnostics, is a challenging task. The image analysis consists, in most cases, of inverse problems which are typically ill-posed mathematically. The typology of objects to be analysed is very wide, and usually the images are affected by noise, low levels of contrast, low grey-level in-depth resolution, reshaping of moving objects, etc. Moreover, the plasma events have time constants of ms or tens of ms, which imposes tough conditions for real-time applications. On JET, in the last few years new tools and methods have been developed for physical information retrieval. The methodology of optical flow has allowed, under certain assumptions, the derivation of information about the dynamics of video objects associated with different physical phenomena, such as instabilities, pellets and filaments. The approach has been extended in order to approximate the optical flow within the MPEG compressed domain, allowing the manipulation of the large JET video databases and, in specific cases, even real-time data processing. The fast visible camera may provide new information that is potentially useful for disruption prediction. A set of methods, based on the extraction of structural information from the visual scene, have been developed for the
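
    The optical-flow methodology mentioned above is a generic computer-vision technique. As a rough illustration of the idea (not the JET analysis code itself; the input file name is hypothetical), the following Python sketch computes a dense flow field between consecutive frames with OpenCV's Farneback algorithm:

    ```python
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("camera_shot.avi")   # hypothetical input file
    ok, first = cap.read()
    if not ok:
        raise SystemExit("could not read video")
    prev = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Dense flow: one (dx, dy) displacement vector per pixel.
        flow = cv2.calcOpticalFlowFarneback(prev, gray, None, pyr_scale=0.5,
                                            levels=3, winsize=15, iterations=3,
                                            poly_n=5, poly_sigma=1.2, flags=0)
        speed = np.linalg.norm(flow, axis=2)
        print("max apparent motion (px/frame):", float(speed.max()))
        prev = gray
    ```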

  11. Overview of image processing tools to extract physical information from JET videos

    International Nuclear Information System (INIS)

    Craciunescu, T; Tiseanu, I; Zoita, V; Murari, A; Gelfusa, M

    2014-01-01

    In magnetic confinement nuclear fusion devices such as JET, the last few years have witnessed a significant increase in the use of digital imagery, not only for the surveying and control of experiments, but also for the physical interpretation of results. More than 25 cameras are routinely used for imaging on JET in the infrared (IR) and visible spectral regions. These cameras can produce up to tens of Gbytes per shot and their information content can be very different, depending on the experimental conditions. However, the relevant information about the underlying physical processes is generally of much reduced dimensionality compared to the recorded data. The extraction of this information, which allows full exploitation of these diagnostics, is a challenging task. The image analysis consists, in most cases, of inverse problems which are typically ill-posed mathematically. The typology of objects to be analysed is very wide, and usually the images are affected by noise, low levels of contrast, low grey-level in-depth resolution, reshaping of moving objects, etc. Moreover, the plasma events have time constants of ms or tens of ms, which imposes tough conditions for real-time applications. On JET, in the last few years new tools and methods have been developed for physical information retrieval. The methodology of optical flow has allowed, under certain assumptions, the derivation of information about the dynamics of video objects associated with different physical phenomena, such as instabilities, pellets and filaments. The approach has been extended in order to approximate the optical flow within the MPEG compressed domain, allowing the manipulation of the large JET video databases and, in specific cases, even real-time data processing. The fast visible camera may provide new information that is potentially useful for disruption prediction. A set of methods, based on the extraction of structural information from the visual scene, have been developed for the

  12. A Critical Review of the Integration of Geographic Information System and Building Information Modelling at the Data Level

    Directory of Open Access Journals (Sweden)

    Junxiang Zhu

    2018-02-01

    Full Text Available The benefits brought by the integration of Building Information Modelling (BIM) and Geographic Information Systems (GIS) are being proved by more and more research. The integration of the two systems is difficult for many reasons. Among them, data incompatibility is the most significant, as BIM and GIS data are created, managed, analyzed, stored, and visualized in different ways in terms of coordinate systems, scope of interest, and data structures. The objective of this paper is to review the relevant research papers to (1) identify the most relevant data models used in BIM/GIS integration and understand their advantages and disadvantages; (2) consider the possibility of other data models that are available for data level integration; and (3) provide direction on the future of BIM/GIS data integration.

  13. Extraction and Analysis of Information Related to Research & Development Declared Under an Additional Protocol

    International Nuclear Information System (INIS)

    Idinger, J.; Labella, R.; Rialhe, A.; Teller, N.

    2015-01-01

    The additional protocol (AP) provides important tools to strengthen and improve the effectiveness and efficiency of the safeguards system. Safeguards are designed to verify that States comply with their international commitments not to use nuclear material or to engage in nuclear-related activities for the purpose of developing nuclear weapons or other nuclear explosive devices. Under an AP based on INFCIRC/540, a State must provide to the IAEA additional information about, and inspector access to, all parts of its nuclear fuel cycle. In addition, the State has to supply information about its nuclear fuel cycle-related research and development (R&D) activities. The majority of States declare their R&D activities under the AP Articles 2.a.(i), 2.a.(x), and 2.b.(i) as part of initial declarations and their annual updates under the AP. In order to verify consistency and completeness of information provided under the AP by States, the Agency has started to analyze declared R&D information by identifying interrelationships between States in different R&D areas relevant to safeguards. The paper outlines the quality of R&D information provided by States to the Agency, describes how the extraction and analysis of relevant declarations are currently carried out at the Agency and specifies what kinds of difficulties arise during evaluation in respect to cross-linking international projects and finding gaps in reporting. In addition, the paper tries to elaborate how the reporting quality of AP information with reference to R&D activities and the assessment process of R&D information could be improved. (author)

  14. Management of information in development projects – a proposed integrated model

    Directory of Open Access Journals (Sweden)

    C. Bester

    2008-11-01

    Full Text Available The first section of the article focuses on the need for development in Africa and the specific challenges of development operations. It describes the need for a holistic and integrated information management model as part of the project management body of knowledge aimed at managing the information flow between communities and development project teams. It is argued that information, and access to information, is crucial in development projects and can therefore be seen as a critical success factor in any development project. In the second section of the article, the three information areas of the holistic and integrated information management model are described. In the section thereafter we suggest roles and actions for information managers to facilitate information processes integral to the model. These processes seek to create a developing information community that aligns itself with the development project, and supports and sustains it.

  15. Integrated management of information inside maintenance processes. From the building registry to BIM systems

    Directory of Open Access Journals (Sweden)

    Cinzia Talamo

    2014-10-01

    Full Text Available The paper presents objectives, methods and results of two research projects dealing with the improvement of integrated information management within maintenance processes. Focusing on information needs regarding the last phases of the building process, the two projects draft approaches characterizing a path of progressive improvement of strategies for integration: from a building registry, unique for the whole construction process, to an integrated management of the building process with the support of BIM systems.

  16. Integrating information technologies as tools for surgical research.

    Science.gov (United States)

    Schell, Scott R

    2005-10-01

    Surgical research is dependent upon information technologies. Selection of the computer, operating system, and software tool that best support the surgical investigator's needs requires careful planning before research commences. This manuscript presents a brief tutorial on how surgical investigators can best select these information technologies, with comparisons and recommendations between existing systems, software, and solutions. Privacy concerns, based upon HIPAA and other regulations, now require careful proactive attention to avoid legal penalties, civil litigation, and financial loss. Security issues are included as part of the discussions related to selection and application of information technology. This material was derived from a segment of the Association for Academic Surgery's Fundamentals of Surgical Research course.

  17. Zone analysis in biology articles as a basis for information extraction.

    Science.gov (United States)

    Mizuta, Yoko; Korhonen, Anna; Mullen, Tony; Collier, Nigel

    2006-06-01

    In the field of biomedicine, an overwhelming amount of experimental data has become available as a result of the high throughput of research in this domain. The amount of results reported has now grown beyond the limits of what can be managed by manual means. This makes it increasingly difficult for the researchers in this area to keep up with the latest developments. Information extraction (IE) in the biological domain aims to provide an effective automatic means to dynamically manage the information contained in archived journal articles and abstract collections and thus help researchers in their work. However, while considerable advances have been made in certain areas of IE, pinpointing and organizing factual information (such as experimental results) remains a challenge. In this paper we propose tackling this task by incorporating into IE information about rhetorical zones, i.e. classification of spans of text in terms of argumentation and intellectual attribution. As the first step towards this goal, we introduce a scheme for annotating biological texts for rhetorical zones and provide a qualitative and quantitative analysis of the data annotated according to this scheme. We also discuss our preliminary research on automatic zone analysis, and its incorporation into our IE framework.

  18. Extract the Relational Information of Static Features and Motion Features for Human Activities Recognition in Videos

    Directory of Open Access Journals (Sweden)

    Li Yao

    2016-01-01

    Full Text Available Both static features and motion features have shown promising performance in human activities recognition tasks. However, the information included in these features is insufficient for complex human activities. In this paper, we propose extracting relational information of static features and motion features for human activities recognition. The videos are represented by a classical Bag-of-Words (BoW) model which is useful in many works. To get a compact and discriminative codebook with small dimension, we employ the divisive algorithm based on KL-divergence to reconstruct the codebook. After that, to further capture strong relational information, we construct a bipartite graph to model the relationship between words of different feature sets. Then we use a k-way partition to create a new codebook in which similar words are grouped together. With this new codebook, videos can be represented by a new BoW vector with strong relational information. Moreover, we propose a method to compute new clusters from the divisive algorithm's projective function. We test our work on several datasets and obtain very promising results.
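
    As a rough sketch of the codebook-compaction step described above, the fragment below clusters visual words by the KL divergence between their class-conditional distributions. It is illustrative only, uses synthetic word distributions, and does not reproduce the paper's exact divisive algorithm or its bipartite-graph partitioning stage:

    ```python
    import numpy as np

    def kl(p, q, eps=1e-12):
        # KL divergence between two discrete distributions.
        p, q = p + eps, q + eps
        return float(np.sum(p * np.log(p / q)))

    def divisive_cluster(word_dists, k, iters=20, seed=0):
        """word_dists: (n_words, n_classes), rows are P(class | word)."""
        rng = np.random.default_rng(seed)
        centroids = word_dists[rng.choice(len(word_dists), k, replace=False)]
        for _ in range(iters):
            # Assign each word to the closest centroid in KL divergence.
            labels = np.array([np.argmin([kl(w, c) for c in centroids])
                               for w in word_dists])
            # Recompute each centroid as its cluster's mean distribution.
            for j in range(k):
                if np.any(labels == j):
                    centroids[j] = word_dists[labels == j].mean(axis=0)
        return labels   # words sharing a label collapse into one codebook entry

    words = np.random.default_rng(1).dirichlet(np.ones(5), size=100)
    print(divisive_cluster(words, k=10)[:10])
    ```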

  19. Stock market integration and the speed of information transmission

    Czech Academy of Sciences Publication Activity Database

    Černý, Alexandr; Koblas, M.

    2008-01-01

    Vol. 58, No. 1-2 (2008), pp. 2-20 ISSN 0015-1920 R&D Projects: GA MŠk LC542 Institutional research plan: CEZ:AV0Z70850503 Keywords: stock market integration * market comovement * intra-day data Subject RIV: AH - Economics Impact factor: 0.275, year: 2008 http://journal.fsv.cuni.cz/storage/1098_str_2_20_-_cerny-koblas.pdf

  20. Role of consciousness in temporal integration of semantic information.

    Science.gov (United States)

    Yang, Yung-Hao; Tien, Yung-Hsuan; Yang, Pei-Ling; Yeh, Su-Ling

    2017-10-01

    Previous studies found that word meaning can be processed unconsciously. Yet it remains unknown whether temporally segregated words can be integrated into a holistic meaningful phrase without consciousness. The first four experiments were designed to examine this by sequentially presenting the first three words of Chinese four-word idioms as prime to one eye and dynamic Mondrians to the other (i.e., the continuous flash suppression paradigm; CFS). An unmasked target word followed the three masked words in a lexical decision task. Results from such invisible (CFS) condition were compared with the visible condition where the preceding words were superimposed on the Mondrians and presented to both eyes. Lower performance in behavioral experiments and larger N400 event-related potentials (ERP) component for incongruent- than congruent-ending words were found in the visible condition. However, no such congruency effect was found in the invisible condition, even with enhanced statistical power and top-down attention, and with several potential confounding factors (contrast-dependent processing, long interval, no conscious training) excluded. Experiment 5 demonstrated that familiarity of word orientation without temporal integration can be processed unconsciously, excluding the possibility of general insensitivity of our paradigm. The overall result pattern therefore suggests that consciousness plays an important role in semantic temporal integration in the conditions we tested.

  1. Integrating Programming Language and Operating System Information Security Mechanisms

    Science.gov (United States)

    2016-08-31

    ...improve the precision of security enforcement, and to provide greater assurance of information security. This grant focuses on two key projects: language-based control of authority, and formal guarantees for the correctness of audit information.

  2. Extending Current Theories of Cross-Boundary Information Sharing and Integration: A Case Study of Taiwan e-Government

    Science.gov (United States)

    Yang, Tung-Mou

    2011-01-01

    Information sharing and integration has long been considered an important approach for increasing organizational efficiency and performance. With advancements in information and communication technologies, sharing and integrating information across organizations becomes more attractive and practical to organizations. However, achieving…

  3. Automated granularity to integrate digital information: the "Antarctic Treaty Searchable Database" case study

    Directory of Open Access Journals (Sweden)

    Paul Arthur Berkman

    2006-06-01

    Full Text Available Access to information is necessary, but not sufficient in our digital era. The challenge is to objectively integrate digital resources based on user-defined objectives for the purpose of discovering information relationships that facilitate interpretations and decision making. The Antarctic Treaty Searchable Database (http://aspire.nvi.net), which is in its sixth edition, provides an example of digital integration based on the automated generation of information granules that can be dynamically combined to reveal objective relationships within and between digital information resources. This case study further demonstrates that automated granularity and dynamic integration can be accomplished simply by utilizing the inherent structure of the digital information resources. Such information integration is relevant to library and archival programs that require long-term preservation of authentic digital resources.

  4. Cortical integrity of the inferior alveolar canal as a predictor of paresthesia after third-molar extraction.

    Science.gov (United States)

    Park, Wonse; Choi, Ji-Wook; Kim, Jae-Young; Kim, Bong-Chul; Kim, Hyung Jun; Lee, Sang-Hwy

    2010-03-01

    Paresthesia is a well-known complication of extraction of mandibular third molars (MTMs). The authors evaluated the relationship between paresthesia after MTM extraction and the cortical integrity of the inferior alveolar canal (IAC) by using computed tomography (CT). The authors designed a retrospective cohort study involving participants considered, on the basis of panoramic imaging, to be at high risk of experiencing injury of the inferior alveolar nerve who subsequently underwent CT imaging and extraction of the MTMs. The primary predictor variable was the contact relationship between the IAC and the MTM as viewed on a CT image, classified into three groups: group 1, no contact; group 2, contact between the MTM and the intact IAC cortex; group 3, contact between the MTM and the interrupted IAC cortex. The secondary predictor variable was the number of CT image slices showing the cortical interruption around the MTM. The outcome variable was the presence or absence of postoperative paresthesia after MTM extraction. The study sample comprised 179 participants who underwent MTM extraction (a total of 259 MTMs). Their mean age was 23.6 years, and 85 (47.5 percent) were male. The overall prevalence of paresthesia was 4.2 percent (11 of 259 teeth). The prevalence of paresthesia in group 3 (involving an interrupted IAC cortex) was 11.8 percent (10 of 85 cases), while for group 2 (involving an intact IAC cortex) and group 1 (involving no contact) it was 1.0 percent (1 of 98 cases) and 0.0 percent (no cases), respectively. The frequency of nerve damage increased with the number of CT image slices showing loss of cortical integrity (P=.043). The results of this study indicate that loss of IAC cortical integrity is associated with an increased risk of experiencing paresthesia after MTM extraction.

  5. A State-of-the-Art Review on the Integration of Building Information Modeling (BIM) and Geographic Information System (GIS)

    Directory of Open Access Journals (Sweden)

    Xin Liu

    2017-02-01

    Full Text Available The integration of Building Information Modeling (BIM) and Geographic Information System (GIS) has been identified as a promising but challenging topic to transform information towards the generation of knowledge and intelligence. Achievement of integrating these two concepts and enabling technologies will have a significant impact on solving problems in the civil, building and infrastructure sectors. However, since GIS and BIM were originally developed for different purposes, numerous challenges are being encountered for the integration. To better understand these two different domains, this paper reviews the development and dissimilarities of GIS and BIM, the existing integration methods, and investigates their potential in various applications. This study shows that the integration methods are developed for various reasons and aim to solve different problems. The parameters influencing the choice can be summarized and named as the “EEEF” criteria: effectiveness, extensibility, effort, and flexibility. Compared with other methods, semantic web technologies provide a promising and generalized integration solution. However, the biggest challenges of this method are the large efforts required at an early stage and the isolated development of ontologies within one particular domain. The isolation problem also applies to other methods. Therefore, openness is the key to the success of BIM and GIS integration.

  6. Health Information Infrastructure for People with Intellectual and Developmental Disabilities (I/DD) Living in Supported Accommodation: Communication, Co-Ordination and Integration of Health Information.

    Science.gov (United States)

    Dahm, Maria R; Georgiou, Andrew; Balandin, Susan; Hill, Sophie; Hemsley, Bronwyn

    2017-10-25

    People with intellectual and/or developmental disability (I/DD) commonly have complex health care needs, but little is known about how their health information is managed in supported accommodation, and across health services providers. This study aimed to describe the current health information infrastructure (i.e., how data and information are collected, stored, communicated, and used) for people with I/DD living in supported accommodation in Australia. It involved a scoping review and synthesis of research, policies, and health documents relevant in this setting. Iterative database and hand searches were conducted across peer-reviewed articles internationally in English and grey literature in Australia (New South Wales) up to September 2015. Data were extracted from the selected relevant literature and analyzed for content themes. Expert stakeholders were consulted to verify the authors' interpretations of the information and content categories. The 286 included sources (peer-reviewed n = 27; grey literature n = 259) reflect that the health information for people with I/DD in supported accommodation is poorly communicated, coordinated and integrated across isolated systems. 'Work-as-imagined', as outlined in policies, does not align with 'work-as-done' in reality. This gap threatens the quality of care and safety of people with I/DD in these settings. The effectiveness of the health information infrastructure and services for people with I/DD can be improved by integrating the information sources and placing people with I/DD and their supporters at the centre of the information exchange process.

  7. The application of integrated safety management principles to the Tritium Extraction Facility project

    International Nuclear Information System (INIS)

    Hickman, M.O.; Viviano, R.R.

    2000-01-01

    The DOE has developed a program that is accomplishing a heightened safety posture across the complex. The Integrated Safety Management (ISM) System (ISMS) program utilizes five core functions and seven guiding principles as the basis for implementation. The core functions define the work scope, analyze the hazards, develop and implement hazard controls, perform the work, and provide feedback for improvement. The guiding principles include line management responsibility, clear roles and responsibilities, competence per responsibilities, identification of safety standards/requirements, tailored hazard control, balanced priorities, and operations authorization. There exists an unspecified eighth principle, that is, worker involvement. A program requiring the direct involvement of the employees who are actually performing the work has been shown to be quite an effective method of communicating safety requirements, controlling work in a safe manner, and reducing safety violations and injuries. The Tritium Extraction Facility (TEF) project, a component of the DOE's Commercial Light Water Reactor Tritium Production program, has taken the ISM principles and core functions and applied them to the project's design. The task of the design team is to design a facility and systems that will meet the production requirements of the DOE tritium mission as well as a design that minimizes the workers' exposure to adverse safety situations and hazards/hazardous materials. During the development of the preliminary design for the TEF, design teams consisted of not only designers but also personnel who had operational experience in the existing tritium facilities and personnel who had specialized experience from across the DOE complex. This design team reviewed multiple documents associated with the TEF operation in order to identify and document the hazards associated with the tritium process. These documents include hazards

  8. Automated microfluidic devices integrating solid-phase extraction, fluorescent labeling, and microchip electrophoresis for preterm birth biomarker analysis.

    Science.gov (United States)

    Sahore, Vishal; Sonker, Mukul; Nielsen, Anna V; Knob, Radim; Kumar, Suresh; Woolley, Adam T

    2018-01-01

    We have developed multichannel integrated microfluidic devices for automated preconcentration, labeling, purification, and separation of preterm birth (PTB) biomarkers. We fabricated multilayer poly(dimethylsiloxane)-cyclic olefin copolymer (PDMS-COC) devices that perform solid-phase extraction (SPE) and microchip electrophoresis (μCE) for automated PTB biomarker analysis. The PDMS control layer had a peristaltic pump and pneumatic valves for flow control, while the PDMS fluidic layer had five input reservoirs connected to microchannels and a μCE system. The COC layers had a reversed-phase octyl methacrylate porous polymer monolith for SPE and fluorescent labeling of PTB biomarkers. We determined μCE conditions for two PTB biomarkers, ferritin (Fer) and corticotropin-releasing factor (CRF). We used these integrated microfluidic devices to preconcentrate and purify off-chip-labeled Fer and CRF in an automated fashion. Finally, we performed a fully automated on-chip analysis of unlabeled PTB biomarkers, involving SPE, labeling, and μCE separation with 1 h total analysis time. These integrated systems have strong potential to be combined with upstream immunoaffinity extraction, offering a compact sample-to-answer biomarker analysis platform. Graphical abstract Pressure-actuated integrated microfluidic devices have been developed for automated solid-phase extraction, fluorescent labeling, and microchip electrophoresis of preterm birth biomarkers.

  9. Developing Integrated Taxonomies for a Tiered Information Architecture

    Science.gov (United States)

    Dutra, Jayne E.

    2006-01-01

    This viewgraph presentation reviews the concept of developing taxonomies for an information architecture. In order to assist people with information access and retrieval, including cross-repository searching, a system of nested taxonomies is being developed. Another facet of this developmental project is collecting and documenting attributes about people, to allow for several uses: access management, i.e., who are you and what can you see?; targeted content delivery, i.e., what content helps you get your work done?; workforce planning, i.e., what skill sets do you have that we can apply to work?; and IT services, i.e., how can we provision you with the proper IT services?

  10. Toshiba integrated information system for design of nuclear power plants

    International Nuclear Information System (INIS)

    Abe, Yoko; Kawamura, Hirobumi; Sasaki, Norio; Takasaka, Kiyoshi

    1993-01-01

    TOSHIBA aims to secure safety, increase reliability and improve efficiency through the engineering for nuclear power plants and has been introducing Computer Aided Engineering (CAE). Up to the present, TOSHIBA has been developing computer systems which support each field of design and applying them to the design of nuclear power plants. The new design support system has been developed to integrate each of those systems in order to realize much greater improvement in accuracy and increase of reliability in design using state-of-the-art computer technology

  11. Computer-integrated design and information management for nuclear projects

    International Nuclear Information System (INIS)

    Gonzalez, A.; Martin-Guirado, L.; Nebrera, F.

    1987-01-01

    Over the past seven years, Empresarios Agrupados has been developing a comprehensive, computer-integrated system to perform the majority of the engineering, design, procurement and construction management activities in nuclear, fossil-fired as well as hydro power plant projects. This system, which is already in a production environment, comprises a large number of computer programs and data bases designed using a modular approach. Each software module, dedicated to meeting the needs of a particular design group or project discipline, facilitates the performance of functional tasks characteristic of the power plant engineering process

  12. MedEx: a medication information extraction system for clinical narratives

    Science.gov (United States)

    Stenner, Shane P; Doan, Son; Johnson, Kevin B; Waitman, Lemuel R; Denny, Joshua C

    2010-01-01

    Medication information is one of the most important types of clinical data in electronic medical records. It is critical for healthcare safety and quality, as well as for clinical research that uses electronic medical record data. However, medication data are often recorded in clinical notes as free-text. As such, they are not accessible to other computerized applications that rely on coded data. We describe a new natural language processing system (MedEx), which extracts medication information from clinical notes. MedEx was initially developed using discharge summaries. An evaluation using a data set of 50 discharge summaries showed it performed well on identifying not only drug names (F-measure 93.2%), but also signature information, such as strength, route, and frequency, with F-measures of 94.5%, 93.9%, and 96.0% respectively. We then applied MedEx unchanged to outpatient clinic visit notes. It performed similarly with F-measures over 90% on a set of 25 clinic visit notes. PMID:20064797
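
    MedEx itself is built on a semantic tagger and a parser; purely to illustrate the kind of signature fields it targets (strength, route, frequency), here is a toy regular-expression sketch in which the pattern, unit list, and sample note are all invented:

    ```python
    import re

    # Invented pattern covering drug name, strength, route, and frequency.
    SIG = re.compile(
        r"(?P<drug>[a-z]+)\s+"
        r"(?P<strength>\d+(?:\.\d+)?\s*(?:mg|mcg|g|units))\s+"
        r"(?P<route>po|iv|im|sc)\s+"
        r"(?P<freq>daily|bid|tid|qid|q\d+h)",
        re.IGNORECASE,
    )

    note = "Continue lisinopril 10 mg po daily; start metformin 500 mg po bid."
    for match in SIG.finditer(note):
        print(match.groupdict())
    ```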

  13. Videomicroscopic extraction of specific information on cell proliferation and migration in vitro

    International Nuclear Information System (INIS)

    Debeir, Olivier; Megalizzi, Veronique; Warzee, Nadine; Kiss, Robert; Decaestecker, Christine

    2008-01-01

    In vitro cell imaging is a useful exploratory tool for cell behavior monitoring with a wide range of applications in cell biology and pharmacology. Combined with appropriate image analysis techniques, this approach has been shown to provide useful information on the detection and dynamic analysis of cell events. In this context, numerous efforts have been focused on cell migration analysis. In contrast, the cell division process has been the subject of fewer investigations. The present work focuses on this latter aspect and shows that, in complement to cell migration data, interesting information related to cell division can be extracted from phase-contrast time-lapse image series, in particular cell division duration, which is not provided by standard cell assays using endpoint analyses. We illustrate our approach by analyzing the effects induced by two sigma-1 receptor ligands (haloperidol and 4-IBP) on the behavior of two glioma cell lines using two in vitro cell models, i.e., the low-density individual cell model and the high-density scratch wound model. This illustration also shows that the data provided by our approach are suggestive as to the mechanism of action of compounds, and are thus capable of informing the appropriate selection of further time-consuming and more expensive biological evaluations required to elucidate a mechanism

  14. 5W1H Information Extraction with CNN-Bidirectional LSTM

    Science.gov (United States)

    Nurdin, A.; Maulidevi, N. U.

    2018-03-01

    In this work, information about who did what, when, where, why, and how in Indonesian news articles was extracted by combining a Convolutional Neural Network and a Bidirectional Long Short-Term Memory network. Convolutional Neural Networks can learn semantically meaningful representations of sentences. Bidirectional LSTMs can analyze the relations among words in the sequence. We also use word2vec word embeddings for word representation. By combining these algorithms, we obtained an F-measure of 0.808. Our experiments show that CNN-BLSTM outperforms other shallow methods, namely IBk, C4.5, and Naïve Bayes, with F-measures of 0.655, 0.645, and 0.595, respectively.
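
    A minimal sketch of the CNN plus bidirectional-LSTM combination named in this record, written with the Keras API; the layer sizes, sequence length, and tag count are assumptions rather than the authors' published configuration:

    ```python
    from tensorflow.keras import layers, models

    VOCAB, EMB, MAXLEN, N_TAGS = 20000, 100, 80, 7   # assumed sizes; 6 5W1H tags + "other"

    model = models.Sequential([
        layers.Input(shape=(MAXLEN,)),               # token-id sequence
        layers.Embedding(VOCAB, EMB),                # word2vec-style lookup table
        layers.Conv1D(128, 3, padding="same", activation="relu"),       # n-gram features
        layers.Bidirectional(layers.LSTM(64, return_sequences=True)),   # two-way context
        layers.TimeDistributed(layers.Dense(N_TAGS, activation="softmax")),  # per-token tag
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    model.summary()
    ```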

  15. Developing a Process Model for the Forensic Extraction of Information from Desktop Search Applications

    Directory of Open Access Journals (Sweden)

    Timothy Pavlic

    2008-03-01

    Full Text Available Desktop search applications can contain cached copies of files that were deleted from the file system. Forensic investigators see this as a potential source of evidence, as documents deleted by suspects may still exist in the cache. Whilst there have been attempts at recovering data collected by desktop search applications, there is no methodology governing the process, nor discussion on the most appropriate means to do so. This article seeks to address this issue by developing a process model that can be applied when developing an information extraction application for desktop search applications, discussing preferred methods and the limitations of each. This work represents a more structured approach than other forms of current research.

  16. An innovative method for extracting isotopic information from low-resolution gamma spectra

    International Nuclear Information System (INIS)

    Miko, D.; Estep, R.J.; Rawool-Sullivan, M.W.

    1998-01-01

    A method is described for the extraction of isotopic information from attenuated gamma ray spectra using the gross-count material basis set (GC-MBS) model. This method solves for the isotopic composition of an unknown mixture of isotopes attenuated through an absorber of unknown material. For binary isotopic combinations the problem is nonlinear in only one variable and is easily solved using standard line optimization techniques. Results are presented for NaI spectrum analyses of various binary combinations of enriched uranium, depleted uranium, low burnup Pu, 137Cs, and 133Ba attenuated through a suite of absorbers ranging in Z from polyethylene through lead. The GC-MBS method results are compared to those computed using ordinary response function fitting and with a simple net peak area method. The GC-MBS method was found to be significantly more accurate than the other methods over the range of absorbers and isotopic blends studied
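
    The record notes that for binary isotopic combinations the problem is nonlinear in only one variable and can be solved by standard line optimization. A greatly simplified illustration of such a one-variable search, with synthetic basis spectra and ignoring the absorber attenuation that GC-MBS actually solves for, is:

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(0)
    basis_a = rng.random(64); basis_a /= basis_a.sum()   # stand-in response, isotope A
    basis_b = rng.random(64); basis_b /= basis_b.sum()   # stand-in response, isotope B
    measured = 0.3 * basis_a + 0.7 * basis_b + rng.normal(0.0, 1e-4, 64)

    def misfit(f):
        # Squared residual between the measurement and an f : (1 - f) mixture.
        model = f * basis_a + (1.0 - f) * basis_b
        return float(np.sum((measured - model) ** 2))

    res = minimize_scalar(misfit, bounds=(0.0, 1.0), method="bounded")
    print("estimated fraction of isotope A:", round(res.x, 3))
    ```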

  17. Integrated processing of ECG's in a hospital information system

    NARCIS (Netherlands)

    Helder, J.C.; Schram, P.H.; Verwey, H.; Meijler, F.L.; Robles de Medina, E.O.

    ECG handling in the University Hospital of Utrecht is performed by a system consisting of acquisition and storage of ECG signals, computer analysis, data management, and storage of readings in a patient database. The last two modules are part of a Hospital Information System (HIS). The modular

  18. Empowerment of Cancer Survivors Through Information Technology : An Integrative Review

    NARCIS (Netherlands)

    Groen, Wim G.; Kuijpers, Wilma; Oldenburg, Hester S.A.; Wouters, Michel W.J.M.; Aaronson, Neil K.; van Harten, Willem H.

    2015-01-01

    Background: Patient empowerment may be an effective approach to strengthen the role of cancer survivors and to reduce the burden on health care. However, it is not well conceptualized, notably in oncology. Furthermore, it is unclear to what extent information technology (IT) services can contribute

  19. Empowerment of cancer survivors through information technology: an integrative review

    NARCIS (Netherlands)

    Groen, W.G.; Kuijpers, W.; Oldenburg, H.S.A.; Wouters, M.W.J.M.; Aaronson, N.K.; van Harten, W.H.

    2015-01-01

    Background: Patient empowerment may be an effective approach to strengthen the role of cancer survivors and to reduce the burden on health care. However, it is not well conceptualized, notably in oncology. Furthermore, it is unclear to what extent information technology (IT) services can contribute

  20. Exploitation of Unintentional Information Leakage from Integrated Circuits

    Science.gov (United States)

    Cobb, William E.

    2011-01-01

    The information leakage of electronic devices, especially those used in cryptographic or other vital applications, represents a serious practical threat to secure systems. While physical implementation attacks have evolved rapidly over the last decade, relatively little work has been done to allow system designers to effectively counter the…

  1. An Art Information System: From Integration to Interpretation.

    Science.gov (United States)

    Barnett, Patricia J.

    1988-01-01

    Explores the qualities of bibliographic and object entities that contribute to the similarities and differences in the data describing them and the possibility of cooperation between indexers of art objects and indexers of bibliographic information. The discussion covers the role of authority control as the linking component between bibliographic…

  2. Exploration into technical procedures for vertical integration. [information systems

    Science.gov (United States)

    Michel, R. J.; Maw, K. D.

    1979-01-01

    Issues in the design and use of a digital geographic information system incorporating landuse, zoning, hazard, LANDSAT, and other data are discussed. An eleven layer database was generated. Issues in spatial resolution, registration, grid versus polygonal structures, and comparison of photointerpreted landuse to LANDSAT land cover are examined.

  3. Smart Libraries: Integrating Communications Channels and Information Sources.

    Science.gov (United States)

    Webb, T. D.; Jensen, E. A.

    Noting that the changing nature of information delivery has established immediacy as the new basis for modern library service, this paper describes the new facilities design and floor plan for the library of Kapiolani Community College of the University of Hawaii. The new library was carefully designed so that students can move progressively from…

  4. Toward an Information Integration Approach to Issue Advertising.

    Science.gov (United States)

    Douglas, William; And Others

    Issue advertising is intended to inform an audience--most commonly with the intent of changing unfavorable opinions or reinforcing favorable ones--to affect cognition (in contrast to the behavioral emphasis of product and service advertising, intended to stimulate trial and adoption). To explore public reactions to printed and televised issues…

  5. Intensive care unit nurses' information needs and recommendations for integrated displays to improve nurses' situation awareness.

    Science.gov (United States)

    Koch, Sven H; Weir, Charlene; Haar, Maral; Staggers, Nancy; Agutter, Jim; Görges, Matthias; Westenskow, Dwayne

    2012-01-01

    Fatal errors can occur in intensive care units (ICUs). Researchers claim that information integration at the bedside may improve nurses' situation awareness (SA) of patients and decrease errors. However, it is unclear which information should be integrated and in what form. Our research uses the theory of SA to analyze the type of tasks and their associated information gaps. We aimed to provide recommendations for integrated, consolidated information displays to improve nurses' SA. Systematic observation methods were used to follow 19 ICU nurses for 38 hours in 3 clinical practice settings. Storyboard methods and concept mapping helped to categorize the observed tasks, the associated information needs, and the information gaps of the most frequent tasks by SA level. Consensus and discussion within the research team were used to propose recommendations to improve information displays at the bedside based on information deficits. Nurses performed 46 different tasks at a rate of 23.4 tasks per hour. The information needed to perform the most common tasks was often inaccessible, difficult to see at a distance or located on multiple monitoring devices. Current devices at the ICU bedside do not adequately support a nurse's information-gathering activities. Medication management was the most frequent category of tasks. Information gaps were present at all levels of SA and across most of the tasks. Using a theoretical model to understand information gaps can aid in designing functional requirements. Integrated information that enhances nurses' SA may decrease errors and improve patient safety in the future.

  6. Economic Analysis of an Integrated Annatto Seeds-Sugarcane Biorefinery Using Supercritical CO2 Extraction as a First Step

    Directory of Open Access Journals (Sweden)

    Juliana Q. Albarelli

    2016-06-01

    Full Text Available Recently, supercritical fluid extraction (SFE) has been proposed for use as part of a biorefinery, rather than as a stand-alone technology, since besides selectively extracting added-value compounds it has been shown to have a positive effect on the downstream processing of biomass. To this end, this work economically evaluates the encouraging experimental results regarding the use of SFE during annatto seeds valorization. Additionally, other features are discussed, such as the benefits of enhancing the bioactive compounds concentration through physical processes and of integrating the proposed annatto seeds biorefinery with a hypothetical sugarcane biorefinery, which produces its essential inputs, e.g., CO2, ethanol, heat and electricity. For this, first, different configurations were modeled and simulated using the commercial simulator Aspen Plus® to determine the mass and energy balances. Next, each configuration was economically assessed using MATLAB. SFE proved to be decisive to the economic feasibility of the proposed annatto seeds-sugarcane biorefinery concept. SFE pretreatment associated with a sequential fine-particle separation process enabled higher production of a bixin-rich extract using a low-pressure solvent extraction method employing ethanol, while a tocotrienols-rich extract is obtained as a first product. Nevertheless, the economic evaluation showed that increasing tocotrienols-rich extract production has a more pronounced positive impact on the economic viability of the concept.

  7. Data reduction pipeline for the CHARIS integral-field spectrograph I: detector readout calibration and data cube extraction

    Science.gov (United States)

    Brandt, Timothy D.; Rizzo, Maxime; Groff, Tyler; Chilcote, Jeffrey; Greco, Johnny P.; Kasdin, N. Jeremy; Limbach, Mary Anne; Galvin, Michael; Loomis, Craig; Knapp, Gillian; McElwain, Michael W.; Jovanovic, Nemanja; Currie, Thayne; Mede, Kyle; Tamura, Motohide; Takato, Naruhisa; Hayashi, Masahiko

    2017-10-01

    We present the data reduction pipeline for CHARIS, a high-contrast integral-field spectrograph for the Subaru Telescope. The pipeline constructs a ramp from the raw reads using the measured nonlinear pixel response and reconstructs the data cube using one of three extraction algorithms: aperture photometry, optimal extraction, or χ2 fitting. We measure and apply both a detector flatfield and a lenslet flatfield and reconstruct the wavelength- and position-dependent lenslet point-spread function (PSF) from images taken with a tunable laser. We use these measured PSFs to implement a χ2-based extraction of the data cube, with typical residuals of ~5% due to imperfect models of the undersampled lenslet PSFs. The full two-dimensional residual of the χ2 extraction allows us to model and remove correlated read noise, dramatically improving CHARIS's performance. The χ2 extraction produces a data cube that has been deconvolved with the line-spread function and never performs any interpolations of either the data or the individual lenslet spectra. The extracted data cube also includes uncertainties for each spatial and spectral measurement. CHARIS's software is parallelized, written in Python and Cython, and freely available on github with a separate documentation page. Astrometric and spectrophotometric calibrations of the data cubes and PSF subtraction will be treated in a forthcoming paper.
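
    The χ2 extraction described above amounts to fitting each lenslet's pixels as a linear combination of per-wavelength PSF templates. A toy weighted-least-squares version of that idea, using random stand-in templates rather than CHARIS's measured PSFs, is sketched below:

    ```python
    import numpy as np

    n_pix, n_wave = 200, 22                      # pixels per lenslet, spectral channels
    rng = np.random.default_rng(0)
    templates = rng.random((n_pix, n_wave))      # column l: PSF template at wavelength l
    true_spec = rng.random(n_wave)
    sigma = 0.01 * np.ones(n_pix)                # per-pixel noise estimate
    data = templates @ true_spec + rng.normal(0.0, sigma)

    # Weighted least squares: minimize sum_i ((d_i - sum_l T_il s_l) / sigma_i)^2.
    A = templates / sigma[:, None]
    b = data / sigma
    spec, *_ = np.linalg.lstsq(A, b, rcond=None)
    print("max extraction error:", float(np.abs(spec - true_spec).max()))
    ```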

  8. EnvMine: A text-mining system for the automatic extraction of contextual information

    Directory of Open Access Journals (Sweden)

    de Lorenzo Victor

    2010-06-01

    Full Text Available Abstract Background For ecological studies, it is crucial to count on adequate descriptions of the environments and samples being studied. Such a description must be done in terms of their physicochemical characteristics, allowing a direct comparison between different environments that would be difficult to do otherwise. Also the characterization must include the precise geographical location, to make possible the study of geographical distributions and biogeographical patterns. Currently, there is no schema for annotating these environmental features, and these data have to be extracted from textual sources (published articles). So far, this had to be performed by manual inspection of the corresponding documents. To facilitate this task, we have developed EnvMine, a set of text-mining tools devoted to retrieving contextual information (physicochemical variables and geographical locations) from textual sources of any kind. Results EnvMine is capable of retrieving the physicochemical variables cited in the text, by means of the accurate identification of their associated units of measurement. In this task, the system achieves a recall (percentage of items retrieved) of 92% with less than 1% error. Also a Bayesian classifier was tested for distinguishing parts of the text describing environmental characteristics from others dealing with, for instance, experimental settings. Regarding the identification of geographical locations, the system takes advantage of existing databases such as GeoNames to achieve 86% recall with 92% precision. The identification of a location includes also the determination of its exact coordinates (latitude and longitude), thus allowing the calculation of distance between the individual locations. Conclusion EnvMine is a very efficient method for extracting contextual information from different text sources, like published articles or web pages. This tool can help in determining the precise location and physicochemical
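
    EnvMine's recognition of physicochemical variables hinges on identifying numeric values through their associated units of measurement. A minimal sketch of that unit-anchored matching, with an invented unit list far smaller than EnvMine's, is:

    ```python
    import re

    # Invented mini unit lexicon; EnvMine's real lexicon is far larger.
    UNITS = r"(?:°C|mg/l|g/l|mM|µM|km|m|%)"
    VAR = re.compile(rf"(?P<value>-?\d+(?:\.\d+)?)\s*(?P<unit>{UNITS})")

    text = "Samples were taken at 35 m depth and 12.5 °C, salinity 3.1 %."
    for match in VAR.finditer(text):
        print(match.group("value"), match.group("unit"))
    ```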

  9. An Integrated Hydrologic Model and Remote Sensing Synthesis Approach to Study Groundwater Extraction During a Historic Drought in the California Central Valley

    Science.gov (United States)

    Thatch, L. M.; Maxwell, R. M.; Gilbert, J. M.

    2017-12-01

    Over the past century, groundwater levels in California's San Joaquin Valley have dropped more than 30 meters in some areas due to excessive groundwater extraction to irrigate agricultural lands and feed a growing population. Between 2012 and 2016 California experienced the worst drought in its recorded history, further exacerbating this groundwater depletion. Due to lack of groundwater regulation, exact quantities of extracted groundwater in California are unknown and hard to quantify. We use a synthesis of integrated hydrologic model simulations and remote sensing products to quantify the impact of drought and groundwater pumping on the Central Valley water tables. The ParFlow-CLM model was used to evaluate groundwater depletion in the San Joaquin River basin under multiple groundwater extraction scenarios simulated from pre-drought through recent drought years. Extraction scenarios included pre-development conditions, with no groundwater pumping; historical conditions based on decreasing groundwater level measurements; and estimated groundwater extraction rates calculated from the deficit between the predicted crop water demand, based on county land use surveys, and available surface water supplies. Results were compared to NASA's Gravity Recovery and Climate Experiment (GRACE) data products to constrain water table decline from groundwater extraction during severe drought. This approach untangles various factors leading to groundwater depletion within the San Joaquin Valley both during drought and years of normal recharge to help evaluate which areas are most susceptible to groundwater overdraft, as well as further evaluating the spatially and temporally variable sustainable yield. Recent efforts to improve water management and ensure reliable water supplies are highlighted by California's Sustainable Groundwater Management Act (SGMA) which mandates Groundwater Sustainability Agencies to determine the maximum quantity of groundwater that can be withdrawn through
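
    The pumping estimate described in this record is, at its core, a water-balance deficit: unmetered extraction is approximated as crop water demand minus delivered surface water. A back-of-the-envelope sketch with made-up numbers:

    ```python
    # Hypothetical annual figures for one basin; both numbers are made up.
    crop_demand_km3 = 13.0      # predicted irrigation water requirement
    surface_supply_km3 = 8.5    # delivered surface water

    # Unmetered pumping is approximated as the unmet remainder of demand.
    pumping_km3 = max(0.0, crop_demand_km3 - surface_supply_km3)
    print(f"estimated groundwater extraction: {pumping_km3:.1f} km^3/yr")
    ```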

  10. VaProS: a database-integration approach for protein/genome information retrieval

    KAUST Repository

    Gojobori, Takashi; Ikeo, Kazuho; Katayama, Yukie; Kawabata, Takeshi; Kinjo, Akira R.; Kinoshita, Kengo; Kwon, Yeondae; Migita, Ohsuke; Mizutani, Hisashi; Muraoka, Masafumi; Nagata, Koji; Omori, Satoshi; Sugawara, Hideaki; Yamada, Daichi; Yura, Kei

    2016-01-01

    Life science research now heavily relies on all sorts of databases for genome sequences, transcription, protein three-dimensional (3D) structures, protein–protein interactions, phenotypes and so forth. The knowledge accumulated by all the omics research is so vast that a computer-aided search of data is now a prerequisite for starting a new study. In addition, a combinatory search throughout these databases has a chance to extract new ideas and new hypotheses that can be examined by wet-lab experiments. By virtually integrating the related databases on the Internet, we have built a new web application that facilitates life science researchers for retrieving experts’ knowledge stored in the databases and for building a new hypothesis of the research target. This web application, named VaProS, puts stress on the interconnection between the functional information of genome sequences and protein 3D structures, such as structural effect of the gene mutation. In this manuscript, we present the notion of VaProS, the databases and tools that can be accessed without any knowledge of database locations and data formats, and the power of search exemplified in quest of the molecular mechanisms of lysosomal storage disease. VaProS can be freely accessed at http://p4d-info.nig.ac.jp/vapros/.

  11. VaProS: a database-integration approach for protein/genome information retrieval

    KAUST Repository

    Gojobori, Takashi

    2016-12-24

    Life science research now heavily relies on all sorts of databases for genome sequences, transcription, protein three-dimensional (3D) structures, protein–protein interactions, phenotypes and so forth. The knowledge accumulated by all the omics research is so vast that a computer-aided search of data is now a prerequisite for starting a new study. In addition, a combinatory search throughout these databases has a chance to extract new ideas and new hypotheses that can be examined by wet-lab experiments. By virtually integrating the related databases on the Internet, we have built a new web application that facilitates life science researchers for retrieving experts’ knowledge stored in the databases and for building a new hypothesis of the research target. This web application, named VaProS, puts stress on the interconnection between the functional information of genome sequences and protein 3D structures, such as structural effect of the gene mutation. In this manuscript, we present the notion of VaProS, the databases and tools that can be accessed without any knowledge of database locations and data formats, and the power of search exemplified in quest of the molecular mechanisms of lysosomal storage disease. VaProS can be freely accessed at http://p4d-info.nig.ac.jp/vapros/.

  12. A Concept Lattice for Semantic Integration of Geo-Ontologies Based on Weight of Inclusion Degree Importance and Information Entropy

    Directory of Open Access Journals (Sweden)

    Jia Xiao

    2016-11-01

    Full Text Available Constructing a merged concept lattice with formal concept analysis (FCA) is an important research direction in the field of integrating multi-source geo-ontologies. Extracting essential geographical properties and reducing the concept lattice are two key points of previous research. A formal integration method is proposed to address the challenges in these two areas. We first extract essential properties from multi-source geo-ontologies and use FCA to build a merged formal context. Second, the combined importance weight of each single attribute of the formal context is calculated by introducing the inclusion degree importance from rough set theory and information entropy; then a weighted formal context is built from the merged formal context. Third, a combined weighted concept lattice is established from the weighted formal context with FCA and the importance weight value of every concept is defined as the sum of the weights of the attributes belonging to the concept's intent. Finally, the semantic granularity of a concept is defined by its importance weight; we then gradually reduce the weighted concept lattice by setting up diminishing thresholds of semantic granularity. Additionally, all of those reduced lattices are organized into a regular hierarchy structure based on the threshold of semantic granularity. A workflow is designed to demonstrate this procedure. A case study is conducted to show the feasibility and validity of this method and the procedure to integrate multi-source geo-ontologies.
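
    As a rough illustration of the attribute weighting described above, the sketch below computes, for a tiny formal context, an inclusion-degree term and an information-entropy term per attribute, combines them (the product used here is an assumption; the paper's exact combination rule may differ), and scores a concept by summing the weights over its intent:

    ```python
    import numpy as np

    # Tiny formal context: rows = objects, columns = attributes
    # (1 means the object has the attribute).
    ctx = np.array([[1, 0, 1, 1],
                    [1, 1, 0, 1],
                    [0, 1, 1, 1]], dtype=float)

    eps = 1e-12
    p = np.clip(ctx.mean(axis=0), eps, 1 - eps)   # inclusion degree per attribute
    # Binary information entropy of each attribute.
    h = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
    weights = p * h                               # assumed combination rule

    intent = [0, 3]                               # attributes in one concept's intent
    print("concept importance weight:", weights[intent].sum())
    ```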

  13. Implementation of integrated heterogeneous electronic electrocardiography data into Maharaj Nakorn Chiang Mai Hospital Information System.

    Science.gov (United States)

    Khumrin, Piyapong; Chumpoo, Pitupoom

    2016-03-01

    Electrocardiography is one of the most important non-invasive diagnostic tools for diagnosing coronary heart disease. The electrocardiography information system in Maharaj Nakorn Chiang Mai Hospital required a massive manual labor effort. In this article, we propose an approach toward the integration of heterogeneous electrocardiography data and the implementation of an integrated electrocardiography information system into the existing Hospital Information System. The system integrates different electrocardiography formats into a consistent electrocardiography rendering by using Java software. The interface acts as middleware to seamlessly integrate different electrocardiography formats. Instead of using a common electrocardiography protocol, we applied a central format based on Java classes for mapping different electrocardiography formats which contains a specific parser for each electrocardiography format to acquire the same information. Our observations showed that the new system improved the effectiveness of data management, work flow, and data quality; increased the availability of information; and finally improved quality of care. © The Author(s) 2014.
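
    The "central format" approach described in this record can be pictured as one shared ECG representation plus one parser per vendor format. The sketch below uses Python for consistency with the other sketches here, although the system described is written in Java; all class and field names are invented:

    ```python
    from dataclasses import dataclass

    @dataclass
    class Ecg:                      # the shared internal representation
        patient_id: str
        sample_rate_hz: int
        leads: dict                 # lead name -> list of samples

    def parse_vendor_a(raw: str) -> Ecg:
        pid, rate, samples = raw.split(";")
        return Ecg(pid, int(rate), {"I": [float(x) for x in samples.split(",")]})

    def parse_vendor_b(raw: str) -> Ecg:
        fields = dict(kv.split("=") for kv in raw.split("|"))
        return Ecg(fields["id"], int(fields["sr"]),
                   {"I": [float(x) for x in fields["lead1"].split(",")]})

    PARSERS = {"A": parse_vendor_a, "B": parse_vendor_b}

    def ingest(fmt: str, raw: str) -> Ecg:
        # Every source format converges on the same Ecg class.
        return PARSERS[fmt](raw)

    print(ingest("A", "P001;500;0.1,0.2,0.3"))
    ```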

  14. Geographic Information Systems for the Regional Integration of Renewable Energies

    International Nuclear Information System (INIS)

    Amador Guerra, J.; Dominguez Bravo, J.

    2000-01-01

    This report is based on the project 'The GIS in the regional integration of Renewable Energies for decentralised electricity production', developed by CIEMAT (Spanish Energy Research Centre) and UPM (Polytechnic University of Madrid, Spain) since 1997. The objective of this project is to analyse, evaluate and improve GIS methodologies for application in RE, and to determine how GIS can aid in the evaluation and simulation of the influence of technical, socio-economic and geographical parameters. The project began with a review of the SOLARGIS methodology. SOLARGIS was developed by a European research team (including CIEMAT) in the frame of the JOULE II Programme. The report first describes the state of the art in the application of GIS to Renewable Energies; second, it covers the SOLARGIS review tasks and the application of this new product to Lorca (Murcia Region, Spain); finally, it describes the methodology for the spatial sensitivity analysis. (Author) 24 refs

  15. Smoother Sailing Ahead: Integrating Information Technology into the Surface Navy.

    Science.gov (United States)

    1994-09-01

    foreign competition, Ford Motor Company examined many areas of its operations. In its supply procurement division Ford employed over 500 personnel...generating reams of paperwork that contained many inventory discrepancies and billing errors. Mazda Motors in Japan performed the same function better...with five employees. (Schnitt, 1993, p. 19) The competition forced Ford to adopt new methods using modern information technology. Instead of pieces of

  16. Integration of Heterogeneous Bibliographic Information through Data Abstractions.

    Science.gov (United States)

    1986-01-01

    11-12 [COMPENDEX] a) Electronics v 56 n 7 Apr 7 1983 p 155-157. b) IEEE Trans Magn v Mag-14 n 5 Sep 1978, INTERMAG (Int. Magn.) Conf, Florence, Italy ...developed geographically distributed information systems such as DOE/RECON, DOD/DROLS, NASA/RECON, CAS On-Line, DARC (France) and DECHEMA (West Germany)

  17. Study on advanced systematic function of the JNC geological disposal technical information integration system. Research document

    International Nuclear Information System (INIS)

    Ishihara, Yoshinao; Fukui, Hiroshi; Sagawa, Hiroshi; Matsunaga, Kenichi; Ito, Takaya

    2004-02-01

    In this study, the technical know-how shared among the geological environment field, the disposal technology (design) field, and the safety assessment field was systematized, and information-sharing functions, aimed at promoting the exchange and use of information between the technical information management databases built for each field, were considered as an advancement of the JNC Geological Disposal Technical Information Integration System, together with the system functions needed to realize the integration of technical information. (1) Concrete information on the geological environment, which is gradually updated and extended as geological disposal research progresses, must be suitably reflected in design and safety assessment research. After arranging a form suitable for systematizing technical information, the technical information in both the design and safety assessment fields was organized into two classes based on tasks and works, and the planning and coordination of the delivery of technical information from the geological environment field was systematized. (2) To integrate the technical information of the three geological disposal fields, functions for the mutual use of information managed in two or more databases were considered, based on the results of the systematization study. In addition, system functions such as management of the usage history of technical information, linkage of information use, and notification of common information were considered, and system operation windows designed for ease of operation were examined. (author)

  18. Extraction of prospecting information of uranium deposit based on high spatial resolution satellite data. Taking bashibulake region as an example

    International Nuclear Information System (INIS)

    Yang Xu; Liu Dechang; Zhang Jielin

    2008-01-01

    In this study, the significance and content of prospecting information for uranium deposits are expounded. Quickbird high spatial resolution satellite data are used to extract prospecting information for uranium deposits in the Bashibulake area in the north of the Tarim Basin. By using pertinent image processing methods, information on the ore-bearing bed, ore-controlling structures and mineralized alteration has been extracted. The results show a high consistency with the field survey. The aim of this study is to explore the practicability of high spatial resolution satellite data for mineral prospecting, and to broaden thinking on prospecting in similar areas. (authors)

  19. A Study on Information Technology Integrated Guided Discovery Instruction towards Students' Learning Achievement and Learning Retention

    Science.gov (United States)

    Shieh, Chich-Jen; Yu, Lean

    2016-01-01

    In the information explosion era with constant changes of information, educators have promoted various effective learning strategies for students adapting to the complex modern society. The impact and influence of traditional teaching method have information technology integrated modern instruction and science concept learning play an important…

  20. Integrated care: an Information Model for Patient Safety and Vigilance Reporting Systems.

    Science.gov (United States)

    Rodrigues, Jean-Marie; Schulz, Stefan; Souvignet, Julien

    2015-01-01

    Quality management information systems for safety as a whole or for specific vigilances share the same information types but are not interoperable. An international initiative is working to develop an integrated information model for patient safety and vigilance reporting to support a global approach to health care quality.

  1. Effects of Brief Integrated Information Literacy Education Sessions on Undergraduate Engineering Students' Interdisciplinary Research

    Science.gov (United States)

    Talikka, Marja; Soukka, Risto; Eskelinen, Harri

    2018-01-01

    Engineering students often conduct information searches without sufficient consideration of the context of their research topic. This article discusses how development of a new information literacy (IL) mindset through instruction in integrated IL education affects students' understanding of research problems and formulation of information search…

  2. Multi-fields' coordination information integrated platform for nuclear power plant operation preparation

    International Nuclear Information System (INIS)

    Yuan Chang; Li Yong; Ye Zhiqiang

    2011-01-01

    To realize coordination of work and information sharing across multiple fields, the business architecture, functional flow and application architecture of a nuclear power plant operation preparation information integration platform are designed by applying the Enterprise Architecture (EA) method, enabling information sharing and coordination among the fields involved. (authors)

  3. Microscope-integrated intraoperative optical coherence tomography-guided small-incision lenticule extraction: New surgical technique.

    Science.gov (United States)

    Sharma, Namrata; Urkude, Jayanand; Chaniyara, Manthan; Titiyal, Jeewan S

    2017-10-01

    We describe the surgical technique of microscope-integrated intraoperative optical coherence tomography (OCT)-guided small-incision lenticule extraction. The technique enables manual tracking of surgical instruments and identification of the desired dissection plane. It also helps discern the relation between the dissector and the intrastromal lenticule. The dissection plane becomes hyperreflective on dissection, ensuring complete separation of the intrastromal lenticule from the overlying and underlying stroma. Inadvertent posterior plane entry, cap-lenticule adhesion, incomplete separation of the lenticule, creation of a false plane, and lenticule remnants may be recognized intraoperatively so that corrective steps can be taken immediately. In cases with a hazy overlying cap, microscope-integrated intraoperative OCT enables localization and extraction of the lenticule. The technique is helpful for inexperienced surgeons, especially in cases with low amplitudes of refractive error, i.e., thin lenticules.

  4. Efficacy of integrating information literacy education into a women's health course on information literacy for RN-BSN students.

    Science.gov (United States)

    Ku, Ya-Lie; Sheu, Sheila; Kuo, Shih-Ming

    2007-03-01

    Information literacy, essential to evidence-based nursing, can promote nurses' capability for life-long learning. Nursing education should strive to employ information literacy education in nursing curricula to improve information literacy abilities among nursing students. This study explored the effectiveness of information literacy education by comparing information literacy skills between a group of RN-BSN (Registered Nurse to Bachelor of Science in Nursing) students who received information literacy education and a group that did not. This quasi-experimental study was conducted during a women's health issues course taught between March and June 2004. Content was presented to the 32 RN-BSN students enrolled in this course, which also taught skills in searching and screening, integrating, analyzing, applying, and presenting information. At the beginning and end of the program, 75 RN-BSN students self-evaluated their attained skills in searching and screening, integrating, analyzing, applying, and presenting information on a 10-point Likert scale. Results identified no significant differences between the experimental (n = 32) and control (n = 43) groups in terms of age, marital status, job title, work unit, years of work experience, and information literacy skills as measured at the beginning of the semester. At the end of the semester during which the content was taught, the information literacy of the experimental group in all categories, with the exception of information presentation, was significantly improved compared to that of the control group. Results were especially significant in the integrating, analyzing, and applying skill categories. It is hoped that in the future nursing students will apply enhanced information literacy to address and resolve patients' health problems in clinical settings.

  5. Integrated photooxidative extractive deep desulfurization using metal doped TiO2 and eutectic based ionic liquid

    Science.gov (United States)

    Zaid, Hayyiratul Fatimah Mohd; Kait, Chong Fai; Mutalib, Mohamed Ibrahim Abdul

    2016-11-01

    A series of metal-doped TiO2 photocatalysts, namely Fe/TiO2, Cu/TiO2 and Cu-Fe/TiO2, were synthesized and characterized for use in integrated photooxidative extractive deep desulfurization of model oil (dodecane) and diesel fuel. The order of photocatalytic activity was Cu-Fe/TiO2, followed by Cu/TiO2 and then Fe/TiO2. Cu-Fe/TiO2 was an effective photocatalyst for sulfur conversion at ambient atmospheric pressure. Hydrogen peroxide was used as the oxidant source and a eutectic-based ionic liquid as the extractant. Sulfur conversion in the model oil reached 100%. Sulfur was removed from the model oil by two extraction runs, with 97.06% removed in the first run and 2.94% in the second.

  6. Three-tiered integration of PACS and HIS toward next generation total hospital information system.

    Science.gov (United States)

    Kim, J H; Lee, D H; Choi, J W; Cho, H I; Kang, H S; Yeon, K M; Han, M C

    1998-01-01

    The Seoul National University Hospital (SNUH) started a project to modernize its hospital information facilities. This project includes installation of a high speed hospital network and development of a new HIS, OCS (order communication system), RIS, and PACS. It aims at the implementation of the first total hospital information system by seamlessly integrating these systems. To achieve this goal, we took a three-tiered systems integration approach: network level, database level, and workstation level integration. There are three network loops in SNUH: a proprietary star network for the host computer based HIS, an Ethernet based hospital LAN for the OCS and RIS, and an ATM based network for PACS. They are linked together at the backbone level to allow high speed communication between these systems. We have developed special communication modules for each system that allow data interchange between different databases and computer platforms. We have also developed an integrated workstation in which both the OCS and PACS application programs run on a single computer in an integrated manner, allowing clinical users to access and display radiological images as well as textual clinical information within a single user environment. A study is in progress toward a total hospital information system in SNUH that seamlessly integrates the main hospital information resources such as HIS, OCS, and PACS. With the three-tiered systems integration approach, we could successfully integrate these systems from the network level up to the user application level.
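
    A minimal sketch of the database-level integration idea, with invented field names throughout: per-system adapter functions translate native records into one shared schema, which a workstation-level view can then join.

        # Hedged sketch of database-level integration: per-system adapters map
        # native records into one shared schema (all field names are invented).
        HIS_ROW = {"PT_NO": "123", "PT_NM": "Hong Gildong"}   # host-based HIS record
        RIS_ROW = {"patient_id": "123", "study": "Chest PA"}  # RIS record

        def from_his(row):
            return {"patient_id": row["PT_NO"], "name": row["PT_NM"]}

        def from_ris(row):
            return {"patient_id": row["patient_id"], "study": row["study"]}

        def merged_patient_view(his_row, ris_row):
            """Workstation-level view joining HIS demographics with RIS studies."""
            assert his_row["PT_NO"] == ris_row["patient_id"]  # join key must match
            return {**from_his(his_row), **from_ris(ris_row)}

        print(merged_patient_view(HIS_ROW, RIS_ROW))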

  7. Specification of an integrated information architecture for a mobile teleoperated robot for home telecare.

    Science.gov (United States)

    Iannuzzi, David; Grant, Andrew; Corriveau, Hélène; Boissy, Patrick; Michaud, Francois

    2016-12-01

    The objective of this study was to design an effectively integrated information architecture for a mobile teleoperated robot assisting remote delivery of home health care. Three role classes were identified related to the deployment of a telerobot, namely engineer, technology integrator, and health professional. Patients and natural caregivers were considered indirectly, this being a component of future field studies. Interviews with representatives of each class provided the functions, and the information content and flows for each function. The interview transcripts enabled the formulation of UML (Unified Modeling Language) diagrams for feedback from participants. The proposed information architecture was validated with a use-case scenario. The integrated information architecture incorporates progressive design, ergonomic integration, and home care needs from the medical specialist, nursing, physiotherapy, occupational therapy, and social worker care perspectives. The iterative process of building the integrated architecture promoted insight among participants, and the use-case evaluation showed the design's robustness. A complex innovation such as a telerobot must mesh coherently with health-care service delivery needs. Deploying an integrated information architecture that bridges development with specialist and home care applications is necessary for home care technology innovation. It enables the continuing evolution of the robot and of novel health information design within the same integrated architecture, while accounting for patients' ecological needs.

  8. Integrating resource selection information with spatial capture--recapture

    Science.gov (United States)

    Royle, J. Andrew; Chandler, Richard B.; Sun, Catherine C.; Fuller, Angela K.

    2013-01-01

    1. Understanding space usage and resource selection is a primary focus of many studies of animal populations. Usually, such studies are based on location data obtained from telemetry, and resource selection functions (RSFs) are used for inference. Another important focus of wildlife research is estimating and modeling population size and density. Recently developed spatial capture–recapture (SCR) models accomplish this objective using individual encounter history data with auxiliary spatial information on the location of capture. SCR models include encounter probability functions that are intuitively related to RSFs, but to date, no one has extended SCR models to allow for explicit inference about space usage and resource selection.
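
    One common way an RSF-style term can enter an SCR encounter model, sketched below under assumed parameter values (this is a standard formulation, not necessarily the authors' exact model): the detection hazard at a trap combines a half-normal kernel of distance from the activity center with a multiplicative term exp(beta * z) for a resource covariate z.

        # Sketch of an RSF-informed SCR encounter model (a common formulation,
        # not necessarily the authors' exact one).
        import math

        def encounter_prob(lam0, sigma, beta, trap_xy, center_xy, z):
            """P(detect) = 1 - exp(-hazard); hazard falls with distance, rises with z."""
            d2 = (trap_xy[0] - center_xy[0]) ** 2 + (trap_xy[1] - center_xy[1]) ** 2
            hazard = lam0 * math.exp(-d2 / (2 * sigma ** 2)) * math.exp(beta * z)
            return 1.0 - math.exp(-hazard)

        # One trap at (1, 1) on habitat with quality z = 0.8; activity center at (0, 0).
        print(round(encounter_prob(0.5, 1.0, 1.2, (1.0, 1.0), (0.0, 0.0), 0.8), 3))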

  9. Workstations for the wellsite; An integrated information management system

    Energy Technology Data Exchange (ETDEWEB)

    Morley, A.R. (Exlog Inc. (US))

    1991-03-01

    This paper describes an innovative management system designed to assist well drilling, engineering, and geological decision makers. The problem of providing powerful and flexible applications without creating complexity has been solved by an intuitive graphical user interface. Through the adoption of petroleum industry standards, an open system was designed to facilitate information input and output. Careful attention to emerging computer standards has resulted in a system that is portable across a wide range of current hardware from multiple vendors and that will be easily movable to new hardware platforms as they become available.

  10. Database and applications security integrating information security and data management

    CERN Document Server

    Thuraisingham, Bhavani

    2005-01-01

    This is the first book to provide an in-depth coverage of all the developments, issues and challenges in secure databases and applications. It provides directions for data and application security, including securing emerging applications such as bioinformatics, stream information processing and peer-to-peer computing. Divided into eight sections, each of which focuses on a key concept of secure databases and applications, this book deals with all aspects of technology, including secure relational databases, inference problems, secure object databases, secure distributed databases and emerging

  11. Information Technology Integration in Higher Education: A Novel Approach for Impact Assessment

    Directory of Open Access Journals (Sweden)

    Abdulkareem Al-Alwani

    2014-12-01

    Full Text Available In the current technological world of information services, academic systems are also adopting information technology solutions. Information systems vary across applications, and in the academic domain specifically, a range of information systems is available to institutions worldwide. Integration of e-learning can optimize the implementation of computer-based and computer-assisted educational processes at all levels. It is therefore imperative to assess and evaluate the integration of these information systems, because they have a serious impact on e-learning processes. In this study, a survey instrument is presented for evaluating the integration of information technology systems and practices in an educational environment. The survey is constructed from descriptive questions related to information technology tools, to assess the qualitative impact and usage of such tools. Critical feedback, analysis, and suggestions from 25 educationists played a pivotal role in finalizing the proposed survey questionnaire. A subsequent test evaluation by teachers and students was also carried out to assess the utilization of information systems at Yanbu University College. The results showed that feedback gathered with this survey can help identify technological gaps and facilitate effective integration of information technology in an educational environment. The survey instrument proposed in this research can greatly enhance the integration of IT tools, as it identifies shortcomings by collecting statistical data from the feedback of both faculty and students. Solutions to these problems are deterministic and can be easily implemented to optimize the overall performance of e-learning systems.

  12. Move to learn: Integrating spatial information from multiple viewpoints.

    Science.gov (United States)

    Holmes, Corinne A; Newcombe, Nora S; Shipley, Thomas F

    2018-05-11

    Recalling a spatial layout from multiple orientations - spatial flexibility - is challenging, even when the global configuration can be viewed from a single vantage point, but more so when it must be viewed piecemeal. In the current study, we examined whether experiencing the transition between multiple viewpoints enhances spatial memory and flexible recall for a spatial configuration viewed simultaneously (Exp. 1) and sequentially (Exp. 2), whether the type of transition matters, and whether action provides an additional advantage over passive experience. In Experiment 1, participants viewed an array of dollhouse furniture from four viewpoints, but with all furniture simultaneously visible. In Experiment 2, participants viewed the same array piecemeal, from four partitioned viewpoints that allowed for viewing only a segment at a time. The transition between viewpoints involved rotation of the array or participant movement around it. Rotation and participant movement were passively experienced or actively generated. The control condition presented the dollhouse as a series of static views. Across both experiments, participant movement significantly enhanced spatial memory relative to array rotation or static views. However, in Exp. 2, there was a further advantage for actively walking around the array compared to being passively pushed. These findings suggest that movement around a stable environment is key to spatial memory and flexible recall, with action providing an additional boost to the integration of temporally segmented spatial events. Thus, spatial memory may be more flexible than prior data indicate, when studied under more natural acquisition conditions.

  13. Integrated Tokamak modeling: When physics informs engineering and research planning

    Science.gov (United States)

    Poli, Francesca Maria

    2018-05-01

    Modeling tokamaks enables a deeper understanding of how to run and control our experiments and how to design stable and reliable reactors. We model tokamaks to understand the nonlinear dynamics of plasmas embedded in magnetic fields and contained by finite-size, conducting structures, and the interplay between turbulence, magneto-hydrodynamic instabilities, and wave propagation. This tutorial guides the reader through the components of a tokamak simulator, highlighting how high-fidelity simulations can guide the development of reduced models that can be used to understand how dynamics at small spatial and short time scales affect macroscopic transport and the global stability of plasmas. It discusses the important role that reduced models play in the modeling of an entire plasma discharge from startup to termination, the limits of these models, and how they can be improved. It also discusses the important role that efficient workflows play in the coupling between codes, in the validation of models against experiments, and in the verification of theoretical models. Finally, it reviews the status of integrated modeling and addresses the gaps and needs on the way towards predictions of future devices and fusion reactors.

  14. Extracting chemical information from high-resolution Kβ X-ray emission spectroscopy

    Science.gov (United States)

    Limandri, S.; Robledo, J.; Tirao, G.

    2018-06-01

    High-resolution X-ray emission spectroscopy allows studying the chemical environment of a wide variety of materials. Chemical information can be obtained by fitting the X-ray spectra and observing the behavior of some spectral features. Spectral changes can also be quantified by means of statistical parameters calculated by considering the spectrum as a probability distribution. Another possibility is to perform statistical multivariate analysis, such as principal component analysis. In this work, the performance of these procedures for extracting chemical information from X-ray emission spectra of mixtures of Mn2+ and Mn4+ oxides is studied. A detailed analysis of the parameters obtained, as well as the associated uncertainties, is shown. The methodologies are also applied to Mn oxidation state characterization of the double perovskite oxides Ba1+xLa1-xMnSbO6 (with 0 ≤ x ≤ 0.7). The results show that statistical parameters and multivariate analysis are the most suitable for the analysis of this kind of spectra.
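
    Both procedures can be illustrated on synthetic data. The sketch below uses toy Gaussian line shapes, not real Mn K-beta profiles: each normalized spectrum is treated as a probability distribution to extract moment-based parameters, and PCA is then applied to a series of simulated Mn2+/Mn4+ mixtures, where the first principal component tracks the mixing fraction.

        # Toy illustration: Gaussian stand-ins for Mn2+/Mn4+ K-beta lines.
        import numpy as np

        energy = np.linspace(6470, 6510, 400)            # eV grid near Mn K-beta

        def line(center, width):
            return np.exp(-0.5 * ((energy - center) / width) ** 2)

        mn2, mn4 = line(6490.5, 2.0), line(6488.5, 1.6)  # toy pure-state spectra
        fractions = np.linspace(0.0, 1.0, 11)            # Mn4+ fraction in mixture
        spectra = np.array([f * mn4 + (1 - f) * mn2 for f in fractions])

        for f, s in zip(fractions[::5], spectra[::5]):
            p = s / s.sum()                              # spectrum as a pdf
            mean = (energy * p).sum()                    # first moment (centroid)
            width = np.sqrt(((energy - mean) ** 2 * p).sum())
            print(f"Mn4+ fraction {f:.1f}: centroid {mean:.2f} eV, width {width:.2f} eV")

        # PCA: for linear mixtures the first component tracks the mixing fraction.
        centered = spectra - spectra.mean(axis=0)
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        scores = centered @ vt[0]
        print(abs(np.corrcoef(scores, fractions)[0, 1]))  # ~1.0 for this toy data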

  15. Information Extraction of Tourist Geological Resources Based on 3d Visualization Remote Sensing Image

    Science.gov (United States)

    Wang, X.

    2018-04-01

    Tourism geological resources are of high value for admiration, scientific research and universal education, and need to be protected and rationally utilized. In the past, most remote sensing investigations of tourism geological resources used two-dimensional remote sensing interpretation methods, which made some geological heritages difficult to interpret and led to the omission of some information. The aim of this paper is to assess the value of a method that uses three-dimensional visual remote sensing images to extract information on geological heritages. The Skyline software system is applied to fuse 0.36 m aerial images with a 5 m interval DEM to establish a digital earth model. Based on three-dimensional shape, color tone, shadow, texture and other image features, the distribution of tourism geological resources in Shandong Province and the locations of geological heritage sites were obtained, including geological structures, Daigu landforms, granite landforms, volcanic landforms, sandy landforms, waterscapes, etc. The results show that remote sensing interpretation with this method is highly recognizable, making the interpretation more accurate and comprehensive.

  16. Measuring nuclear reaction cross sections to extract information on neutrinoless double beta decay

    Science.gov (United States)

    Cavallaro, M.; Cappuzzello, F.; Agodi, C.; Acosta, L.; Auerbach, N.; Bellone, J.; Bijker, R.; Bonanno, D.; Bongiovanni, D.; Borello-Lewin, T.; Boztosun, I.; Branchina, V.; Bussa, M. P.; Calabrese, S.; Calabretta, L.; Calanna, A.; Calvo, D.; Carbone, D.; Chávez Lomelí, E. R.; Coban, A.; Colonna, M.; D'Agostino, G.; De Geronimo, G.; Delaunay, F.; Deshmukh, N.; de Faria, P. N.; Ferraresi, C.; Ferreira, J. L.; Finocchiaro, P.; Fisichella, M.; Foti, A.; Gallo, G.; Garcia, U.; Giraudo, G.; Greco, V.; Hacisalihoglu, A.; Kotila, J.; Iazzi, F.; Introzzi, R.; Lanzalone, G.; Lavagno, A.; La Via, F.; Lay, J. A.; Lenske, H.; Linares, R.; Litrico, G.; Longhitano, F.; Lo Presti, D.; Lubian, J.; Medina, N.; Mendes, D. R.; Muoio, A.; Oliveira, J. R. B.; Pakou, A.; Pandola, L.; Petrascu, H.; Pinna, F.; Reito, S.; Rifuggiato, D.; Rodrigues, M. R. D.; Russo, A. D.; Russo, G.; Santagati, G.; Santopinto, E.; Sgouros, O.; Solakci, S. O.; Souliotis, G.; Soukeras, V.; Spatafora, A.; Torresi, D.; Tudisco, S.; Vsevolodovna, R. I. M.; Wheadon, R. J.; Yildirin, A.; Zagatto, V. A. B.

    2018-02-01

    Neutrinoless double beta decay (0νββ) is considered the best potential resource for accessing the absolute neutrino mass scale. Moreover, if observed, it will signal that neutrinos are their own anti-particles (Majorana particles). Presently, this physics case is one of the most important lines of research “beyond the Standard Model” and might guide the way towards a Grand Unified Theory of fundamental interactions. Since the 0νββ decay process involves nuclei, its analysis necessarily implies nuclear structure issues. In the NURE project, supported by a Starting Grant of the European Research Council (ERC), nuclear reactions of double charge-exchange (DCE) are used as a tool to extract information on the 0νββ Nuclear Matrix Elements. In DCE reactions and ββ decay, indeed, the initial and final nuclear states are the same and the transition operators have a similar structure. Thus the measurement of DCE absolute cross-sections can give crucial information on ββ matrix elements. In a wider view, the NUMEN international collaboration plans a major upgrade of the INFN-LNS facilities in the coming years in order to increase the experimental production of nuclei by at least two orders of magnitude, thus making feasible a systematic study of all the cases of interest as candidates for 0νββ.

  17. Unsupervised Symbolization of Signal Time Series for Extraction of the Embedded Information

    Directory of Open Access Journals (Sweden)

    Yue Li

    2017-03-01

    Full Text Available This paper formulates an unsupervised algorithm for symbolization of signal time series to capture the embedded dynamic behavior. The key idea is to convert the time series of a digital signal into a string of (spatially discrete) symbols from which the embedded dynamic information can be extracted in an unsupervised manner (i.e., with no requirement for labeling of the time series). The main challenges here are: (1) definition of the symbol assignment for the time series; (2) identification of the partitioning segment locations in the signal space of the time series; and (3) construction of probabilistic finite-state automata (PFSA) from the symbol strings that contain temporal patterns. The reported work addresses these challenges by maximizing the mutual information measures between symbol strings and PFSA states. The proposed symbolization method has been validated by numerical simulation as well as by experimentation in a laboratory environment. Performance of the proposed algorithm has been compared to that of two commonly used algorithms of time series partitioning.
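
    A stripped-down sketch of the pipeline follows, assuming simple quantile partitioning in place of the paper's mutual-information-maximizing partition search: the signal is symbolized, and a depth-1 PFSA is then estimated from symbol-to-symbol transition counts.

        # Stripped-down sketch: quantile partitioning stands in for the paper's
        # mutual-information-maximizing partition search; the PFSA here is a
        # depth-1 model (state = last symbol) estimated from transition counts.
        import numpy as np

        rng = np.random.default_rng(0)
        t = np.linspace(0, 20, 2000)
        signal = np.sin(t) + 0.2 * rng.standard_normal(t.size)

        alphabet_size = 4
        edges = np.quantile(signal, np.linspace(0, 1, alphabet_size + 1)[1:-1])
        symbols = np.digitize(signal, edges)             # symbol string over 0..3

        counts = np.zeros((alphabet_size, alphabet_size))
        for a, b in zip(symbols[:-1], symbols[1:]):
            counts[a, b] += 1                            # state-to-symbol transitions
        transition = counts / counts.sum(axis=1, keepdims=True)
        print(np.round(transition, 2))                   # estimated PFSA morph matrix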

  18. A methodology for the extraction of quantitative information from electron microscopy images at the atomic level

    International Nuclear Information System (INIS)

    Galindo, P L; Pizarro, J; Guerrero, E; Guerrero-Lebrero, M P; Scavello, G; Yáñez, A; Sales, D L; Herrera, M; Molina, S I; Núñez-Moraleda, B M; Maestre, J M

    2014-01-01

    In this paper we describe a methodology developed at the University of Cadiz (Spain) over the past few years for the extraction of quantitative information from electron microscopy images at the atomic level. This work is based on the coordinated and synergic activity of several research groups that have been working together over the last decade in two different and complementary fields: Materials Science and Computer Science. The aim of our joint research has been to develop innovative high-performance computing techniques and simulation methods in order to address computationally challenging problems in the analysis, modelling and simulation of materials at the atomic scale, providing significant advances with respect to existing techniques. The methodology involves several fundamental areas of research, including the analysis of high resolution electron microscopy images, materials modelling, image simulation and 3D reconstruction using quantitative information from experimental images. These analysis, modelling and simulation techniques allow the control and functionality of devices developed using the materials under study to be optimized, and they have been tested using data obtained from experimental samples.

  19. Dual-wavelength phase-shifting digital holography selectively extracting wavelength information from wavelength-multiplexed holograms.

    Science.gov (United States)

    Tahara, Tatsuki; Mori, Ryota; Kikunaga, Shuhei; Arai, Yasuhiko; Takaki, Yasuhiro

    2015-06-15

    Dual-wavelength phase-shifting digital holography that selectively extracts wavelength information from five wavelength-multiplexed holograms is presented. Specific phase shifts for the respective wavelengths are introduced to remove the crosstalk components and extract only the object wave at the desired wavelength from the holograms. Object waves at multiple wavelengths are selectively extracted by utilizing 2π ambiguity and subtraction procedures based on phase-shifting interferometry. Numerical results show the validity of the proposed technique, which is also demonstrated experimentally.
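
    As background, the sketch below shows the standard single-wavelength four-step phase-shifting recovery that the proposed dual-wavelength five-frame scheme generalizes; the wavelength-specific shifts and crosstalk-cancelling subtractions of the actual technique follow the paper, not this toy example.

        # Background sketch: standard single-wavelength four-step phase shifting.
        import numpy as np

        rng = np.random.default_rng(1)
        phase = rng.uniform(-np.pi, np.pi, (64, 64))   # unknown object phase
        a, b = 1.0, 0.5                                # background and modulation

        def hologram(delta):
            return a + b * np.cos(phase + delta)       # intensity at phase shift delta

        I0, I1, I2, I3 = (hologram(d) for d in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2))
        recovered = np.arctan2(I3 - I1, I0 - I2)       # four-step phase retrieval
        print(np.allclose(recovered, phase))           # True: phase recovered exactly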

  20. Pomegranate extract protects against cerebral ischemia/reperfusion injury and preserves brain DNA integrity in rats.

    Science.gov (United States)

    Ahmed, Maha A E; El Morsy, Engy M; Ahmed, Amany A E

    2014-08-21

    Interruption of blood flow causes ischemia and infarction of brain tissues, with consequent neuronal damage and brain dysfunction. Pomegranate extract is well tolerated and safely consumed all over the world. Interestingly, pomegranate extract has shown remarkable antioxidant and anti-inflammatory effects in experimental models, and many investigators consider natural extracts as novel therapies for neurodegenerative disorders. Therefore, this study was carried out to investigate the protective effects of a standardized pomegranate extract against cerebral ischemia/reperfusion-induced brain injury in rats. Adult male albino rats were randomly divided into a sham-operated control group, an ischemia/reperfusion (I/R) group, and two other groups that received standardized pomegranate extract at two dose levels (250 and 500 mg/kg) for 15 days prior to ischemia/reperfusion (the PMG250+I/R and PMG500+I/R groups). After I/R or sham operation, all rats were sacrificed and brains were harvested for subsequent biochemical analysis. Results showed reductions in brain content of MDA (malondialdehyde) and NO (nitric oxide), in addition to enhancement of SOD (superoxide dismutase), GPX (glutathione peroxidase), and GRD (glutathione reductase) activities in rats treated with pomegranate extract prior to cerebral I/R. Moreover, pomegranate extract decreased brain levels of NF-κB p65 (nuclear factor kappa B p65), TNF-α (tumor necrosis factor-alpha), and caspase-3, and increased brain levels of IL-10 (interleukin-10) and cerebral ATP (adenosine triphosphate) production. A comet assay showed less brain DNA (deoxyribonucleic acid) damage in rats protected with pomegranate extract. The present study showed, for the first time, that pre-administration of pomegranate extract to rats can offer a significant dose-dependent neuroprotective activity against cerebral I/R brain injury and DNA damage via antioxidant, anti-inflammatory, anti-apoptotic and ATP-replenishing effects.